Shaders In VTK

=Introduction=


This document describes the shader support in VTK. In 2015 we created a new OpenGL backend fully based on shaders, which is the default backend for VTK 7.0 and later. You can extend some mappers' shaders using methods the mappers provide, or subclass a mapper to create your own with custom shaders. VTK requires OpenGL 3.2 or later.
 
=Basics=
 
The shaders in VTK consist of three strings containing the vertex, fragment, and geometry shader code. The geometry shader is optional; the other two are required. vtkOpenGLShaderCache caches compiled shaders to improve performance. Shaders are stored in vtkShader objects and combined into a vtkShaderProgram. You will likely never need to work with those two classes directly, as vtkOpenGLShaderCache is the main entry point for creating and binding shaders.
 
==Structure of a Shader==
 
VTK uses shaders to perform its OpenGL rendering. VTK supports many different
options when it comes to rendering, resulting in potentially thousands of
possible combinations. While we could make one giant shader that uses defines or
uniforms to switch between all these possibilities, it would be limiting. Instead
we build up the shader using string replacements on the fly, and then cache the
results for performance.


When writing your own shaders you can use any approach you want; in the end they
are just strings of code. For vtkOpenGLPolyDataMapper we make heavy use of
string replacements. In other classes we do very little processing, as the
shader has far fewer options. Regardless, there are a few conventions you should
be aware of.


For shader replacements we tend to use tokens of the form


//VTK::SomeConcept::SomeAction


For example


//VTK::Normal::Dec  - declares any uniforms/varyings needed for normals
//VTK::Normal::Impl - implementation of the shader code for handling normals
 
All shaders should start with the following line
 
//VTK::System::Dec
 
which vtkOpenGLShaderCache will replace with a #version directive and some other
values to match the system and the OpenGL context it has. The other line you
need (only in your fragment shader) is
 
//VTK::Output::Dec
 
which VTK uses to map shader outputs to the framebuffer.
 
All vertex shaders should name their outputs with a postfix of VSOutput. All
geometry shaders should name their outputs with a postfix of GSOutput. All
fragment shaders should name their inputs with a postfix of VSOutput; put
another way, fragment shaders should assume their input is coming from the
vertex shader. If a geometry shader is present, VTK will rename the fragment
shader inputs from VSOutput to GSOutput automatically.
 
Variables that represent positions or directions usually have a suffix
indicating the coordinate system they are in. The possible values are
 
*MC  - Model Coordinates
*WC  - World Coordinates
*VC  - View Coordinates
*DC  - Display Coordinates
*NVC - Normalized View Coordinates
 
=PolyDataMapper=
 
The vtkOpenGLPolyDataMapper generates probably the most complex shaders in VTK and is designed to handle many different situations. As such, we provide a few methods to support customizing the shaders to various degrees. The first methods are
 
  void AddShaderReplacement(
    vtkShader::Type shaderType, // vertex, fragment, etc
    std::string originalValue,
    bool replaceFirst,  // do this replacement before the default
    std::string replacementValue,
    bool replaceAll);
  void ClearShaderReplacement(
    vtkShader::Type shaderType, // vertex, fragment, etc
    std::string originalValue,
    bool replaceFirst);


which allow you to specify your own string replacements to use in the shader. For an example, see [https://gitlab.kitware.com/vtk/vtk/blob/v7.0.0.rc2/Rendering/OpenGL2/Testing/Cxx/TestUserShader.cxx TestUserShader.cxx].


The other methods allow you to completely override the shader code that is used:


  vtkSetStringMacro(VertexShaderCode);
  vtkGetStringMacro(VertexShaderCode);
  vtkSetStringMacro(FragmentShaderCode);
  vtkGetStringMacro(FragmentShaderCode);
  vtkSetStringMacro(GeometryShaderCode);
  vtkGetStringMacro(GeometryShaderCode);


You can find an example of using these, along with specifying your own uniforms, in [https://gitlab.kitware.com/vtk/vtk/blob/v7.0.0.rc2/Rendering/OpenGL2/Testing/Cxx/TestUserShader2.cxx TestUserShader2.cxx].


=Textures=


Any vtkTexture assigned to the actor or property will be declared and bound for use in the shader. As of VTK 9.0 the names are colortexture for the texture used in texture-based scalar coloring, actortexture for the actor's texture, and, for property textures, whatever name you passed to SetTexture. The actor's texture is processed first, followed by any textures specified on the property. The actual texture unit used for a texture is not fixed, but you can count on the texture having a unit and being active when the shader executes.


=Acknowledgements=
* Shader support in VTK is a collaborative effort between Sandia National Labs and Kitware Inc.
{{VTK/Template/Footer}}

Latest revision as of 12:33, 2 April 2018