Shaders In VTK
Introduction
This document describes the shader support in VTK. In 2015 we created a new OpenGL backend fully based on shaders; it is the default backend for VTK versions 7.0 and later. You can extend some mappers' shaders using methods the mappers provide, or subclass to create your own mappers with custom shaders. VTK guarantees you will have at least OpenGL version 2.1 with the gpu_shader4 extension, or version 3.2.
Basics
A shader in VTK consists of three strings containing the vertex, fragment, and geometry shader code. The geometry shader code is optional; the other two are required. The vtkOpenGLShaderCache handles caching shaders to improve performance. Each string is stored in a vtkShader, and the shaders are combined into a vtkShaderProgram. You will likely never need to work with those two classes directly, as the vtkOpenGLShaderCache is the main entry point for creating and binding shaders.
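To illustrate how these classes fit together, here is a minimal sketch of readying a program through the cache. It assumes an already initialized vtkOpenGLRenderWindow and placeholder source strings; a real hand-written shader would still need the special lines described in the next section.

#include <vtkOpenGLRenderWindow.h>
#include <vtkOpenGLShaderCache.h>
#include <vtkShaderProgram.h>

// Sketch: compile/link (or reuse from the cache) and bind a shader program.
// renWin is assumed to be an initialized vtkOpenGLRenderWindow; the source
// strings are placeholders for real vertex and fragment shader code.
vtkShaderProgram* ReadyExampleProgram(vtkOpenGLRenderWindow* renWin,
                                      const char* vertexSource,
                                      const char* fragmentSource)
{
  vtkOpenGLShaderCache* cache = renWin->GetShaderCache();
  // The geometry shader is optional; pass an empty string to omit it.
  return cache->ReadyShaderProgram(vertexSource, fragmentSource, "");
}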
Structure of a Shader
VTK uses shaders to perform its OpenGL rendering. VTK supports many different rendering options, resulting in potentially thousands of possible combinations. While we could make one giant shader that uses defines or uniforms to switch between all these possibilities, it would be limiting. Instead we build up the shader on the fly using string replacements, and then cache the results for performance.
When writing your own shaders you can use any approach you want; in the end they are just strings of code. For vtkOpenGLPolyDataMapper we make heavy use of string replacements. In other classes we do very little processing, as the shader has far fewer options. Regardless, there are a few conventions you should be aware of.
For shader replacements we tend to use a form of
//VTK::SomeConcept::SomeAction
For example
//VTK::Normal::Dec - declares any uniforms/varyings needed for normals
//VTK::Normal::Impl - implementation of the shader code for handling normals
All shaders should start with the following line
//VTK::System::Dec
which vtkOpenGLShaderCache will replace with a #version line and some other values to match the system and the OpenGL context it has. The other line you need (only in your fragment shader) is
//VTK::Output::Dec
which VTK uses to map shader outputs to the framebuffer.
All vertex shaders should name their outputs with a postfix of VSOutput. All geometry shaders should name their outputs with a postfix of GSOutput. All fragment shaders should name their inputs with a postfix of VSOutput; put another way, fragment shaders should assume their input is coming from the vertex shader. If a geometry shader is present, VTK will rename the fragment shader inputs from VSOutput to GSOutput automatically.
Variables that represent positions or directions usually have a suffix indicating the coordinate system they are in. The possible values are
- MC - Model Coordinates
- WC - World Coordinates
- VC - View Coordinates
- DC - Display Coordinates
- NVC - Normalized View Coordinates
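Putting these conventions together, here is a minimal sketch of what hand-written vertex and fragment shader strings might look like. Names such as vertexMC, MCDCMatrix, and myColorVSOutput follow the suffix conventions above, and the old-style attribute/varying keywords and the write to gl_FragData[0] mirror VTK's own shader templates of this era; treat the snippet as an illustration of the structure rather than a drop-in shader, since the attributes and uniforms actually provided depend on the mapper being customized.

// Sketch: shader strings that follow the conventions described above.
const char* vertexShaderSource =
  "//VTK::System::Dec\n"              // required first line
  "attribute vec4 vertexMC;\n"        // vertex position in model coordinates (MC)
  "uniform mat4 MCDCMatrix;\n"        // model-to-display-coordinate transform
  "varying vec4 myColorVSOutput;\n"   // vertex shader outputs use the VSOutput postfix
  "void main()\n"
  "{\n"
  "  myColorVSOutput = vec4(1.0, 0.5, 0.0, 1.0);\n"
  "  gl_Position = MCDCMatrix * vertexMC;\n"
  "}\n";

const char* fragmentShaderSource =
  "//VTK::System::Dec\n"              // required first line
  "//VTK::Output::Dec\n"              // required in fragment shaders
  "varying vec4 myColorVSOutput;\n"   // fragment inputs keep the vertex shader's names
  "void main()\n"
  "{\n"
  "  gl_FragData[0] = myColorVSOutput;\n"
  "}\n";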
PolyDataMapper
The vtkOpenGLPolyDataMapper has probably the most complex shaders in VTK and is designed to handle many different situations. As such, we have provided a few methods to support customizing the shaders to various degrees. The first methods are
void AddShaderReplacement(
  vtkShader::Type shaderType,     // vertex, fragment, etc.
  std::string originalValue,
  bool replaceFirst,              // do this replacement before the default
  std::string replacementValue,
  bool replaceAll);

void ClearShaderReplacement(
  vtkShader::Type shaderType,     // vertex, fragment, etc.
  std::string originalValue,
  bool replaceFirst);
which allow you to specify your own string replacements to use in the shader. For an example, see [1]
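If the linked example is unavailable, here is a minimal sketch of the same idea: keeping a default token while appending a line after it. The //VTK::Color::Impl token and the diffuseColor variable are assumed to come from vtkOpenGLPolyDataMapper's built-in fragment shader; adjust the token to whatever section of the generated shader you want to modify.

#include <vtkOpenGLPolyDataMapper.h>
#include <vtkShader.h>

// Sketch: tint the geometry by overriding the diffuse color right after the
// default color handling in the fragment shader.
void AddTintReplacement(vtkOpenGLPolyDataMapper* mapper)
{
  mapper->AddShaderReplacement(
    vtkShader::Fragment,       // modify the fragment shader
    "//VTK::Color::Impl",      // token to replace
    true,                      // apply before the default substitutions
    "//VTK::Color::Impl\n"     // keep the default color code...
    "  diffuseColor = vec3(0.4, 0.7, 0.6);\n", // ...then override the diffuse color
    false);                    // do not replace further occurrences
}

The mapper is then used with an actor as usual; ClearShaderReplacement removes such a customization again.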
The other methods allow you to completely override the shader code that is used:
vtkSetStringMacro(VertexShaderCode);
vtkGetStringMacro(VertexShaderCode);
vtkSetStringMacro(FragmentShaderCode);
vtkGetStringMacro(FragmentShaderCode);
vtkSetStringMacro(GeometryShaderCode);
vtkGetStringMacro(GeometryShaderCode);
You can find an example of using these, along with specifying your own uniforms, here: [2]
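For reference, a minimal sketch of the full-override path is shown below. It reuses the vertexShaderSource and fragmentShaderSource strings from the sketch in the Structure of a Shader section (an assumption, not code from the linked example), and it omits the custom-uniform handling that the linked example also demonstrates.

#include <vtkOpenGLPolyDataMapper.h>

// Sketch: replace the mapper's generated shader code entirely. The supplied
// strings must still contain //VTK::System::Dec (and //VTK::Output::Dec in
// the fragment shader); other //VTK:: tokens you include can still be
// expanded by the mapper's replacements.
void UseCustomShaders(vtkOpenGLPolyDataMapper* mapper,
                      const char* vertexShaderSource,
                      const char* fragmentShaderSource)
{
  mapper->SetVertexShaderCode(vertexShaderSource);
  mapper->SetFragmentShaderCode(fragmentShaderSource);
  // Leave GeometryShaderCode unset to skip the optional geometry shader.
}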
Textures
Any vtkTextures assigned to the actor/property will get declared and bound for use in the shader. As of VTK 9.0 the names will be colortexture for the texture used in texture-based scalar coloring, actortexture for the actor's texture, and, for property textures, the name you passed to SetTexture. The actor's texture gets processed first, followed by any textures specified on the property. The actual texture unit used for a texture is not specified, but you can count on the texture having a texture unit and being active when the shader is executed.
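As a small illustration of the above, the sketch below attaches both an actor texture and a named property texture. The name "albedo" is an arbitrary example; per the paragraph above it becomes the sampler name the shader code can reference.

#include <vtkActor.h>
#include <vtkProperty.h>
#include <vtkTexture.h>

// Sketch: assign textures so that they are declared and bound for the shader.
void AttachTextures(vtkActor* actor, vtkTexture* actorTex, vtkTexture* propTex)
{
  // The actor's texture is processed first and exposed under the
  // actor-texture name described above.
  actor->SetTexture(actorTex);
  // A property texture is exposed under the name given here ("albedo" is an
  // arbitrary example name).
  actor->GetProperty()->SetTexture("albedo", propTex);
}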
Acknowledgements
- Shader support in VTK is a collaborative effort between Sandia National Labs and Kitware Inc.