ParaView/Users Guide/List of filters

From KitwarePublic
==AMR Connectivity==

Fragment Identification

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the volume input of the filter.
|
|
Accepts input of following types:
* vtkNonOverlappingAMR
The dataset must contain a field array (cell) with 1 component(s).
|-
|'''SelectMaterialArrays''' (SelectMaterialArrays)
|
This property specifies the cell arrays from which the analysis will determine fragments.
|
|
An array of scalars is required.
|-
|'''Volume Fraction Value''' (VolumeFractionSurfaceValue)
|
This property specifies the values at which to compute the isosurface.
|
0.1
|
|-
|'''Resolve Blocks''' (Resolve Blocks)
|
Resolve the fragments between blocks.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''Propagate Ghosts''' (Propagate Ghosts)
|
Propagate regionIds into the ghosts.
|
0
|
Accepts boolean values (0 or 1).
|}
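The fragment-identification idea behind AMR Connectivity can be sketched outside ParaView: cells whose material volume fraction reaches the threshold are grouped into connected fragments with a flood fill. The grid values, the 4-cell neighborhood, and the function name below are illustrative assumptions, not ParaView internals.

```python
def label_fragments(fractions, threshold=0.1):
    """Assign a fragment id to every cell at or above the threshold."""
    rows, cols = len(fractions), len(fractions[0])
    labels = [[0] * cols for _ in range(rows)]
    next_id = 0
    for r in range(rows):
        for c in range(cols):
            if fractions[r][c] >= threshold and labels[r][c] == 0:
                next_id += 1                 # start a new fragment
                stack = [(r, c)]
                while stack:                 # flood fill its connected cells
                    i, j = stack.pop()
                    if 0 <= i < rows and 0 <= j < cols \
                            and fractions[i][j] >= threshold and labels[i][j] == 0:
                        labels[i][j] = next_id
                        stack.extend([(i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)])
    return labels, next_id

frac = [
    [0.9, 0.8, 0.0, 0.0],
    [0.7, 0.0, 0.0, 0.5],
    [0.0, 0.0, 0.6, 0.5],
]
labels, count = label_fragments(frac)
print(count)  # two separate fragments
```

The real filter does this per AMR block and then resolves fragments across block boundaries (the Resolve Blocks property).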


==AMR Contour==

Iso surface cell array.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the input of the filter.
|
|
Accepts input of following types:
* vtkCompositeDataSet
The dataset must contain a field array (cell) with 1 component(s).
|-
|'''SelectMaterialArrays''' (SelectMaterialArrays)
|
This property specifies the cell arrays from which the contour filter will compute contour cells.
|
|
An array of scalars is required.
|-
|'''Volume Fraction Value''' (VolumeFractionSurfaceValue)
|
This property specifies the values at which to compute the isosurface.
|
0.1
|
The value must be greater than or equal to 0 and less than or equal to 1.
|-
|'''Capping''' (Capping)
|
If this property is on, the boundary of the data set is capped.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''DegenerateCells''' (DegenerateCells)
|
If this property is on, a transition mesh between levels is created.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''MultiprocessCommunication''' (MultiprocessCommunication)
|
If this property is off, each process executes independently.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''SkipGhostCopy''' (SkipGhostCopy)
|
A simple test to see if ghost values are already set properly.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''Triangulate''' (Triangulate)
|
Use triangles instead of quads on capping surfaces.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''MergePoints''' (MergePoints)
|
Use more memory to merge points on the boundaries of blocks.
|
1
|
Accepts boolean values (0 or 1).
|}
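Iso-surfacing at the Volume Fraction Value rests on one small computation: where the cell-data value crosses the iso-value along an edge, found by linear interpolation. A minimal sketch of that step (not the actual vtkAMRDualContour implementation; the endpoint values are made up):

```python
def edge_crossing(v0, v1, iso=0.1):
    """Parametric position (0..1) where the iso-value crosses the edge
    from value v0 to value v1, or None if both ends are on the same side."""
    if v0 == v1:
        return 0.0 if v0 == iso else None
    if (v0 - iso) * (v1 - iso) > 0:   # same side of the iso-value
        return None
    return (iso - v0) / (v1 - v0)     # linear interpolation

t = edge_crossing(0.0, 0.4)  # default VolumeFractionSurfaceValue = 0.1
print(t)  # 0.25 -> the contour crosses 25% of the way along the edge
```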


==AMR CutPlane==

Planar Cut of an AMR grid dataset. This filter creates a cut-plane of the AMR grid dataset.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the input for this filter.
|
|
Accepts input of following types:
* vtkOverlappingAMR
|-
|'''UseNativeCutter''' (UseNativeCutter)
|
This property specifies whether ParaView's generic dataset cutter is used instead of the specialized AMR cutter.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''LevelOfResolution''' (LevelOfResolution)
|
Set maximum slice resolution.
|
0
|
|-
|'''Center''' (Center)
|
|
0.5 0.5 0.5
|
|-
|'''Normal''' (Normal)
|
|
0 0 1
|
|}
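The Center and Normal properties define the cut plane. Which side of the plane a point lies on follows from its signed distance; this is a sketch of the plane geometry with the table's defaults, not the AMR cutter itself.

```python
def signed_distance(point, center=(0.5, 0.5, 0.5), normal=(0.0, 0.0, 1.0)):
    """Signed distance from a point to the plane through `center` with
    unit normal `normal` (defaults match the AMR CutPlane table)."""
    return sum(n * (p - c) for p, c, n in zip(point, center, normal))

print(signed_distance((0.5, 0.5, 0.9)))  # positive -> above the default plane
print(signed_distance((0.5, 0.5, 0.1)))  # negative -> below it
```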


==AMR Dual Clip==

Clip with scalars. Tetrahedra.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the input of the filter.
|
|
Accepts input of following types:
* vtkCompositeDataSet
The dataset must contain a field array (cell) with 1 component(s).
|-
|'''SelectMaterialArrays''' (SelectMaterialArrays)
|
This property specifies the cell arrays from which the clip filter will compute clipped cells.
|
|
An array of scalars is required.
|-
|'''Volume Fraction Value''' (VolumeFractionSurfaceValue)
|
This property specifies the values at which to compute the isosurface.
|
0.1
|
|-
|'''InternalDecimation''' (InternalDecimation)
|
If this property is on, internal tetrahedra are decimated.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''MultiprocessCommunication''' (MultiprocessCommunication)
|
If this property is off, each process executes independently.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''MergePoints''' (MergePoints)
|
Use more memory to merge points on the boundaries of blocks.
|
1
|
Accepts boolean values (0 or 1).
|}
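The Restrictions column translates directly into value checks: "Accepts boolean values (0 or 1)" and "greater than or equal to 0 and less than or equal to 1" are the two that recur in these tables. A sketch of validating the AMR Dual Clip defaults; the validator functions are illustrative, the property names and defaults come from the table above.

```python
def check_boolean(value):
    """'Accepts boolean values (0 or 1).'"""
    return value in (0, 1)

def check_unit_range(value):
    """'The value must be >= 0 and <= 1.'"""
    return 0.0 <= value <= 1.0

defaults = {"InternalDecimation": 1, "MultiprocessCommunication": 1,
            "MergePoints": 1, "VolumeFractionSurfaceValue": 0.1}

booleans = ("InternalDecimation", "MultiprocessCommunication", "MergePoints")
print(all(check_boolean(defaults[name]) for name in booleans))  # True
print(check_unit_range(defaults["VolumeFractionSurfaceValue"]))  # True
```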


==AMR Fragment Integration==

Fragment Integration

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the volume input of the filter.
|
|
Accepts input of following types:
* vtkNonOverlappingAMR
The dataset must contain a field array (cell) with 1 component(s).
|-
|'''SelectMaterialArrays''' (SelectMaterialArrays)
|
This property specifies the cell arrays from which the analysis will determine fragments.
|
|
An array of scalars is required.
|-
|'''SelectMassArrays''' (SelectMassArrays)
|
This property specifies the cell arrays from which the analysis will determine fragment mass.
|
|
An array of scalars is required.
|-
|'''SelectVolumeWeightedArrays''' (SelectVolumeWeightedArrays)
|
This property specifies the cell arrays from which the analysis will determine volume weighted average values.
|
|
An array of scalars is required.
|-
|'''SelectMassWeightedArrays''' (SelectMassWeightedArrays)
|
This property specifies the cell arrays from which the analysis will determine mass weighted average values.
|
|
An array of scalars is required.
|}
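Fragment integration reduces per-cell arrays to per-fragment quantities; the volume-weighted and mass-weighted averages named above are the same weighted mean with different weights. A sketch over made-up cell data for one fragment:

```python
def weighted_average(values, weights):
    """Weighted mean of per-cell values; `weights` is cell volume for a
    volume-weighted average or cell mass for a mass-weighted one."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

temps = [300.0, 400.0]  # hypothetical per-cell temperatures of a fragment
vols = [1.0, 3.0]       # hypothetical per-cell volumes
print(weighted_average(temps, vols))  # 375.0
```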
==Append Geometry==
Takes an input of multiple poly data parts and output has only one part.

The Append Geometry filter operates on multiple polygonal data sets. It merges their geometry into a single data set. Only the point and cell attributes that all of the input data sets have in common will appear in the output.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
Set the input to the Append Geometry filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.
|}

==AMR Fragments Filter==

Meta Fragment filter. Combines the running of AMRContour, AMRFragmentIntegration, AMRDualContour and ExtractCTHParts.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the volume input of the filter.
|
|
Accepts input of following types:
* vtkNonOverlappingAMR
The dataset must contain a field array (cell) with 1 component(s).
|-
|'''SelectMaterialArrays''' (SelectMaterialArrays)
|
This property specifies the cell arrays from which the analysis will determine fragments.
|
|
An array of scalars is required.
|-
|'''SelectMassArrays''' (SelectMassArrays)
|
This property specifies the cell arrays from which the analysis will determine fragment mass.
|
|
An array of scalars is required.
|-
|'''SelectVolumeWeightedArrays''' (SelectVolumeWeightedArrays)
|
This property specifies the cell arrays from which the analysis will determine volume weighted average values.
|
|
An array of scalars is required.
|-
|'''SelectMassWeightedArrays''' (SelectMassWeightedArrays)
|
This property specifies the cell arrays from which the analysis will determine mass weighted average values.
|
|
An array of scalars is required.
|-
|'''Volume Fraction Value''' (VolumeFractionSurfaceValue)
|
This property specifies the values at which to compute the isosurface.
|
0.1
|
|-
|'''Extract Surface''' (Extract Surface)
|
Whether or not to extract a surface from this data.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''Use Watertight Surface''' (Use Watertight Surface)
|
Whether the extracted surface should be watertight or not.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''Integrate Fragments''' (Integrate Fragments)
|
Whether or not to integrate fragments in this data.
|
1
|
Accepts boolean values (0 or 1).
|}
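Append Geometry's rule that only attributes common to every input survive is a plain set intersection over array names. A sketch with hypothetical inputs and array names:

```python
# Hypothetical per-input attribute-array names; only the intersection
# would appear on the appended output.
inputs = [
    {"Temperature", "Pressure", "Velocity"},
    {"Temperature", "Velocity"},
    {"Temperature", "Velocity", "Density"},
]
common = set.intersection(*inputs)
print(sorted(common))  # ['Temperature', 'Velocity']
```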
==Block Scalars==
The Level Scalars filter uses colors to show levels of a multiblock dataset.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Level Scalars filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected dataset must be one of the following types (or a subclass of one of them): vtkMultiBlockDataSet.
|}

==Add Field Arrays==

Reads arrays from a file and adds them to the input data object. Takes in an input data object and a filename, opens the file, and adds any arrays it sees there to the input data.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
The input.
|
|
|-
|'''FileName''' (FileName)
|
This property specifies the file to read to get arrays.
|
|
The value(s) must be a filename (or filenames).
|}
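The Add Field Arrays behaviour — open a file, attach every array found there to the input data object — can be sketched file-format-agnostically. Here CSV columns stand in for the arrays and a dict stands in for the data object; the real filter reads VTK-supported files, so everything below is illustrative.

```python
import csv
import io

def add_field_arrays(data_object, file_text):
    """Attach every column of `file_text` (CSV) to `data_object` as an array."""
    reader = csv.DictReader(io.StringIO(file_text))
    rows = list(reader)
    for name in reader.fieldnames:
        data_object[name] = [float(row[name]) for row in rows]
    return data_object

data = {"ExistingArray": [1.0, 2.0]}           # the input's own array survives
data = add_field_arrays(data, "Mass,Charge\n1.5,0.1\n2.5,0.2\n")
print(sorted(data))  # ['Charge', 'ExistingArray', 'Mass']
```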


==Angular Periodic Filter==

This filter generates a periodic multiblock dataset.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the input to the Periodic filter.
|
|
Accepts input of following types:
* vtkDataSet
|-
|'''BlockIndices''' (BlockIndices)
|
This property lists the ids of the blocks to make periodic from the input multiblock dataset.
|
|
|-
|'''IterationMode''' (IterationMode)
|
This property specifies the mode of iteration, either a user-provided number of periods, or the maximum number of periods to rotate to 360°.
|
1
|
The value(s) is an enumeration of the following:
* Manual (0)
* Maximum (1)
|-
|'''NumberOfPeriods''' (NumberOfPeriods)
|
This property specifies the number of iterations.
|
3
|
|-
|'''RotationMode''' (RotationMode)
|
This property specifies the mode of rotation, either from a user-provided angle or from an array in the data.
|
0
|
The value(s) is an enumeration of the following:
* Direct Angle (0)
* Array Value (1)
|-
|'''RotationAngle''' (RotationAngle)
|
Rotation angle in degrees.
|
10
|
|-
|'''RotationArrayName''' (RotationArrayName)
|
Field array name that contains the rotation angle in radians.
|
periodic angle
|
|-
|'''Axis''' (Axis)
|
This property specifies the axis of rotation.
|
0
|
The value(s) is an enumeration of the following:
* Axis X (0)
* Axis Y (1)
* Axis Z (2)
|-
|'''Center''' (Center)
|
This property specifies the 3D coordinates for the center of the rotation.
|
0.0 0.0 0.0
|
|}

==Calculator==

Compute new attribute arrays as function of existing arrays.

The Calculator filter computes a new data array or new point coordinates as a function of existing scalar or vector arrays. If point-centered arrays are used in the computation of a new data array, the resulting array will also be point-centered. Similarly, computations using cell-centered arrays will produce a new cell-centered array. If the function is computing point coordinates, the result of the function must be a three-component vector. The Calculator interface operates similarly to a scientific calculator. In creating the function to evaluate, the standard order of operations applies.<br>
Each of the calculator functions is described below. Unless otherwise noted, enclose the operand in parentheses using the ( and ) buttons.<br>
Clear: Erase the current function (displayed in the read-only text box above the calculator buttons).<br>
/: Divide one scalar by another. The operands for this function are not required to be enclosed in parentheses.<br>
*: Multiply two scalars, or multiply a vector by a scalar (scalar multiple). The operands for this function are not required to be enclosed in parentheses.<br>
-: Negate a scalar or vector (unary minus), or subtract one scalar or vector from another. The operands for this function are not required to be enclosed in parentheses.<br>
+: Add two scalars or two vectors. The operands for this function are not required to be enclosed in parentheses.<br>
sin: Compute the sine of a scalar.<br>
cos: Compute the cosine of a scalar.<br>
tan: Compute the tangent of a scalar.<br>
asin: Compute the arcsine of a scalar.<br>
acos: Compute the arccosine of a scalar.<br>
atan: Compute the arctangent of a scalar.<br>
sinh: Compute the hyperbolic sine of a scalar.<br>
cosh: Compute the hyperbolic cosine of a scalar.<br>
tanh: Compute the hyperbolic tangent of a scalar.<br>
min: Compute minimum of two scalars.<br>
max: Compute maximum of two scalars.<br>
x^y: Raise one scalar to the power of another scalar. The operands for this function are not required to be enclosed in parentheses.<br>
sqrt: Compute the square root of a scalar.<br>
e^x: Raise e to the power of a scalar.<br>
log: Compute the logarithm of a scalar (deprecated; same as log10).<br>
log10: Compute the logarithm of a scalar to the base 10.<br>
ln: Compute the logarithm of a scalar to the base 'e'.<br>
ceil: Compute the ceiling of a scalar.<br>
floor: Compute the floor of a scalar.<br>
abs: Compute the absolute value of a scalar.<br>
v1.v2: Compute the dot product of two vectors. The operands for this function are not required to be enclosed in parentheses.<br>
cross: Compute cross product of two vectors.<br>
mag: Compute the magnitude of a vector.<br>
norm: Normalize a vector.<br>
The operands are described below.<br>
The digits 0 - 9 and the decimal point are used to enter constant scalar values.<br>
iHat, jHat, and kHat are vector constants representing unit vectors in the X, Y, and Z directions, respectively.<br>
The scalars menu lists the names of the scalar arrays and the components of the vector arrays of either the point-centered or cell-centered data. The vectors menu lists the names of the point-centered or cell-centered vector arrays. The function will be computed for each point (or cell) using the scalar or vector value of the array at that point (or cell).<br>
The filter operates on any type of data set, but the input data set must have at least one scalar or vector array. The arrays can be either point-centered or cell-centered. The Calculator filter's output is of the same data set type as the input.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Attribute Mode'''<br>''(AttributeMode)''
|
This property determines whether the computation is to be performed on point-centered or cell-centered data.
|
0
|
The value must be one of the following: point_data (1), cell_data (2), field_data (5).
|-
| '''Coordinate Results'''<br>''(CoordinateResults)''
|
The value of this property determines whether the results of this computation should be used as point coordinates or as a new array.
|
0
|
Only the values 0 and 1 are accepted.
|-
| '''Function'''<br>''(Function)''
|
This property contains the equation for computing the new array.
|
|
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input dataset to the Calculator filter. The scalar and vector variables may be chosen from this dataset's arrays.
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
|-
| '''Replace Invalid Results'''<br>''(ReplaceInvalidValues)''
|
This property determines whether invalid values in the computation will be replaced with a specific value. (See the ReplacementValue property.)
|
1
|
Only the values 0 and 1 are accepted.
|-
| '''Replacement Value'''<br>''(ReplacementValue)''
|
If invalid values in the computation are to be replaced with another value, this property contains that value.
|
0
|
|-
| '''Result Array Name'''<br>''(ResultArrayName)''
|
This property contains the name for the output array containing the result of this computation.
|
Result
|
|}
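The Calculator's functions map onto ordinary per-element array math. A sketch of what a point-centered `mag(Velocity)` evaluation produces (the vector values are made up; in ParaView the result would land in an array named by Result Array Name, "Result" by default):

```python
import math

# Hypothetical point-centered vector array.
velocity = [(1.0, 2.0, 2.0), (0.0, 3.0, 4.0)]

# mag(Velocity): evaluated independently at each point, so the output
# is point-centered like its operand.
result = [math.sqrt(vx * vx + vy * vy + vz * vz) for vx, vy, vz in velocity]
print(result)  # [3.0, 5.0]
```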
==Cell Centers==
Create a point (no geometry) at the center of each input cell.

The Cell Centers filter places a point at the center of each cell in the input data set. The center computed is the parametric center of the cell, not necessarily the geometric or bounding box center. The cell attributes of the input will be associated with these newly created points of the output. You have the option of creating a vertex cell per point in the output. This is useful because vertex cells are rendered, but points are not. The points themselves could be used for placing glyphs (using the Glyph filter). The Cell Centers filter takes any type of data set as input and produces a polygonal data set as output.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Cell Centers filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
|-
| '''Vertex Cells'''<br>''(VertexCells)''
|
If set to 1, a vertex cell will be generated per point in the output. Otherwise only points will be generated.
|
0
|
Only the values 0 and 1 are accepted.
|}

==Annotate Attribute Data==

Adds a text annotation to a Render View. This filter can be used to add a text annotation to a Render View (or similar) using a tuple from any attribute array (point/cell/field/row etc.) from a specific rank (when running in parallel). Use the **ArrayName** property to select the array association and array name. Use the **ElementId** property to set the element number to extract the value to label with. When running on multiple ranks, use the **ProcessId** property to select the rank of interest. The **Prefix** property can be used to specify a string that will be used as the prefix to the generated annotation text.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
Set the input of the filter. To avoid the complications/confusion when identifying elements in a composite dataset, this filter doesn't currently support composite datasets.
|
|
Accepts input of following types:
* vtkDataSet
* vtkTable
The dataset must contain a field array (any) with 1 component(s).
|-
|'''ArrayAssociation''' (ArrayAssociation)
|
Select the attribute used to populate array names from.
|
2
|
The value(s) is an enumeration of the following:
* Point Data (0)
* Cell Data (1)
* Field Data (2)
* Row Data (6)
|-
|'''ArrayName''' (ArrayName)
|
Choose the array to be displayed.
|
|
|-
|'''ElementId''' (ElementId)
|
Set the element index to annotate with.
|
0
|
|-
|'''ProcessId''' (ProcessId)
|
Set the process rank to extract the element from.
|
0
|
|-
|'''Prefix''' (Prefix)
|
Text that is used as a prefix to the field value.
|
Value is:
|
|}
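Annotate Attribute Data boils down to pulling one tuple out of the selected array and prefixing it. A sketch of the ElementId / Prefix behaviour on a hypothetical row array (the array name and values are illustrative):

```python
def annotate(array, element_id, prefix="Value is: "):
    """Build the annotation text from one element of an attribute array."""
    return prefix + str(array[element_id])

pressures = [101.3, 99.8, 102.1]   # hypothetical per-row values
print(annotate(pressures, 1))      # Value is: 99.8
```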
 
==Cell Data to Point Data==

Create point attributes by averaging cell attributes.

The Cell Data to Point Data filter averages the values of the cell attributes of the cells surrounding a point to compute point attributes. The Cell Data to Point Data filter operates on any type of data set, and the output data set is of the same type as the input.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Cell Data to Point Data filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The dataset must contain a cell array.
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
|-
| '''Pass Cell Data'''<br>''(PassCellData)''
|
If this property is set to 1, then the input cell data is passed through to the output; otherwise, only the generated point data will be available in the output.
|
0
|
Only the values 0 and 1 are accepted.
|-
| '''Piece Invariant'''<br>''(PieceInvariant)''
|
If the value of this property is set to 1, this filter will request ghost levels so that the values at boundary points match across processes. NOTE: Enabling this option might cause multiple executions of the data source because more information is needed to remove internal surfaces.
|
0
|
Only the values 0 and 1 are accepted.
|}

==Annotate Global Data==

Filter for annotating with global data (designed for the ExodusII reader). Annotate Global Data provides a simpler API for creating text annotations using vtkPythonAnnotationFilter. Instead of users specifying the annotation expression, this filter determines the expression based on the array selected, limiting the scope of the functionality. This filter only allows the user to annotate using "global data" (i.e., field data) and to specify the string prefix to use. If the field array chosen has as many elements as the number of timesteps, the array is assumed to be "temporal" and is indexed using the current timestep.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
Set the input of the filter.
|
|
Accepts input of following types:
* vtkDataSet
The dataset must contain a field array (none) with 1 component(s).
|-
|'''SelectArrays''' (SelectArrays)
|
Choose the arrays to be displayed.
|
|
|-
|'''Prefix''' (Prefix)
|
Text that is used as a prefix to the field value.
|
Value is:
|
|-
|'''Suffix''' (Suffix)
|
Text that is used as a suffix to the field value.
|
|
|}
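Annotate Global Data's temporal rule — a field array with exactly one value per timestep is treated as "temporal" and indexed by the current timestep, otherwise the value is a plain global — can be sketched directly; the array contents and function name below are illustrative.

```python
def global_value(field_array, num_timesteps, current_step):
    """Pick the value Annotate Global Data would display."""
    if len(field_array) == num_timesteps:   # assumed "temporal": one entry per step
        return field_array[current_step]
    return field_array[0]                   # plain global value

kinetic_energy = [0.1, 0.4, 0.9]            # hypothetical: one entry per timestep
print(global_value(kinetic_energy, 3, 2))   # 0.9
```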


==Annotate Time Filter==


==Clean==
Shows input data time as text annnotation in the view.The Annotate Time
 
filter can be used to show the data time in a text
 
annotation.
Merge coincident points if they do not meet a feature edge criteria.
 
The Clean filter takes polygonal data as input and generates polygonal data as output. This filter can merge duplicate points, remove unused points, and transform degenerate cells into their appropriate forms (e.g., a triangle is converted into a line if two of its points are merged).<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 569: Line 759:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Absolute Tolerance'''<br>''(AbsoluteTolerance)''
|'''Input''' (Input)
|
This property specifies the input dataset for which to
display the time.
|
|
If merging nearby points (see PointMerging property) and using absolute tolerance (see ToleranceIsAbsolute property), this property specifies the tolerance for performing merging in the spatial units of the input data set.


| 1
|
|
The value must be greater than or equal to 0.


|-
|-
| '''Convert Lines To Points'''<br>''(ConvertLinesToPoints)''
|'''Format''' (Format)
|
|
If this property is set to 1, degenerate lines (a "line" whose endpoints are at the same spatial location) will be converted to points.
The value of this property is a format string used to
 
display the input time. The format string is specified using printf
| 1
style.
|
Time: %f
|
|
Only the values 0 and 1 are accepted.


|-
|-
| '''Convert Polys To Lines'''<br>''(ConvertPolysToLines)''
|'''Shift''' (Shift)
|
The amount of time the input is shifted (after
scaling).
|
|
If this property is set to 1, degenerate polygons (a "polygon" with only two distinct point coordinates) will be converted to lines.
0.0
 
| 1
|
|
Only the values 0 and 1 are accepted.


|-
|-
| '''Convert Strips To Polys'''<br>''(ConvertStripsToPolys)''
|'''Scale''' (Scale)
|
|
If this property is set to 1, degenerate triangle strips (a triangle "strip" containing only one triangle) will be converted to triangles.
The factor by which the input time is
 
scaled.
| 1
|
|
Only the values 0 and 1 are accepted.
1.0
 
 
|-
| '''Input'''<br>''(Input)''
|
|
Set the input to the Clean filter.


|
|
The selected object must be the result of the following: sources (includes readers), filters.


|}


The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.
==Append Attributes==


Copies geometry from first input. Puts all of the arrays into the output.
The Append Attributes filter takes multiple input data
sets with the same geometry and merges their point and
cell attributes to produce a single output containing all
the point and cell attributes of the inputs. Any inputs
without the same number of points and cells as the first
input are ignored. The input data sets must already be
collected together, either as a result of a reader that
loads multiple parts (e.g., EnSight reader) or because the
Group Parts filter has been run to form a collection of
data sets.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Piece Invariant'''<br>''(PieceInvariant)''
| '''Property'''
|
| '''Description'''
If this property is set to 1, the whole data set will be processed at once so that cleaning the data set always produces the same results. If it is set to 0, the data set can be processed one piece at a time, so it is not necessary for the entire data set to fit into memory; however the results are not guaranteed to be the same as they would be if the Piece invariant option was on. Setting this option to 0 may produce seams in the output dataset when ParaView is run in parallel.
| '''Default Value(s)'''
 
| '''Restrictions'''
| 1
|
Only the values 0 and 1 are accepted.
 


|-
|-
| '''Point Merging'''<br>''(PointMerging)''
|'''Input''' (Input)
|
This property specifies the input to the Append
Attributes filter.
|
|
If this property is set to 1, then points will be merged if they are within the specified Tolerance or AbsoluteTolerance (see the Tolerance and AbsoluteTolerance properties), depending on the value of the ToleranceIsAbsolute property. If this property is set to 0, points will not be merged.


| 1
|
|
Only the values 0 and 1 are accepted.
Accepts input of following types:
* vtkDataSet


|}


|-
==Append Datasets==
| '''Tolerance'''<br>''(Tolerance)''
|
If merging nearby points (see PointMerging property) and not using absolute tolerance (see ToleranceIsAbsolute property), this property specifies the tolerance for performing merging as a fraction of the length of the diagonal of the bounding box of the input data set.


| 0
Takes an input of multiple datasets and output has only one unstructured grid.The Append
|
Datasets filter operates on multiple data sets of any type
The value must be greater than or equal to 0 and less than or equal to 1.
(polygonal, structured, etc.). It merges their geometry
into a single data set. Only the point and cell attributes
that all of the input data sets have in common will appear
in the output. The input data sets must already be
collected together, either as a result of a reader that
loads multiple parts (e.g., EnSight reader) or because the
Group Parts filter has been run to form a collection of
data sets.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|-
| '''Tolerance Is Absolute'''<br>''(ToleranceIsAbsolute)''
|'''Input''' (Input)
|
This property specifies the datasets to be merged into a
single dataset by the Append Datasets filter.
|
|
This property determines whether to use absolute or relative (a percentage of the bounding box) tolerance when performing point merging.


| 0
|
|
Only the values 0 and 1 are accepted.
Accepts input of following types:
 
* vtkDataSet


|}
|}


==Append Geometry==


==Clean Cells to Grid==
Takes an input of multiple poly data parts and output has only one part.The Append
 
Geometry filter operates on multiple polygonal data sets.
 
It merges their geometry into a single data set. Only the
This filter merges cells and converts the data set to unstructured grid.
point and cell attributes that all of the input data sets
 
have in common will appear in the output.
Merges degenerate cells. Assumes the input grid does not contain duplicate<br>
points. You may want to run vtkCleanUnstructuredGrid first to ensure this. If<br>
duplicated cells are found they are removed in the output. The filter also<br>
handles the case, where a cell may contain degenerate nodes (i.e. one and<br>
the same node is referenced by a cell more than once).<br>
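The two clean-ups this filter performs can be sketched as follows. This is a hedged illustration, not the vtk implementation: cells are assumed to be tuples of point ids, and duplicate cells are detected here by their point-id set, which may differ from vtk's exact rule.

```python
# Sketch of Clean Cells to Grid: drop duplicated cells, and collapse nodes
# that a cell references more than once (degenerate nodes).
def clean_cells(cells):
    seen = set()
    out = []
    for cell in cells:
        # collapse repeated node references, preserving order
        deduped = tuple(dict.fromkeys(cell))
        key = frozenset(deduped)
        if key in seen:
            continue  # duplicated cells are removed from the output
        seen.add(key)
        out.append(deduped)
    return out

# (2, 1, 0) duplicates (0, 1, 2); (3, 3, 4) references node 3 twice
print(clean_cells([(0, 1, 2), (2, 1, 0), (3, 3, 4)]))
```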


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
Set the input to the Append Geometry
filter.
|
|
This property specifies the input to the Clean Cells to Grid filter.


|
|
|
Accepts input of following types:
The selected object must be the result of the following: sources (includes readers), filters.
* vtkPolyData
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkUnstructuredGrid.
 


|}
|}


==Block Scalars==


==Clean to Grid==
The Level Scalars filter uses colors to show levels of a multiblock dataset.The Level
 
Scalars filter uses colors to show levels of a multiblock
 
dataset.
This filter merges points and converts the data set to unstructured grid.
 
The Clean to Grid filter merges points that are exactly coincident. It also converts the data set to an unstructured grid. You may wish to do this if you want to apply a filter to your data set that is available for unstructured grids but not for the initial type of your data set (e.g., applying warp vector to volumetric data). The Clean to Grid filter operates on any type of data set.<br>
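The exact-coincidence merge can be sketched in plain Python. This is an assumption-laden illustration (points as coordinate tuples, cells as tuples of point ids), not the filter's actual implementation, which operates on full VTK data sets.

```python
def merge_coincident_points(points, cells):
    """Merge exactly coincident points and remap cell connectivity.
    Illustrative sketch of the Clean to Grid merge step only."""
    unique = {}   # coordinate tuple -> new point id
    remap = []    # old point id -> new point id
    for p in points:
        key = tuple(p)
        if key not in unique:
            unique[key] = len(unique)
        remap.append(unique[key])
    new_points = list(unique)
    new_cells = [tuple(remap[i] for i in cell) for cell in cells]
    return new_points, new_cells

pts = [(0, 0), (1, 0), (0, 0), (1, 1)]   # point 2 coincides with point 0
pts2, cells2 = merge_coincident_points(pts, [(0, 1, 3), (2, 1, 3)])
print(pts2, cells2)
```

After merging, both cells reference the same three surviving points.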


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
This property specifies the input to the Level Scalars
filter.
|
|
This property specifies the input to the Clean to Grid filter.


|
|
|
Accepts input of following types:
The selected object must be the result of the following: sources (includes readers), filters.
* vtkMultiBlockDataSet
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
 


|}
|}


==CTH Surface==


==Clip==
Not finished yet.
 
 
Clip with an implicit plane. Clipping does not reduce the dimensionality of the data set. The output data type of this filter is always an unstructured grid.
 
The Clip filter cuts away a portion of the input data set using an implicit plane. This filter operates on all types of data sets, and it returns unstructured grid data on output.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Clip Type'''<br>''(ClipFunction)''
|'''Input''' (Input)
|
This property specifies the input of the
filter.
|
|
This property specifies the parameters of the clip function (an implicit plane) used to clip the dataset.


|
|
|
Accepts input of following types:
The value must be set to one of the following: Plane, Box, Sphere, Scalar.
* vtkCompositeDataSet


|}


|-
==CacheKeeper==
| '''Input'''<br>''(Input)''
|
This property specifies the dataset on which the Clip filter will operate.


|
vtkPVCacheKeeper manages data cache for flip book
|
animations. When caching is disabled, this simply acts as a pass through
The selected object must be the result of the following: sources (includes readers), filters.
filter. When caching is enabled, if the current time step has been
 
previously cached then this filter suppresses the update request; otherwise it
 
propagates the update and then caches the result for later use. The
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
current time step is set using SetCacheTime().


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|-
| '''Inside Out'''<br>''(InsideOut)''
|'''Input''' (Input)
|
Set the input to the Update Suppressor
filter.
|
|
If this property is set to 0, the clip filter will return that portion of the dataset that lies within the clip function. If set to 1, the portions of the dataset that lie outside the clip function will be returned instead.


| 0
|
|
Only the values 0 and 1 are accepted.


|-
|-
| '''Scalars'''<br>''(SelectInputScalars)''
|'''CacheTime''' (CacheTime)
|
|
If clipping with scalars, this property specifies the name of the scalar array on which to perform the clip operation.


|
|
0.0
|
|
An array of scalars is required.
Valid array names will be chosen from point and cell data.


|-
|-
| '''Use Value As Offset'''<br>''(UseValueAsOffset)''
|'''CachingEnabled''' (CachingEnabled)
|
|
If UseValueAsOffset is true, Value is used as an offset parameter to the implicit function. Otherwise, Value is used only when clipping using a scalar array.
Toggle whether the caching is enabled.
 
| 0
|
|
Only the values 0 and 1 are accepted.
1
 
 
|-
| '''Value'''<br>''(Value)''
|
|
If clipping with scalars, this property sets the scalar value about which to clip the dataset based on the scalar array chosen. (See SelectInputScalars.) If clipping with a clip function, this property specifies an offset from the clip function to use in the clipping operation. Neither functionality is currently available in ParaView's user interface.
Accepts boolean values (0 or 1).
 
| 0
|
The value must lie within the range of the selected data array.
 


|}
|}


==Calculator==


==Clip Closed Surface==
Compute new attribute arrays as function of existing arrays.
The Calculator filter computes a new data array or new point
coordinates as a function of existing scalar or vector arrays. If
point-centered arrays are used in the computation of a new data array,
the resulting array will also be point-centered. Similarly,
computations using cell-centered arrays will produce a new
cell-centered array. If the function is computing point coordinates,
the result of the function must be a three-component vector.


The Calculator interface operates similarly to a scientific
calculator. In creating the function to evaluate, the standard order
of operations applies. Each of the calculator functions is described
below. Unless otherwise noted, enclose the operand in parentheses
using the ( and ) buttons.


Clip a polygonal dataset with a plane to produce closed surfaces
- Clear: Erase the current function (displayed in the read-only text
box above the calculator buttons).
- /: Divide one scalar by another. The operands for this function are
not required to be enclosed in parentheses.
- *: Multiply two scalars, or multiply a vector by a scalar (scalar multiple).
The operands for this function are not required to be enclosed in parentheses.
- -: Negate a scalar or vector (unary minus), or subtract one scalar or vector
from another. The operands for this function are not required to be enclosed
in parentheses.
- +: Add two scalars or two vectors. The operands for this function are not
required to be enclosed in parentheses.
- sin: Compute the sine of a scalar. cos: Compute the cosine of a scalar.
- tan: Compute the tangent of a scalar.
- asin: Compute the arcsine of a scalar.
- acos: Compute the arccosine of a scalar.
- atan: Compute the arctangent of a scalar.
- sinh: Compute the hyperbolic sine of a scalar.
- cosh: Compute the hyperbolic cosine of a scalar.
- tanh: Compute the hyperbolic tangent of a scalar.
- min: Compute minimum of two scalars.
- max: Compute maximum of two scalars.
- x^y: Raise one scalar to the power of another scalar. The operands for
this function are not required to be enclosed in parentheses.
- sqrt: Compute the square root of a scalar.
- e^x: Raise e to the power of a scalar.
- log: Compute the logarithm of a scalar (deprecated; same as log10).
- log10: Compute the logarithm of a scalar to the base 10.
- ln: Compute the logarithm of a scalar to the base 'e'.
- ceil: Compute the ceiling of a scalar. floor: Compute the floor of a scalar.
- abs: Compute the absolute value of a scalar.
- v1.v2: Compute the dot product of two vectors. The operands for this
function are not required to be enclosed in parentheses.
- cross: Compute cross product of two vectors.
- mag: Compute the magnitude of a vector.
- norm: Normalize a vector.


This clip filter cuts away a portion of the input polygonal dataset using a plane to generate a new polygonal dataset.<br>
The operands are described below. The digits 0 - 9 and the decimal
point are used to enter constant scalar values. **iHat**, **jHat**,
and **kHat** are vector constants representing unit vectors in the X,
Y, and Z directions, respectively. The scalars menu lists the names of
the scalar arrays and the components of the vector arrays of either
the point-centered or cell-centered data. The vectors menu lists the
names of the point-centered or cell-centered vector arrays. The
function will be computed for each point (or cell) using the scalar or
vector value of the array at that point (or cell). The filter operates
on any type of data set, but the input data set must have at least one
scalar or vector array. The arrays can be either point-centered or
cell-centered. The Calculator filter's output is of the same data set
type as the input.
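The per-point evaluation described above can be sketched as follows. Using Python's `eval` with a restricted namespace is purely an illustration of "apply the function at each point using that point's array values"; it is not how ParaView parses Calculator expressions.

```python
import math

# Sketch of Calculator-style evaluation: a function string is applied point
# by point over the named scalar arrays supplied in point_arrays.
def calculate(function, point_arrays, n_points):
    env = {"sqrt": math.sqrt, "sin": math.sin, "cos": math.cos,
           "tan": math.tan, "ln": math.log, "log10": math.log10,
           "abs": abs, "min": min, "max": max}
    result = []
    for i in range(n_points):
        scope = dict(env)
        # bind each array name to its value at point i
        scope.update({name: arr[i] for name, arr in point_arrays.items()})
        result.append(eval(function, {"__builtins__": {}}, scope))
    return result

arrays = {"X": [3.0, 0.0], "Y": [4.0, 2.0]}
print(calculate("sqrt(X*X + Y*Y)", arrays, 2))  # magnitude per point
```

The output array is point-centered because the operands were point-centered, matching the rule stated above.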


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Base Color'''<br>''(BaseColor)''
|'''Input''' (Input)
|
This property specifies the input dataset to the
Calculator filter. The scalar and vector variables may be chosen from
this dataset's arrays.
|
|
Specify the color for the faces from the input.


| 0.1 0.1 1
|
|
The value must be greater than or equal to (0, 0, 0) and less than or equal to (1, 1, 1).
Accepts input of following types:
 
* vtkDataSet
The dataset must contain a field array ()


|-
|-
| '''Clip Color'''<br>''(ClipColor)''
|'''AttributeMode''' (AttributeMode)
|
|
Specify the color for the capping faces (generated on the clipping interface).
This property determines whether the computation is to
 
be performed on point-centered or cell-centered data.
| 1 0.11 0.1
|
1
|
|
The value must be greater than or equal to (0, 0, 0) and less than or equal to (1, 1, 1).
The value(s) is an enumeration of the following:
 
* Point Data (1)
 
* Cell Data (2)
|-
|-
| '''Clipping Plane'''<br>''(ClippingPlane)''
|'''CoordinateResults''' (CoordinateResults)
|
|
This property specifies the parameters of the clipping plane used to clip the polygonal data.
The value of this property determines whether the
 
results of this computation should be used as point coordinates or as a
new array.
|
|
0
|
|
The value must be set to one of the following: Plane.
Accepts boolean values (0 or 1).
 
 
|-
|-
| '''Generate Cell Origins'''<br>''(GenerateColorScalars)''
|'''ResultNormals''' (ResultNormals)
|
|
Generate (cell) data for coloring purposes such that the newly generated cells (including capping faces and clipping outlines) can be distinguished from the input cells.
Set whether to output results as point/cell
 
normals. Outputting as normals is only valid with vector
| 0
results. Point or cell normals are selected using
AttributeMode.
|
0
|
|
Only the values 0 and 1 are accepted.
Accepts boolean values (0 or 1).
 
 
|-
|-
| '''Generate Faces'''<br>''(GenerateFaces)''
|'''ResultTCoords''' (ResultTCoords)
|
Set whether to output results as point/cell
texture coordinates. Point or cell texture coordinates are
selected using AttributeMode. 2-component texture coordinates
cannot be generated at this time.
|
|
Generate polygonal faces in the output.
0
 
| 1
|
|
Only the values 0 and 1 are accepted.
Accepts boolean values (0 or 1).
 
 
|-
|-
| '''Generate Outline'''<br>''(GenerateOutline)''
|'''ResultArrayName''' (ResultArrayName)
|
This property contains the name for the output array
containing the result of this computation.
|
Result
|
|
Generate clipping outlines in the output wherever an input face is cut by the clipping plane.


| 0
|-
|'''Function''' (Function)
|
|
Only the values 0 and 1 are accepted.


This property contains the equation for computing the new
array.


|-
| '''Input'''<br>''(Input)''
|
|
This property specifies the dataset on which the Clip filter will operate.


|
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.


|-
|-
| '''Inside Out'''<br>''(InsideOut)''
|'''Replace Invalid Results''' (ReplaceInvalidValues)
|
|
If this flag is turned off, the clipper will return the portion of the data that lies within the clipping plane. Otherwise, the clipper will return the portion of the data that lies outside the clipping plane.
This property determines whether invalid values in the
 
computation will be replaced with a specific value. (See the
| 0
ReplacementValue property.)
|
1
|
|
Only the values 0 and 1 are accepted.
Accepts boolean values (0 or 1).
 
 
|-
|-
| '''Clipping Tolerance'''<br>''(Tolerance)''
|'''ReplacementValue''' (ReplacementValue)
|
|
Specify the tolerance for creating new points. A small value might incur degenerate triangles.
If invalid values in the computation are to be replaced
 
with another value, this property contains that value.
| 1e-06
|
0.0
|
|
|}




==Clip Generic Dataset==
|}


==Cell Centers==


Clip with an implicit plane, sphere, or with scalars. Clipping does not reduce the dimensionality of the data set. The output data type of this filter is always an unstructured grid.
Create a point (no geometry) at the center of each input cell.The Cell Centers
 
filter places a point at the center of each cell in the
The Generic Clip filter cuts away a portion of the input data set using a plane, a sphere, a box, or a scalar value. The menu in the Clip Function portion of the interface allows the user to select which implicit function to use or whether to clip using a scalar value. Making this selection loads the appropriate user interface. For the implicit functions, the appropriate 3D widget (plane, sphere, or box) is also displayed. The use of these 3D widgets, including their user interface components, is discussed in section 7.4.<br>
input data set. The center computed is the parametric
If an implicit function is selected, the clip filter returns that portion of the input data set that lies inside the function. If Scalars is selected, then the user must specify a scalar array to clip according to. The clip filter will return the portions of the data set whose value in the selected Scalars array is larger than the Clip value. Regardless of the selection from the Clip Function menu, if the Inside Out option is checked, the opposite portions of the data set will be returned.<br>
center of the cell, not necessarily the geometric or
This filter operates on all types of data sets, and it returns unstructured grid data on output.<br>
bounding box center. The cell attributes of the input will
be associated with these newly created points of the
output. You have the option of creating a vertex cell per
point in the output. This is useful because vertex cells
are rendered, but points are not. The points themselves
could be used for placing glyphs (using the Glyph filter).
The Cell Centers filter takes any type of data set as
input and produces a polygonal data set as
output.
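A minimal sketch of the idea: one output point per input cell, with an optional vertex cell per point. Averaging the cell's corner coordinates stands in here for VTK's parametric center (for simplices the two coincide); the data layout is assumed for illustration.

```python
# Sketch of Cell Centers: place one output point per input cell, optionally
# emitting a vertex cell per point so the points are rendered.
def cell_centers(points, cells, vertex_cells=False):
    centers = []
    for cell in cells:
        corners = [points[i] for i in cell]
        n = len(corners)
        centers.append(tuple(sum(c[d] for c in corners) / n
                             for d in range(len(corners[0]))))
    verts = [(i,) for i in range(len(centers))] if vertex_cells else []
    return centers, verts

pts = [(0.0, 0.0), (2.0, 0.0), (0.0, 2.0)]
centers, verts = cell_centers(pts, [(0, 1, 2)], vertex_cells=True)
print(centers, verts)
```

The resulting points could then be used for placing glyphs, as noted above.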


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Clip Type'''<br>''(ClipFunction)''
|'''Input''' (Input)
|
This property specifies the input to the Cell Centers
filter.
|
|
Set the parameters of the clip function.


|
|
|
Accepts input of following types:
The value must be set to one of the following: Plane, Box, Sphere, Scalar.
* vtkDataSet
 
 
|-
|-
| '''Input'''<br>''(Input)''
|'''VertexCells''' (VertexCells)
|
|
Set the input to the Generic Clip filter.
If set to 1, a vertex cell will be generated per point
 
in the output. Otherwise only points will be generated.
|
|
0
|
|
The selected object must be the result of the following: sources (includes readers), filters.
Accepts boolean values (0 or 1).


|}


The selected dataset must be one of the following types (or a subclass of one of them): vtkGenericDataSet.
==Cell Data to Point Data==


Create point attributes by averaging cell attributes.The Cell
Data to Point Data filter averages the values of the cell
attributes of the cells surrounding a point to compute
point attributes. The Cell Data to Point Data filter
operates on any type of data set, and the output data set
is of the same type as the input.
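The averaging rule can be sketched directly: each point receives the mean of the values of all cells that use it. Cells as tuples of point ids is an assumption made for the example.

```python
# Sketch of Cell Data to Point Data: average the cell values of the cells
# surrounding each point to produce a point-centered array.
def cell_data_to_point_data(n_points, cells, cell_values):
    sums = [0.0] * n_points
    counts = [0] * n_points
    for cell, value in zip(cells, cell_values):
        for pid in cell:
            sums[pid] += value
            counts[pid] += 1
    # points used by no cell get 0.0 in this sketch
    return [s / c if c else 0.0 for s, c in zip(sums, counts)]

# two triangles sharing the edge (1, 2)
print(cell_data_to_point_data(4, [(0, 1, 2), (1, 2, 3)], [10.0, 20.0]))
```

Points on the shared edge average both cell values; the others take the single adjacent cell's value.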


{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Inside Out'''<br>''(InsideOut)''
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input to the Cell Data to
Point Data filter.
|
|
Choose which portion of the dataset should be clipped away.


| 0
|
|
Only the values 0 and 1 are accepted.
Accepts input of following types:
 
* vtkDataSet
The dataset must contain a field array (cell)


|-
|-
| '''Scalars'''<br>''(SelectInputScalars)''
|'''PassCellData''' (PassCellData)
|
|
If clipping with scalars, this property specifies the name of the scalar array on which to perform the clip operation.
If this property is set to 1, then the input cell data
 
is passed through to the output; otherwise, only the generated point
data will be available in the output.
|
|
0
|
|
An array of scalars is required.
Accepts boolean values (0 or 1).
 
 
Valid array names will be chosen from point and cell data.
 
 
|-
|-
| '''Value'''<br>''(Value)''
|'''PieceInvariant''' (PieceInvariant)
|
|
If clipping with a scalar array, choose the clipping value.
If the value of this property is set to 1, this filter
 
will request ghost levels so that the values at boundary points match
| 0
across processes. NOTE: Enabling this option might cause multiple
executions of the data source because more information is needed to
remove internal surfaces.
|
0
|
|
The value must lie within the range of the selected data array.
Accepts boolean values (0 or 1).
 


|}
|}


==Clean==


==Compute Derivatives==
Merge coincident points if they do not meet a feature edge criteria.The Clean filter
 
takes polygonal data as input and generates polygonal data
 
as output. This filter can merge duplicate points, remove
This filter computes derivatives of scalars and vectors.
unused points, and transform degenerate cells into their
 
appropriate forms (e.g., a triangle is converted into a
CellDerivatives is a filter that computes derivatives of scalars and vectors at the center of cells. You can choose to generate different output including the scalar gradient (a vector), computed tensor vorticity (a vector), gradient of input vectors (a tensor), and strain matrix of the input vectors (a tensor); or you may choose to pass data through to the output.<br>
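For the simplest case, the scalar gradient at a cell center can be sketched on 1D line cells, where the derivative over a cell is just the difference quotient across its endpoints. This toy example only hints at the idea; the real filter handles full 3D cells and can also emit vorticity and strain tensors.

```python
# Sketch: derivative of a point scalar at the center of each line cell,
# (s1 - s0) / (x1 - x0). Cells are (i0, i1) index pairs, an assumed layout.
def cell_derivatives(xs, scalars, cells):
    grads = []
    for i0, i1 in cells:
        grads.append((scalars[i1] - scalars[i0]) / (xs[i1] - xs[i0]))
    return grads

xs = [0.0, 1.0, 3.0]
s = [0.0, 2.0, 2.0]          # rises over the first cell, flat over the second
print(cell_derivatives(xs, s, [(0, 1), (1, 2)]))
```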
line if two of its points are merged).


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 1,002: Line 1,274:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
Set the input to the Clean filter.
|
|
This property specifies the input to the filter.


|
|
Accepts input of following types:
* vtkPolyData
|-
|'''PieceInvariant''' (PieceInvariant)
|
If this property is set to 1, the whole data set will be
processed at once so that cleaning the data set always produces the
same results. If it is set to 0, the data set can be processed one
piece at a time, so it is not necessary for the entire data set to fit
into memory; however the results are not guaranteed to be the same as
they would be if the Piece invariant option was on. Setting this option
to 0 may produce seams in the output dataset when ParaView is run in
parallel.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''Tolerance''' (Tolerance)
|
If merging nearby points (see PointMerging property) and
not using absolute tolerance (see ToleranceIsAbsolute property), this
property specifies the tolerance for performing merging as a fraction
of the length of the diagonal of the bounding box of the input data
set.
|
0.0
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.


|-
|'''AbsoluteTolerance''' (AbsoluteTolerance)
|
If merging nearby points (see PointMerging property) and
using absolute tolerance (see ToleranceIsAbsolute property), this
property specifies the tolerance for performing merging in the spatial
units of the input data set.
|
1.0
|


|-
|-
| '''Output Tensor Type'''<br>''(OutputTensorType)''
|'''ToleranceIsAbsolute''' (ToleranceIsAbsolute)
|
This property determines whether to use absolute or
relative (a percentage of the bounding box) tolerance when performing
point merging.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''ConvertLinesToPoints''' (ConvertLinesToPoints)
|
|
This property controls how the filter works to generate tensor cell data. You can choose to compute the gradient of the input vectors, or compute the strain tensor of the vector gradient tensor. By default, the filter will take the gradient of the vector data to construct a tensor.
If this property is set to 1, degenerate lines (a "line"
 
whose endpoints are at the same spatial location) will be converted to
| 1
points.
|
1
|
|
The value must be one of the following: Nothing (0), Vector Gradient (1), Strain (2).
Accepts boolean values (0 or 1).
 
 
|-
|-
| '''Output Vector Type'''<br>''(OutputVectorType)''
|'''ConvertPolysToLines''' (ConvertPolysToLines)
|
|
This property controls how the filter works to generate vector cell data. You can choose to compute the gradient of the input scalars, or extract the vorticity of the computed vector gradient tensor. By default, the filter will take the gradient of the input scalar data.
If this property is set to 1, degenerate polygons (a
 
"polygon" with only two distinct point coordinates) will be converted
| 1
to lines.
|
1
|
|
The value must be one of the following: Nothing (0), Scalar Gradient (1), Vorticity (2).
Accepts boolean values (0 or 1).
 
 
|-
|-
| '''Scalars'''<br>''(SelectInputScalars)''
|'''ConvertStripsToPolys''' (ConvertStripsToPolys)
|
|
This property indicates the name of the scalar array to differentiate.
If this property is set to 1, degenerate triangle strips
 
(a triangle "strip" containing only one triangle) will be converted to
triangles.
|
|
1
|
|
An array of scalars is required.
Accepts boolean values (0 or 1).
 
 
|-
|-
| '''Vectors'''<br>''(SelectInputVectors)''
|'''PointMerging''' (PointMerging)
|
|
This property indicates the name of the vector array to differentiate.
If this property is set to 1, then points will be merged
 
if they are within the specified Tolerance or AbsoluteTolerance (see
| 1
the Tolerance and AbsoluteTolerance properties), depending on the value
of the ToleranceIsAbsolute property. (See the ToleranceIsAbsolute
property.) If this property is set to 0, points will not be
merged.
|
1
|
|
An array of vectors is required.
Accepts boolean values (0 or 1).
 


|}
|}


==Clean Cells to Grid==


==Connectivity==
This filter merges cells and converts the data set to unstructured grid.Merges degenerate cells. Assumes
 
the input grid does not contain duplicate points. You may
 
want to run vtkCleanUnstructuredGrid first to ensure this.
Mark connected components with integer point attribute array.
If duplicated cells are found they are removed in the
 
output. The filter also handles the case, where a cell may
The Connectivity filter assigns a region id to connected components of the input data set. (The region id is assigned as a point scalar value.) This filter takes any data set type as input and produces unstructured grid output.<br>
contain degenerate nodes (i.e. one and the same node is
referenced by a cell more than once).
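The region-id assignment can be sketched as a breadth-first walk: points that share a cell belong to the same region, and each point is labeled with an integer region id, as described above. The cells-as-tuples layout is assumed for illustration.

```python
from collections import deque

# Sketch of Connectivity: label each point with the id of the connected
# component it belongs to, where points sharing a cell are connected.
def region_ids(n_points, cells):
    adj = [[] for _ in range(n_points)]
    for cell in cells:
        for a in cell:
            for b in cell:
                if a != b:
                    adj[a].append(b)
    ids = [-1] * n_points
    region = 0
    for start in range(n_points):
        if ids[start] != -1:
            continue
        queue = deque([start])
        ids[start] = region
        while queue:
            p = queue.popleft()
            for q in adj[p]:
                if ids[q] == -1:
                    ids[q] = region
                    queue.append(q)
        region += 1
    return ids

# two disconnected triangles -> two regions
print(region_ids(6, [(0, 1, 2), (3, 4, 5)]))
```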


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Color Regions'''<br>''(ColorRegions)''
|'''Input''' (Input)
|
This property specifies the input to the Clean Cells to
Grid filter.
|
|
Controls the coloring of the connected regions.


| 1
|
|
Only the values 0 and 1 are accepted.
Accepts input of following types:
* vtkUnstructuredGrid


|}


|-
==Clean to Grid==
| '''Extraction Mode'''<br>''(ExtractionMode)''
|
Controls the extraction of connected surfaces.
 
| 5
|
The value must be one of the following: Extract Point Seeded Regions (1), Extract Cell Seeded Regions (2), Extract Specified Regions (3), Extract Largest Region (4), Extract All Regions (5), Extract Closest Point Region (6).


This filter merges points and converts the data set to unstructured grid.The Clean to Grid filter merges
points that are exactly coincident. It also converts the
data set to an unstructured grid. You may wish to do this
if you want to apply a filter to your data set that is
available for unstructured grids but not for the initial
type of your data set (e.g., applying warp vector to
volumetric data). The Clean to Grid filter operates on any
type of data set.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Input'''<br>''(Input)''
| '''Property'''
|
| '''Description'''
This property specifies the input to the Connectivity filter.
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
|
This property specifies the input to the Clean to Grid
filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.


|
Accepts input of following types:
* vtkDataSet


|}
|}


==ClientServerMoveData==


==Contingency Statistics==
Compute a statistical model of a dataset and/or assess the dataset with a statistical model.


This filter either computes a statistical model of a dataset or takes such a model as its second input.  Then, the model (however it is obtained) may optionally be used to assess the input dataset.<br>
This filter computes contingency tables between pairs of attributes.  This result is a tabular bivariate probability distribution which serves as a Bayesian-style prior model.  Data is assessed by computing <br>
*  the probability of observing both variables simultaneously;<br>
*  the probability of each variable conditioned on the other (the two values need not be identical); and<br>
*  the pointwise mutual information (PMI).
<br>
Finally, the summary statistics include the information entropy of the observations.<br>
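The quantities listed above can be sketched with a contingency table over two attributes: the joint counts, the pointwise mutual information of each observed pair, and the entropy of the joint observations. This is an illustration of the definitions, not ParaView's implementation.

```python
import math
from collections import Counter

# Sketch of Contingency Statistics: joint counts, PMI per observed pair,
# and the information entropy of the observations (base-2 logs assumed).
def contingency(xs, ys):
    n = len(xs)
    joint = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    pmi = {pair: math.log2((c / n) / ((px[pair[0]] / n) * (py[pair[1]] / n)))
           for pair, c in joint.items()}
    entropy = -sum((c / n) * math.log2(c / n) for c in joint.values())
    return joint, pmi, entropy

# two perfectly correlated attributes
joint, pmi, h = contingency(["a", "a", "b", "b"], [0, 0, 1, 1])
print(dict(joint), h)
```

With perfect correlation each observed pair has positive PMI, and the joint entropy equals the marginal entropy of either attribute.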


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Attribute Mode'''<br>''(AttributeMode)''
|'''Input''' (Input)
|
Set the input to the Client Server Move Data
filter.
|
|
Specify which type of field data the arrays will be drawn from.


| 0
|
|
Valid array names will be chosen from point and cell data.


|-
|-
| '''Input'''<br>''(Input)''
|'''OutputDataType''' (OutputDataType)
|
|
The input to the filter.  Arrays from this dataset will be used for computing statistics and/or assessed by a statistical model.


|
|
0
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The dataset must contain a point or cell array.
The selected dataset must be one of the following types (or a subclass of one of them): vtkImageData, vtkStructuredGrid, vtkPolyData, vtkUnstructuredGrid, vtkTable, vtkGraph.


|-
|-
| '''Model Input'''<br>''(ModelInput)''
|'''WholeExtent''' (WholeExtent)
|
|
A previously-calculated model with which to assess a separate dataset. This input is optional.


|
|
0 -1 0 -1 0 -1
|
|
The selected object must be the result of the following: sources (includes readers), filters.




The selected dataset must be one of the following types (or a subclass of one of them): vtkTable, vtkMultiBlockDataSet.
|}
 
==Clip==


Clip with an implicit plane. Clipping does not reduce the dimensionality of the data set. The output data type of this filter is always an unstructured grid.The Clip filter
cuts away a portion of the input data set using an
implicit plane. This filter operates on all types of data
sets, and it returns unstructured grid data on
output.
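The half-space test behind an implicit-plane clip can be sketched as follows. Real clipping also splits the cells the plane crosses, which this point-only sketch omits; the sign convention (positive side kept when Inside Out is off) is an assumption made for the example.

```python
# Sketch of the Clip test: keep points on the positive side of the plane
# defined by an origin and a normal; inside_out flips which side is kept.
def clip_points(points, origin, normal, inside_out=False):
    kept = []
    for p in points:
        # signed distance (up to |normal|) from the plane
        side = sum((p[d] - origin[d]) * normal[d] for d in range(3))
        if (side >= 0) != inside_out:
            kept.append(p)
    return kept

pts = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0), (-1.0, 0.0, 0.0)]
print(clip_points(pts, origin=(0.5, 0, 0), normal=(1, 0, 0)))
```

Setting `inside_out=True` returns the complementary set of points, mirroring the Inside Out property described below.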


{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Variables of Interest'''<br>''(SelectArrays)''
| '''Property'''
|
| '''Description'''
Choose arrays whose entries will be used to form observations for statistical analysis.
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
|
This property specifies the dataset on which the Clip
filter will operate.
|
|
An array of scalars is required.


|-
| '''Task'''<br>''(Task)''
|
|
Specify the task to be performed: modeling and/or assessment.
Accepts input of following types:
#  "Detailed model of input data," creates a set of output tables containing a calculated statistical model of the '''entire''' input dataset;
* vtkDataSet
#  "Model a subset of the data," creates an output table (or tables) summarizing a '''randomly-chosen subset''' of the input dataset;
The dataset must contain a field array ()
#  "Assess the data with a model," adds attributes to the first input dataset using a model provided on the second input port; and
#  "Model and assess the same data," is really just operations 2 and 3 above applied to the same input dataset.  The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.


When the task includes creating a model (i.e., tasks 2, and 4), you may adjust the fraction of the input dataset used for training.  You should avoid using a large fraction of the input data for training as you will then not be able to detect overfitting.  The ''Training fraction'' setting will be ignored for tasks 1 and 3.
with 1 component(s).


| 3
|-
|'''Clip Type''' (ClipFunction)
|
|
The value must be one of the following: Detailed model of input data (0), Model a subset of the data (1), Assess the data with a model (2), Model and assess the same data (3).
This property specifies the parameters of the clip
 
function (an implicit plane) used to clip the dataset.
 
|-
| '''Training Fraction'''<br>''(TrainingFraction)''
|
|
Specify the fraction of values from the input dataset to be used for model fitting. The exact set of values is chosen at random from the dataset.


| 0.1
|
|
The value must be greater than or equal to 0 and less than or equal to 1.
The value can be one of the following:
* Plane (implicit_functions)


* Box (implicit_functions)


|}
* Sphere (implicit_functions)


* Cylinder (implicit_functions)


==Contour==
* Scalar (implicit_functions)
 
 
Generate isolines or isosurfaces using point scalars.
 
The Contour filter computes isolines or isosurfaces using a selected point-centered scalar array. The Contour filter operates on any type of data set, but the input is required to have at least one point-centered scalar (single-component) array. The output of this filter is polygonal.<br>
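The core of contouring is finding where the point-centered scalar crosses the isovalue along each cell edge and linearly interpolating the crossing position. A plain-Python sketch for a single edge (illustrative only, not the VTK implementation):

```python
def edge_crossing(p0, p1, s0, s1, isovalue):
    """Linearly interpolate where a scalar crosses `isovalue` along an edge.

    p0, p1 are the endpoint coordinates; s0, s1 the scalar values there.
    Returns None when the isovalue does not cross this edge.
    """
    if (s0 - isovalue) * (s1 - isovalue) > 0:
        return None  # both endpoints on the same side of the isovalue
    if s0 == s1:
        return None  # degenerate edge (constant scalar); skip in this sketch
    t = (isovalue - s0) / (s1 - s0)
    return tuple(a + t * (b - a) for a, b in zip(p0, p1))

# Scalar rises from 0 to 10 along the edge, so the 2.5-isosurface
# cuts it a quarter of the way along.
print(edge_crossing((0, 0, 0), (4, 0, 0), 0.0, 10.0, 2.5))  # (1.0, 0.0, 0.0)
```

Connecting such edge crossings cell by cell yields the polygonal isolines or isosurfaces the filter outputs.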


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|-
| '''Compute Gradients'''<br>''(ComputeGradients)''
|'''InputBounds''' (InputBounds)
|
|
If this property is set to 1, a scalar array containing a gradient value at each point in the isosurface or isoline will be created by this filter; otherwise an array of gradients will not be computed. This operation is fairly expensive both in terms of computation time and memory required, so if the output dataset produced by the contour filter will be processed by filters that modify the dataset's topology or geometry, it may be wise to set the value of this property to 0. Note that if ComputeNormals is set to 1, then gradients will have to be calculated, but they will only be stored in the output dataset if ComputeGradients is also set to 1.


| 0
|
|
Only the values 0 and 1 are accepted.


|


|-
|-
| '''Compute Normals'''<br>''(ComputeNormals)''
|'''Scalars''' (SelectInputScalars)
|
If clipping with scalars, this property specifies the
name of the scalar array on which to perform the clip
operation.
|
|
If this property is set to 1, a scalar array containing a normal value at each point in the isosurface or isoline will be created by the contour filter; otherwise an array of normals will not be computed. This operation is fairly expensive both in terms of computation time and memory required, so if the output dataset produced by the contour filter will be processed by filters that modify the dataset's topology or geometry, it may be wise to set the value of this property to 0.
Select whether to compute normals.


| 1
|
|
Only the values 0 and 1 are accepted.
An array of scalars is required. The value must be a field array name.
 
 
|-
|-
| '''Compute Scalars'''<br>''(ComputeScalars)''
|'''Value''' (Value)
|
|
If this property is set to 1, an array of scalars (containing the contour value) will be added to the output dataset. If set to 0, the output will not contain this array.
If clipping with scalars, this property sets the scalar
 
value about which to clip the dataset based on the scalar array chosen.
| 0
(See SelectInputScalars.) If clipping with a clip function, this
|
property specifies an offset from the clip function to use in the
Only the values 0 and 1 are accepted.
clipping operation. Neither functionality is currently available in
 
ParaView's user interface.
 
|-
| '''Isosurfaces'''<br>''(ContourValues)''
|
This property specifies the values at which to compute isosurfaces/isolines and also the number of such values.
 
|
|
0.0
|
|
The value must lie within the range of the selected data array.
The value must lie within the range of the selected data array.
|-
|-
| '''Input'''<br>''(Input)''
|'''InsideOut''' (InsideOut)
|
|
This property specifies the input dataset to be used by the contour filter.
If this property is set to 0, the clip filter will
 
return that portion of the dataset that lies within the clip function.
If set to 1, the portions of the dataset that lie outside the clip
function will be returned instead.
|
|
0
|
|
The selected object must be the result of the following: sources (includes readers), filters.
Accepts boolean values (0 or 1).
 
 
The dataset must contain a point array with 1 components.
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
 
 
|-
|-
| '''Point Merge Method'''<br>''(Locator)''
|'''UseValueAsOffset''' (UseValueAsOffset)
|
|
This property specifies an incremental point locator for merging duplicate / coincident points.
If UseValueAsOffset is true, Value is used as an offset
 
parameter to the implicit function. Otherwise, Value is used only when
clipping using a scalar array.
|
|
0
|
|
The selected object must be the result of the following: incremental_point_locators.
Accepts boolean values (0 or 1).
 
 
The value must be set to one of the following: MergePoints, IncrementalOctreeMergePoints, NonMergingPointLocator.
 
 
|-
|-
| '''Contour By'''<br>''(SelectInputScalars)''
|'''Crinkle clip''' (PreserveInputCells)
|
|
This property specifies the name of the scalar array from which the contour filter will compute isolines and/or isosurfaces.
This parameter controls whether to extract entire cells
 
in the given region or to clip those cells so that the output stays
entirely inside that region.
|
|
0
|
|
An array of scalars is required.
Accepts boolean values (0 or 1).
 
 
Valid array names will be chosen from point and cell data.
 


|}
|}


==Clip Closed Surface==


==Contour Generic Dataset==
Clip a polygonal dataset with a plane to produce closed surfaces
 
This clip filter cuts away a portion of the input polygonal dataset using
 
a plane to generate a new polygonal dataset.
Generate isolines or isosurfaces using point scalars.
 
The Generic Contour filter computes isolines or isosurfaces using a selected point-centered scalar array. The available scalar arrays are listed in the Scalars menu. The scalar range of the selected array will be displayed.<br>
The interface for adding contour values is very similar to the one for selecting cut offsets (in the Cut filter). To add a single contour value, select the value from the New Value slider in the Add value portion of the interface and click the Add button, or press Enter. To instead add several evenly spaced contours, use the controls in the Generate range of values section. Select the number of contour values to generate using the Number of Values slider. The Range slider controls the interval in which to generate the contour values. Once the number of values and range have been selected, click the Generate button. The new values will be added to the Contour Values list. To delete a value from the Contour Values list, select the value and click the Delete button. (If no value is selected, the last value in the list will be removed.) Clicking the Delete All button removes all the values in the list. If no values are in the Contour Values list when Accept is pressed, the current value of the New Value slider will be used.<br>
In addition to selecting contour values, you can also select additional computations to perform. If any of Compute Normals, Compute Gradients, or Compute Scalars is selected, the appropriate computation will be performed, and a corresponding point-centered array will be added to the output.<br>
The Generic Contour filter operates on a generic data set, but the input is required to have at least one point-centered scalar (single-component) array. The output of this filter is polygonal.<br>
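The "Generate range of values" controls described above produce a set of evenly spaced contour values over a chosen range. A plain-Python sketch of that spacing (illustrative only; the function name is hypothetical):

```python
def generate_contour_values(minimum, maximum, count):
    """Evenly space `count` contour values across [minimum, maximum],
    mirroring the 'Generate range of values' controls."""
    if count == 1:
        return [0.5 * (minimum + maximum)]  # a single value: the midpoint
    step = (maximum - minimum) / (count - 1)
    return [minimum + i * step for i in range(count)]

print(generate_contour_values(0.0, 10.0, 5))  # [0.0, 2.5, 5.0, 7.5, 10.0]
```

Each generated value becomes one entry in the Contour Values list, and one isoline/isosurface in the output.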


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Compute Gradients'''<br>''(ComputeGradients)''
|'''Input''' (Input)
|
This property specifies the dataset on which the Clip
filter will operate.
|
|
Select whether to compute gradients.


| 0
|
|
Only the values 0 and 1 are accepted.
Accepts input of following types:
* vtkPolyData
The dataset must contain a field array (point)


with 1 component(s).


|-
|-
| '''Compute Normals'''<br>''(ComputeNormals)''
|'''Clipping Plane''' (ClippingPlane)
|
This property specifies the parameters of the clipping
plane used to clip the polygonal data.
|
|
Select whether to compute normals.


| 1
|
|
Only the values 0 and 1 are accepted.
The value can be one of the following:
 
* Plane (implicit_functions)


|-
|-
| '''Compute Scalars'''<br>''(ComputeScalars)''
|'''GenerateFaces''' (GenerateFaces)
|
Generate polygonal faces in the output.
|
|
Select whether to compute scalars.
1
 
| 0
|
|
Only the values 0 and 1 are accepted.
Accepts boolean values (0 or 1).
 
 
|-
|-
| '''Isosurfaces'''<br>''(ContourValues)''
|'''GenerateOutline''' (GenerateOutline)
|
|
This property specifies the values at which to compute isosurfaces/isolines and also the number of such values.
Generate clipping outlines in the output wherever an
 
input face is cut by the clipping plane.
|
|
0
|
|
The value must lie within the range of the selected data array.
Accepts boolean values (0 or 1).
 
 
|-
|-
| '''Input'''<br>''(Input)''
|'''Generate Cell Origins''' (ScalarMode)
|
|
Set the input to the Generic Contour filter.
Generate (cell) data for coloring purposes such that the
 
newly generated cells (including capping faces and clipping outlines)
can be distinguished from the input cells.
|
0
|
The value(s) is an enumeration of the following:
* None (0)
* Color (1)
* Label (2)
|-
|'''InsideOut''' (InsideOut)
|
If this flag is turned off, the clipper will return the
portion of the data that lies within the clipping plane. Otherwise, the
clipper will return the portion of the data that lies outside the
clipping plane.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''Clipping Tolerance''' (Tolerance)
|
Specify the tolerance for creating new points. A small
value might incur degenerate triangles.
|
|
0.000001
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected dataset must be one of the following types (or a subclass of one of them): vtkGenericDataSet.


|-
|-
| '''Point Merge Method'''<br>''(Locator)''
|'''Base Color''' (BaseColor)
|
|
This property specifies an incremental point locator for merging duplicate / coincident points.
Specify the color for the faces from the
 
input.
|
|
0.10 0.10 1.00
|
|
The selected object must be the result of the following: incremental_point_locators.
The value must be set to one of the following: MergePoints, IncrementalOctreeMergePoints, NonMergingPointLocator.


|-
|-
| '''Contour By'''<br>''(SelectInputScalars)''
|'''Clip Color''' (ClipColor)
|
|
This property specifies the name of the scalar array from which the contour filter will compute isolines and/or isosurfaces.
Specify the color for the capping faces (generated on
 
the clipping interface).
|
|
1.00 0.11 0.10
|
|
An array of scalars is required.
Valid array names will be chosen from point and cell data.




|}
|}


==Clip Generic Dataset==


==Curvature==
Clip with an implicit plane, sphere or with scalars. Clipping does not reduce the dimensionality of the data set. The output data type of this filter is always an unstructured grid.
 
The Generic Clip filter cuts away a portion of the input
 
data set using a plane, a sphere, a box, or a scalar
This filter will compute the Gaussian or mean curvature of the mesh at each point.
value. The menu in the Clip Function portion of the
 
interface allows the user to select which implicit
The Curvature filter computes the curvature at each point in a polygonal data set. This filter supports both Gaussian and mean curvatures.<br><br><br>
function to use or whether to clip using a scalar value.
The curvature type can be selected from the Curvature type menu button.<br>
Making this selection loads the appropriate user
interface. For the implicit functions, the appropriate 3D
widget (plane, sphere, or box) is also displayed. The use
of these 3D widgets, including their user interface
components, is discussed in section 7.4. If an implicit
function is selected, the clip filter returns that portion
of the input data set that lies inside the function. If
Scalars is selected, then the user must specify a scalar
array to clip according to. The clip filter will return
the portions of the data set whose value in the selected
Scalars array is larger than the Clip value. Regardless of
the selection from the Clip Function menu, if the Inside
Out option is checked, the opposite portions of the data
set will be returned. This filter operates on all types of
data sets, and it returns unstructured grid data on
output.
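Clipping by scalar value, as described above, keeps the portions whose value in the selected Scalars array is larger than the Clip value, with Inside Out reversing the selection. A plain-Python sketch over a flat list of sample values (illustrative only; the real filter operates on cells and interpolates across cell boundaries):

```python
def scalar_clip(values, clip_value, inside_out=False):
    """Return the indices of entries kept by a scalar clip: values above
    the clip value are kept, or values below it when inside_out is set."""
    return [i for i, v in enumerate(values)
            if (v > clip_value) != inside_out]

samples = [0.2, 0.8, 1.5, 3.0]
print(scalar_clip(samples, 1.0))                    # [2, 3]
print(scalar_clip(samples, 1.0, inside_out=True))   # [0, 1]
```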


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Curvature Type'''<br>''(CurvatureType)''
|'''Input''' (Input)
|
Set the input to the Generic Clip
filter.
|
|
This property specifies which type of curvature to compute.


| 0
|
|
The value must be one of the following: Gaussian (0), Mean (1).
Accepts input of following types:
 
* vtkGenericDataSet
The dataset must contain a field array (point)


|-
|-
| '''Input'''<br>''(Input)''
|'''Clip Type''' (ClipFunction)
|
Set the parameters of the clip function.
|
|
This property specifies the input to the Curvature filter.


|
|
|
The value can be one of the following:
The selected object must be the result of the following: sources (includes readers), filters.
* Plane (implicit_functions)


* Box (implicit_functions)


The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.
* Sphere (implicit_functions)


* Scalar (implicit_functions)


|-
|-
| '''Invert Mean Curvature'''<br>''(InvertMeanCurvature)''
|'''InputBounds''' (InputBounds)
|
|
If this property is set to 1, the mean curvature calculation will be inverted. This is useful for meshes with inward-pointing normals.


| 0
|
|
Only the values 0 and 1 are accepted.


|


|}
|-
|'''Scalars''' (SelectInputScalars)
|
If clipping with scalars, this property specifies the
name of the scalar array on which to perform the clip
operation.
|


|
An array of scalars is required. The value must be a field array name.
|-
|'''InsideOut''' (InsideOut)
|
Choose which portion of the dataset should be clipped
away.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''Value''' (Value)
|
If clipping with a scalar array, choose the clipping
value.
|
0.0
|
The value must lie within the range of the selected data array.


==D3==
|}


==Color By Array==


Repartition a data set into load-balanced spatially convex regions.  Create ghost cells if requested.
This filter generates color-mapped image data based on a selected scalar array.
 
The D3 filter is available when ParaView is run in parallel. It operates on any type of data set to evenly divide it across the processors into spatially contiguous regions. The output of this filter is of type unstructured grid.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Boundary Mode'''<br>''(BoundaryMode)''
|'''Input''' (Input)
|
|
This property determines how cells that lie on processor boundaries are handled. The "Assign cells uniquely" option assigns each boundary cell to exactly one process, which is useful for isosurfacing. Selecting "Duplicate cells" causes the cells on the boundaries to be copied to each process that shares that boundary. The "Divide cells" option breaks cells across process boundary lines so that pieces of the cell lie in different processes. This option is useful for volume rendering.


| 0
|
|
The value must be one of the following: Assign cells uniquely (0), Duplicate cells (1), Divide cells (2).


|
Accepts input of following types:
* vtkImageData
The dataset must contain a field array (point)
with 1 component(s).


|-
|-
| '''Input'''<br>''(Input)''
|'''LookupTable''' (LookupTable)
|
|
This property specifies the input to the D3 filter.


|
|
|
|
The selected object must be the result of the following: sources (includes readers), filters.


|-
|'''Color By''' (SelectInputScalars)
|
This property specifies the name of the scalar array
from which we will color by.
|


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
|
An array of scalars is required. The value must be a field array name.
|-
|'''RGBA NaN Color''' (NaNColor)
|


|
0 0 0 255
|


|-
|-
| '''Minimal Memory'''<br>''(UseMinimalMemory)''
|'''OutputFormat''' (OutputFormat)
|
|
If this property is set to 1, the D3 filter restricts its communication routines to use minimal memory.


| 0
|
|
Only the values 0 and 1 are accepted.
3
 
|
The value(s) is an enumeration of the following:
* Luminance (1)
* Luminance Alpha (2)
* RGB (3)
* RGBA (4)


|}
|}


==Compute Derivatives==


==Decimate==
This filter computes derivatives of scalars and vectors.
 
CellDerivatives is a filter that computes derivatives of
 
scalars and vectors at the center of cells. You can choose
Simplify a polygonal model using an adaptive edge collapse algorithm.  This filter works with triangles only.
to generate different output including the scalar gradient
 
(a vector), computed tensor vorticity (a vector), gradient
The Decimate filter reduces the number of triangles in a polygonal data set. Because this filter only operates on triangles, first run the Triangulate filter on a dataset that contains polygons other than triangles.<br>
of input vectors (a tensor), and strain matrix of the
input vectors (a tensor); or you may choose to pass data
through to the output.
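As an illustration of the scalar-gradient output described above, here is a central-difference gradient over regularly spaced 1D samples (plain Python, illustrative only; the actual filter computes derivatives at cell centers using the cells' interpolation functions):

```python
def gradient_1d(samples, spacing):
    """Central-difference gradient of regularly spaced scalar samples,
    falling back to one-sided differences at the endpoints."""
    n = len(samples)
    grad = []
    for i in range(n):
        lo = max(i - 1, 0)          # left neighbor (clamped at the boundary)
        hi = min(i + 1, n - 1)      # right neighbor (clamped at the boundary)
        grad.append((samples[hi] - samples[lo]) / ((hi - lo) * spacing))
    return grad

# f(x) = x^2 sampled at x = 0, 1, 2, 3; interior estimates match f'(x) = 2x.
print(gradient_1d([0.0, 1.0, 4.0, 9.0], 1.0))  # [1.0, 2.0, 4.0, 5.0]
```

The tensor outputs (vector gradient, strain) extend the same idea: one such derivative per component per direction.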


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Boundary Vertex Deletion'''<br>''(BoundaryVertexDeletion)''
|'''Input''' (Input)
|
This property specifies the input to the
filter.
|
|
If this property is set to 1, then vertices on the boundary of the dataset can be removed. Setting the value of this property to 0 preserves the boundary of the dataset, but it may cause the filter not to reach its reduction target.


| 1
|
|
Only the values 0 and 1 are accepted.
Accepts input of following types:
* vtkDataSet
The dataset must contain a field array (point)
 
with 1 component(s).
 
The dataset must contain a field array (point)


with 3 component(s).


|-
|-
| '''Feature Angle'''<br>''(FeatureAngle)''
|'''Scalars''' (SelectInputScalars)
|
This property indicates the name of the scalar array to
differentiate.
|
|
The value of this property is used in determining where the data set may be split. If the angle between two adjacent triangles is greater than or equal to the FeatureAngle value, then their boundary is considered a feature edge where the dataset can be split.


| 15
|
|
The value must be greater than or equal to 0 and less than or equal to 180.
An array of scalars is required.
 
 
|-
|-
| '''Input'''<br>''(Input)''
|'''Vectors''' (SelectInputVectors)
|
|
This property specifies the input to the Decimate filter.
This property indicates the name of the vector array to
 
differentiate.
|
|
1
|
|
The selected object must be the result of the following: sources (includes readers), filters.
An array of vectors is required.
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.
 
 
|-
|-
| '''Preserve Topology'''<br>''(PreserveTopology)''
|'''OutputVectorType''' (OutputVectorType)
|
|
If this property is set to 1, decimation will not split the dataset or produce holes, but it may keep the filter from reaching the reduction target. If it is set to 0, better reduction can occur (reaching the reduction target), but holes in the model may be produced.
This property controls how the filter works to generate
 
vector cell data. You can choose to compute the gradient of the input
| 0
scalars, or extract the vorticity of the computed vector gradient
tensor. By default, the filter will take the gradient of the input
scalar data.
|
1
|
|
Only the values 0 and 1 are accepted.
The value(s) is an enumeration of the following:
 
* Nothing (0)
 
* Scalar Gradient (1)
* Vorticity (2)
|-
|-
| '''Target Reduction'''<br>''(TargetReduction)''
|'''OutputTensorType''' (OutputTensorType)
|
|
This property specifies the desired reduction in the total number of polygons in the output dataset. For example, if the TargetReduction value is 0.9, the Decimate filter will attempt to produce an output dataset that is 10% the size of the input.
This property controls how the filter works to generate
 
tensor cell data. You can choose to compute the gradient of the input
| 0.9
vectors, or compute the strain tensor of the vector gradient tensor. By
default, the filter will take the gradient of the vector data to
construct a tensor.
|
1
|
|
The value must be greater than or equal to 0 and less than or equal to 1.
The value(s) is an enumeration of the following:
 
* Nothing (0)
* Vector Gradient (1)
* Strain (2)


|}
|}


==Compute Quartiles==


==Delaunay 2D==
Compute the quartiles table from a dataset or table.
 
 
Create 2D Delaunay triangulation of input points. It expects a vtkPointSet as input and produces vtkPolyData as output. The points are expected to be in a mostly planar distribution.
 
Delaunay2D is a filter that constructs a 2D Delaunay triangulation from a list of input points. These points may be represented by any dataset of type vtkPointSet and subclasses. The output of the filter is a polygonal dataset containing a triangle mesh.<br><br><br>
The 2D Delaunay triangulation is defined as the triangulation that satisfies the Delaunay criterion for n-dimensional simplexes (in this case n=2 and the simplexes are triangles). This criterion states that a circumsphere of each simplex in a triangulation contains only the n+1 defining points of the simplex. In two dimensions, this translates into an optimal triangulation. That is, the maximum interior angle of any triangle is less than or equal to that of any possible triangulation.<br><br><br>
Delaunay triangulations are used to build topological structures from unorganized (or unstructured) points. The input to this filter is a list of points specified in 3D, even though the triangulation is 2D. Thus the triangulation is constructed in the x-y plane, and the z coordinate is ignored (although carried through to the output). You can use the ProjectionPlaneMode option to compute the best-fitting plane to the set of points, project the points onto that plane, and then perform the triangulation using their projected positions.<br><br><br>
The Delaunay triangulation can be numerically sensitive in some cases. To prevent problems, try to avoid injecting points that will result in triangles with bad aspect ratios (1000:1 or greater). In practice this means inserting points that are "widely dispersed", and enables smooth transition of triangle sizes throughout the mesh. (You may even want to add extra points to create a better point distribution.) If numerical problems are present, you will see a warning message to this effect at the end of the triangulation process.<br><br><br>
Warning:<br>
Points arranged on a regular lattice (termed degenerate cases) can be triangulated in more than one way (at least according to the Delaunay criterion). The choice of triangulation (as implemented by this algorithm) depends on the order of the input points. The first three points will form a triangle; other degenerate points will not break this triangle.<br><br><br>
Points that are coincident (or nearly so) may be discarded by the algorithm. This is because the Delaunay triangulation requires unique input points. The output of the Delaunay triangulation is supposedly a convex hull. In certain cases this implementation may not generate the convex hull.<br>
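The Delaunay criterion above can be checked with the standard in-circle determinant test: no input point may lie strictly inside the circumcircle of any output triangle. A plain-Python sketch (illustrative only, not the VTK implementation):

```python
def in_circumcircle(a, b, c, d):
    """True if point d lies strictly inside the circumcircle of the
    counter-clockwise triangle (a, b, c) -- the configuration the
    Delaunay criterion forbids for every non-triangle input point."""
    rows = []
    for px, py in (a, b, c):
        dx, dy = px - d[0], py - d[1]
        rows.append((dx, dy, dx * dx + dy * dy))
    (a1, a2, a3), (b1, b2, b3), (c1, c2, c3) = rows
    det = (a1 * (b2 * c3 - b3 * c2)
           - a2 * (b1 * c3 - b3 * c1)
           + a3 * (b1 * c2 - b2 * c1))
    return det > 0  # positive determinant (for CCW abc) means d is inside

tri = ((0.0, 0.0), (2.0, 0.0), (0.0, 2.0))   # CCW triangle
print(in_circumcircle(*tri, (0.5, 0.5)))     # True: inside, violates criterion
print(in_circumcircle(*tri, (5.0, 5.0)))     # False: well outside
```

The numerical sensitivity mentioned above shows up exactly here: for near-degenerate point sets the determinant hovers near zero, where floating-point round-off can flip the test.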


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Alpha'''<br>''(Alpha)''
|'''Input''' (Input)
|
This property specifies the input to the
filter.
|
|
The value of this property controls the output of this filter. For a non-zero alpha value, only edges or triangles contained within a sphere centered at mesh vertices will be output. Otherwise, only triangles will be output.


| 0
|
|
The value must be greater than or equal to 0.
Accepts input of following types:
* vtkDataObject


|}


|-
==Connectivity==
| '''Bounding Triangulation'''<br>''(BoundingTriangulation)''
|
If this property is set to 1, bounding triangulation points (and associated triangles) are included in the output. These are introduced as an initial triangulation to begin the triangulation process. This feature is nice for debugging output.


| 0
Mark connected components with integer point attribute array.The Connectivity
|
filter assigns a region id to connected components of the
Only the values 0 and 1 are accepted.
input data set. (The region id is assigned as a point
scalar value.) This filter takes any data set type as
input and produces unstructured grid
output.
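The region-id assignment described above is connected-component labeling, which can be sketched as a breadth-first flood fill over an adjacency structure (plain Python, illustrative only; the filter derives adjacency from the mesh's point/cell connectivity):

```python
from collections import deque

def label_regions(adjacency):
    """Assign a region id to each connected component of a graph given
    as {node: [neighbors]}, mirroring the Connectivity filter's output."""
    region = {}
    next_id = 0
    for start in adjacency:
        if start in region:
            continue  # already swept up by an earlier flood fill
        queue = deque([start])
        region[start] = next_id
        while queue:
            node = queue.popleft()
            for nbr in adjacency[node]:
                if nbr not in region:
                    region[nbr] = next_id
                    queue.append(nbr)
        next_id += 1
    return region

# Two components: {0, 1, 2} and {3, 4}.
adj = {0: [1], 1: [0, 2], 2: [1], 3: [4], 4: [3]}
print(label_regions(adj))  # {0: 0, 1: 0, 2: 0, 3: 1, 4: 1}
```

In the filter, these per-node ids become the point-scalar RegionId array on the output.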


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
This property specifies the input to the Connectivity
filter.
|
|
This property specifies the input dataset to the Delaunay 2D filter.


|
|
Accepts input of following types:
* vtkDataSet
|-
|'''ExtractionMode''' (ExtractionMode)
|
|
The selected object must be the result of the following: sources (includes readers), filters.
Controls the extraction of connected
 
surfaces.
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkPointSet.
 
 
|-
| '''Offset'''<br>''(Offset)''
|
|
This property is a multiplier to control the size of the initial, bounding Delaunay triangulation.
5
 
| 1
|
|
The value must be greater than or equal to 0.75.
The value(s) is an enumeration of the following:
 
* Extract Point Seeded Regions (1)
 
* Extract Cell Seeded Regions (2)
* Extract Specified Regions (3)
* Extract Largest Region (4)
* Extract All Regions (5)
* Extract Closest Point Region (6)
|-
|-
| '''Projection Plane Mode'''<br>''(ProjectionPlaneMode)''
|'''ColorRegions''' (ColorRegions)
|
|
This property determines type of projection plane to use in performing the triangulation.
Controls the coloring of the connected
 
regions.
| 0
|
|
The value must be one of the following: XY Plane (0), Best-Fitting Plane (2).
1
 
 
|-
| '''Tolerance'''<br>''(Tolerance)''
|
This property specifies a tolerance to control discarding of closely spaced points. This tolerance is specified as a fraction of the diagonal length of the bounding box of the points.
 
| 1e-05
|
|
The value must be greater than or equal to 0 and less than or equal to 1.
Accepts boolean values (0 or 1).
 


|}
|}


==Contingency Statistics==


==Delaunay 3D==
Compute a statistical model of a dataset and/or assess the dataset with a statistical model.
 
This filter either computes a statistical model of a dataset or takes
 
such a model as its second input. Then, the model (however it is
Create a 3D Delaunay triangulation of input points. It expects a vtkPointSet as input and produces vtkUnstructuredGrid as output.
obtained) may optionally be used to assess the input dataset. This filter
 
computes contingency tables between pairs of attributes. This result is a
Delaunay3D is a filter that constructs a 3D Delaunay triangulation<br>
tabular bivariate probability distribution which serves as a
from a list of input points. These points may be represented by any<br>
Bayesian-style prior model. Data is assessed by computing &lt;ul&gt;
dataset of type vtkPointSet and subclasses. The output of the filter<br>
&lt;li&gt; the probability of observing both variables simultaneously;
is an unstructured grid dataset. Usually the output is a tetrahedral<br>
&lt;li&gt; the probability of each variable conditioned on the other (the
mesh, but if a non-zero alpha distance value is specified (called<br>
two values need not be identical); and &lt;li&gt; the pointwise mutual
the "alpha" value), then only tetrahedra, triangles, edges, and<br>
information (PMI). &lt;/ul&gt; Finally, the summary statistics include
vertices lying within the alpha radius are output. In other words,<br>
the information entropy of the observations.
non-zero alpha values may result in arbitrary combinations of<br>
tetrahedra, triangles, lines, and vertices. (The notion of alpha<br>
value is derived from Edelsbrunner's work on "alpha shapes".)<br><br><br>
The 3D Delaunay triangulation is defined as the triangulation that<br>
satisfies the Delaunay criterion for n-dimensional simplexes (in<br>
this case n=3 and the simplexes are tetrahedra). This criterion<br>
states that a circumsphere of each simplex in a triangulation<br>
contains only the n+1 defining points of the simplex. (See text for<br>
more information.) While in two dimensions this translates into an<br>
"optimal" triangulation, this is not true in 3D, since a measurement<br>
for optimality in 3D is not agreed on.<br><br><br>
Delaunay triangulations are used to build topological structures<br>
from unorganized (or unstructured) points. The input to this filter<br>
is a list of points specified in 3D. (If you wish to create 2D<br>
triangulations see Delaunay2D.) The output is an unstructured<br>
grid.<br><br><br>
The Delaunay triangulation can be numerically sensitive. To prevent<br>
problems, try to avoid injecting points that will result in<br>
triangles with bad aspect ratios (1000:1 or greater). In practice<br>
this means inserting points that are "widely dispersed", which enables<br>
smooth transition of triangle sizes throughout the mesh. (You may<br>
even want to add extra points to create a better point<br>
distribution.) If numerical problems are present, you will see a<br>
warning message to this effect at the end of the triangulation<br>
process.<br><br><br>
Warning:<br>
Points arranged on a regular lattice (termed degenerate cases) can<br>
be triangulated in more than one way (at least according to the<br>
Delaunay criterion). The choice of triangulation (as implemented by<br>
this algorithm) depends on the order of the input points. The first<br>
four points will form a tetrahedron; other degenerate points<br>
(relative to this initial tetrahedron) will not break it.<br><br><br>
Points that are coincident (or nearly so) may be discarded by the<br>
algorithm. This is because the Delaunay triangulation requires<br>
unique input points. You can control the definition of coincidence<br>
with the "Tolerance" instance variable.<br><br><br>
The output of the Delaunay triangulation is supposedly a convex<br>
hull. In certain cases this implementation may not generate the<br>
convex hull. This behavior can be controlled by the Offset instance<br>
variable. Offset is a multiplier used to control the size of the<br>
initial triangulation. The larger the offset value, the more likely<br>
you will generate a convex hull; and the more likely you are to see<br>
numerical problems.<br><br><br>
The implementation of this algorithm varies from the 2D Delaunay<br>
algorithm (i.e., Delaunay2D) in an important way. When points are<br>
injected into the triangulation, the search for the enclosing<br>
tetrahedron is quite different. In the 3D case, the closest<br>
previously inserted point is found, and then the connected<br>
tetrahedra are searched to find the containing one. (In 2D, a "walk"<br>
towards the enclosing triangle is performed.) If the triangulation<br>
is Delaunay, then an enclosing tetrahedron will be found. However,<br>
in degenerate cases an enclosing tetrahedron may not be found and<br>
the point will be rejected.<br>
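The Delaunay criterion above (each simplex's circumsphere contains no input points other than its own n+1 defining points) can be checked numerically. The sketch below computes a tetrahedron's circumsphere by solving a small linear system and then tests whether a fifth point falls inside it; this is an illustrative pure-Python check, not the triangulation algorithm the filter uses, and all function names are invented for the example.

```python
def det3(m):
    """Determinant of a 3x3 matrix given as nested lists."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def circumsphere(a, b, c, d):
    """Center and radius of the sphere through four points.
    Solves 2*(p_i - a) . o = |p_i|^2 - |a|^2 for the center o
    (equidistance conditions), using Cramer's rule."""
    rows, rhs = [], []
    for p in (b, c, d):
        rows.append([2.0 * (p[i] - a[i]) for i in range(3)])
        rhs.append(sum(p[i] * p[i] - a[i] * a[i] for i in range(3)))
    D = det3(rows)
    center = []
    for j in range(3):
        m = [row[:] for row in rows]
        for i in range(3):
            m[i][j] = rhs[i]
        center.append(det3(m) / D)
    radius = sum((center[i] - a[i]) ** 2 for i in range(3)) ** 0.5
    return center, radius

def in_circumsphere(a, b, c, d, e):
    """True when e lies strictly inside the circumsphere of abcd,
    i.e. when inserting e would violate the Delaunay criterion."""
    center, radius = circumsphere(a, b, c, d)
    dist = sum((center[i] - e[i]) ** 2 for i in range(3)) ** 0.5
    return dist < radius
```

For the unit tetrahedron (0,0,0), (1,0,0), (0,1,0), (0,0,1), the circumcenter is (0.5, 0.5, 0.5); the centroid lies inside the circumsphere while a distant point does not.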


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 1,726: Line 2,048:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Alpha'''<br>''(Alpha)''
|'''Input''' (Input)
|
The input to the filter. Arrays from this dataset will
be used for computing statistics and/or assessed by a statistical
model.
|
|
This property specifies the alpha (or distance) value to control
the output of this filter.  For a non-zero alpha value, only
edges, faces, or tetra contained within the circumsphere (of
radius alpha) will be output.  Otherwise, only tetrahedra will be
output.


| 0
|
|
The value must be greater than or equal to 0.
Accepts input of following types:
 
* vtkImageData
* vtkStructuredGrid
* vtkPolyData
* vtkUnstructuredGrid
* vtkTable
* vtkGraph
The dataset must contain a field array ()


|-
|-
| '''Bounding Triangulation'''<br>''(BoundingTriangulation)''
|'''ModelInput''' (ModelInput)
|
A previously-calculated model with which to assess a
separate dataset. This input is optional.
|
|
This boolean controls whether bounding triangulation points (and
associated triangles) are included in the output. (These are
introduced as an initial triangulation to begin the triangulation
process. This feature is nice for debugging output.)


| 0
|
|
Only the values 0 and 1 are accepted.
Accepts input of following types:
 
* vtkTable
 
* vtkMultiBlockDataSet
|-
|-
| '''Input'''<br>''(Input)''
|'''AttributeMode''' (AttributeMode)
|
|
This property specifies the input dataset to the Delaunay 3D filter.
Specify which type of field data the arrays will be
 
drawn from.
|
|
0
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The value must be a field array name.
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkPointSet.
 
 
|-
|-
| '''Offset'''<br>''(Offset)''
|'''Variables of Interest''' (SelectArrays)
|
Choose arrays whose entries will be used to form
observations for statistical analysis.
|
|
This property specifies a multiplier to control the size of the
initial, bounding Delaunay triangulation.


| 2.5
|
|
The value must be greater than or equal to 2.5.


|-
|-
| '''Tolerance'''<br>''(Tolerance)''
|'''Task''' (Task)
|
Specify the task to be performed: modeling and/or
assessment. &lt;ol&gt; &lt;li&gt; "Detailed model of input data,"
creates a set of output tables containing a calculated statistical
model of the &lt;b&gt;entire&lt;/b&gt; input dataset;&lt;/li&gt;
&lt;li&gt; "Model a subset of the data," creates an output table (or
tables) summarizing a &lt;b&gt;randomly-chosen subset&lt;/b&gt; of the
input dataset;&lt;/li&gt; &lt;li&gt; "Assess the data with a model,"
adds attributes to the first input dataset using a model provided on
the second input port; and&lt;/li&gt; &lt;li&gt; "Model and assess the
same data," is really just operations 2 and 3 above applied to the same
input dataset. The model is first trained using a fraction of the input
data and then the entire dataset is assessed using that
model.&lt;/li&gt; &lt;/ol&gt; When the task includes creating a model
(i.e., tasks 2, and 4), you may adjust the fraction of the input
dataset used for training. You should avoid using a large fraction of
the input data for training as you will then not be able to detect
overfitting. The &lt;i&gt;Training fraction&lt;/i&gt; setting will be
ignored for tasks 1 and 3.
|
|
This property specifies a tolerance to control discarding of
3
closely spaced points. This tolerance is specified as a fraction
|
of the diagonal length of the bounding box of the points.
The value(s) is an enumeration of the following:
 
* Detailed model of input data (0)
| 0.001
* Model a subset of the data (1)
* Assess the data with a model (2)
* Model and assess the same data (3)
|-
|'''TrainingFraction''' (TrainingFraction)
|
Specify the fraction of values from the input dataset to
be used for model fitting. The exact set of values is chosen at random
from the dataset.
|
0.1
|
|
The value must be greater than or equal to 0 and less than or equal to 1.




|}
|}
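The TrainingFraction property above selects a random subset of the input values for model fitting. A minimal sketch of that selection, with invented function and parameter names (the filter's actual sampling is internal to the statistics classes):

```python
import random

def split_training(values, training_fraction, seed=None):
    """Pick round(fraction * n) values at random for model fitting.
    The full dataset can still be assessed against the fitted model
    afterwards, which is what makes overfitting detectable."""
    rng = random.Random(seed)
    n_train = int(round(training_fraction * len(values)))
    indices = rng.sample(range(len(values)), n_train)
    return [values[i] for i in indices]
```

With a fraction of 0.1 on 100 values, ten distinct values are chosen for training.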


==Contour==


==Descriptive Statistics==
Generate isolines or isosurfaces using point scalars.The Contour
 
filter computes isolines or isosurfaces using a selected
 
point-centered scalar array. The Contour filter operates
Compute a statistical model of a dataset and/or assess the dataset with a statistical model.
on any type of data set, but the input is required to have
 
at least one point-centered scalar (single-component)
This filter either computes a statistical model of a dataset or takes such a model as its second input. Then, the model (however it is obtained) may optionally be used to assess the input dataset.
array. The output of this filter is
<br>
polygonal.
This filter computes the min, max, mean, raw moments M2 through M4, standard deviation, skewness, and kurtosis for each array you select.
 
<br>
The model is simply a univariate Gaussian distribution with the mean and standard deviation provided. Data is assessed using this model by detrending the data (i.e., subtracting the mean) and then dividing by the standard deviation. Thus the assessment is an array whose entries are the number of standard deviations from the mean that each input point lies.<br>
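The assessment step described above (subtract the mean, divide by the standard deviation) is a plain z-score. A minimal sketch, using the population standard deviation as a simple stand-in for the filter's internal estimator; function names are illustrative:

```python
import math

def descriptive_model(values):
    """Fit the univariate Gaussian model: mean and standard deviation."""
    n = len(values)
    mean = sum(values) / n
    # Population standard deviation, for illustration.
    std = math.sqrt(sum((v - mean) ** 2 for v in values) / n)
    return mean, std

def assess(values, mean, std):
    """Number of standard deviations each entry lies from the mean."""
    return [(v - mean) / std for v in values]
```

For the data [2, 4, 4, 4, 5, 5, 7, 9] the model is mean 5, standard deviation 2, so the entry 9 is assessed as 2 standard deviations from the mean.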
 


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 1,811: Line 2,154:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Attribute Mode'''<br>''(AttributeMode)''
|'''Input''' (Input)
|
This property specifies the input dataset to be used by
the contour filter.
|
|
Specify which type of field data the arrays will be drawn from.


| 0
|
|
Valid array names will be chosen from point and cell data.
Accepts input of following types:
* vtkDataSet
The dataset must contain a field array (point)


with 1 component(s).


|-
|-
| '''Input'''<br>''(Input)''
|'''Contour By''' (SelectInputScalars)
|
This property specifies the name of the scalar array
from which the contour filter will compute isolines and/or
isosurfaces.
|
|
The input to the filter.  Arrays from this dataset will be used for computing statistics and/or assessed by a statistical model.


|
|
An array of scalars is required. The value must be a field array name.
|-
|'''ComputeNormals''' (ComputeNormals)
|
If this property is set to 1, a scalar array containing
a normal value at each point in the isosurface or isoline will be
created by the contour filter; otherwise an array of normals will not
be computed. This operation is fairly expensive both in terms of
computation time and memory required, so if the output dataset produced
by the contour filter will be processed by filters that modify the
dataset's topology or geometry, it may be wise to set the value of this
property to 0. Select whether to compute normals.
|
|
The selected object must be the result of the following: sources (includes readers), filters.
1
 
|
 
Accepts boolean values (0 or 1).
The dataset must contain a point or cell array.
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkImageData, vtkStructuredGrid, vtkPolyData, vtkUnstructuredGrid, vtkTable, vtkGraph.
 
 
|-
|-
| '''Model Input'''<br>''(ModelInput)''
|'''ComputeGradients''' (ComputeGradients)
|
|
A previously-calculated model with which to assess a separate dataset. This input is optional.
If this property is set to 1, a scalar array containing
 
a gradient value at each point in the isosurface or isoline will be
created by this filter; otherwise an array of gradients will not be
computed. This operation is fairly expensive both in terms of
computation time and memory required, so if the output dataset produced
by the contour filter will be processed by filters that modify the
dataset's topology or geometry, it may be wise to set the value of this
property to 0. Note that if ComputeNormals is set to 1, then gradients
will have to be calculated, but they will only be stored in the output
dataset if ComputeGradients is also set to 1.
|
|
0
|
|
The selected object must be the result of the following: sources (includes readers), filters.
Accepts boolean values (0 or 1).
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkTable, vtkMultiBlockDataSet.
 
 
|-
|-
| '''Variables of Interest'''<br>''(SelectArrays)''
|'''ComputeScalars''' (ComputeScalars)
|
|
Choose arrays whose entries will be used to form observations for statistical analysis.
If this property is set to 1, an array of scalars
 
(containing the contour value) will be added to the output dataset. If
set to 0, the output will not contain this array.
|
|
0
|
Accepts boolean values (0 or 1).
|-
|'''OutputPointsPrecision''' (OutputPointsPrecision)
|
|
An array of scalars is required.


Select the output precision of the coordinates. **Single** sets the
output to single-precision floating-point (i.e., float), **Double**
sets it to double-precision floating-point (i.e., double), and
**Default** sets it to the same precision as the precision of the
points in the input. Defaults to ***Single***.


|
0
|
The value(s) is an enumeration of the following:
* Single (0)
* Double (1)
* Same as input (2)
|-
|-
| '''Deviations should be'''<br>''(SignedDeviations)''
|'''GenerateTriangles''' (GenerateTriangles)
|
|
Should the assessed values be signed deviations or unsigned?
This parameter controls whether to produce triangles in the output.
Warning: Many filters do not properly handle non-triangular polygons.


| 0
|
|
The value must be one of the following: Unsigned (0), Signed (1).
1
 
|
 
Accepts boolean values (0 or 1).
|-
|-
| '''Task'''<br>''(Task)''
|'''Isosurfaces''' (ContourValues)
|
This property specifies the values at which to compute
isosurfaces/isolines and also the number of such
values.
|
|
Specify the task to be performed: modeling and/or assessment.
#  "Detailed model of input data," creates a set of output tables containing a calculated statistical model of the '''entire''' input dataset;
#  "Model a subset of the data," creates an output table (or tables) summarizing a '''randomly-chosen subset''' of the input dataset;
#  "Assess the data with a model," adds attributes to the first input dataset using a model provided on the second input port; and
#  "Model and assess the same data," is really just operations 2 and 3 above applied to the same input dataset.  The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.


When the task includes creating a model (i.e., tasks 2, and 4), you may adjust the fraction of the input dataset used for training.  You should avoid using a large fraction of the input data for training as you will then not be able to detect overfitting.  The ''Training fraction'' setting will be ignored for tasks 1 and 3.
| 3
|
|
The value must be one of the following: Detailed model of input data (0), Model a subset of the data (1), Assess the data with a model (2), Model and assess the same data (3).
The value must lie within the range of the selected data array.
 
 
|-
|-
| '''Training Fraction'''<br>''(TrainingFraction)''
|'''Point Merge Method''' (Locator)
|
This property specifies an incremental point locator for
merging duplicate / coincident points.
|
|
Specify the fraction of values from the input dataset to be used for model fitting. The exact set of values is chosen at random from the dataset.


| 0.1
|
|
The value must be greater than or equal to 0 and less than or equal to 1.
The value can be one of the following:
* MergePoints (incremental_point_locators)


* IncrementalOctreeMergePoints (incremental_point_locators)


|}
* NonMergingPointLocator (incremental_point_locators)




==Elevation==
|}


==Contour Generic Dataset==


Create point attribute array by projecting points onto an elevation vector.
Generate isolines or isosurfaces using point scalars.The Generic
 
Contour filter computes isolines or isosurfaces using a
The Elevation filter generates point scalar values for an input dataset along a specified direction vector.<br><br><br>
selected point-centered scalar array. The available scalar
The Input menu allows the user to select the data set to which this filter will be applied. Use the Scalar range entry boxes to specify the minimum and maximum scalar value to be generated. The Low Point and High Point define a line onto which each point of the data set is projected. The minimum scalar value is associated with the Low Point, and the maximum scalar value is associated with the High Point. The scalar value for each point in the data set is determined by the location along the line to which that point projects.<br>
arrays are listed in the Scalars menu. The scalar range of
the selected array will be displayed. The interface for
adding contour values is very similar to the one for
selecting cut offsets (in the Cut filter). To add a single
contour value, select the value from the New Value slider
in the Add value portion of the interface and click the
Add button, or press Enter. To instead add several evenly
spaced contours, use the controls in the Generate range of
values section. Select the number of contour values to
generate using the Number of Values slider. The Range
slider controls the interval in which to generate the
contour values. Once the number of values and range have
been selected, click the Generate button. The new values
will be added to the Contour Values list. To delete a
value from the Contour Values list, select the value and
click the Delete button. (If no value is selected, the
last value in the list will be removed.) Clicking the
Delete All button removes all the values in the list. If
no values are in the Contour Values list when Accept is
pressed, the current value of the New Value slider will be
used. In addition to selecting contour values, you can
also select additional computations to perform. If any of
Compute Normals, Compute Gradients, or Compute Scalars is
selected, the appropriate computation will be performed,
and a corresponding point-centered array will be added to
the output. The Generic Contour filter operates on a
generic data set, but the input is required to have at
least one point-centered scalar (single-component) array.
The output of this filter is polygonal.
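At the heart of contouring is locating where the selected scalar crosses the contour value along each cell edge, by linear interpolation between the edge's endpoint scalars. A minimal one-edge sketch (illustrative only; it is not the filter's marching-cells implementation, and the function name is invented):

```python
def edge_crossing(p0, p1, s0, s1, iso):
    """Return the point where the scalar reaches `iso` along the
    edge (p0, p1) with endpoint scalars s0 and s1, or None when
    the edge is not crossed."""
    if (s0 - iso) * (s1 - iso) > 0 or s0 == s1:
        return None
    t = (iso - s0) / (s1 - s0)
    return tuple(a + t * (b - a) for a, b in zip(p0, p1))
```

An edge from (0,0,0) to (1,0,0) with scalars 0 and 1, contoured at 0.25, yields the point (0.25, 0, 0); an edge whose scalars are both above the contour value yields no crossing.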


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 1,913: Line 2,317:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''High Point'''<br>''(HighPoint)''
|'''Input''' (Input)
|
Set the input to the Generic Contour
filter.
|
|
This property defines the other end of the direction vector (large scalar values).


| 0 0 1
|
|
The coordinate must lie within the bounding box of the dataset. It will default to the maximum in each dimension.
Accepts input of following types:
* vtkGenericDataSet
The dataset must contain a field array (point)


with 1 component(s).


|-
|-
| '''Input'''<br>''(Input)''
|'''Contour By''' (SelectInputScalars)
|
This property specifies the name of the scalar array
from which the contour filter will compute isolines and/or
isosurfaces.
|
|
This property specifies the input dataset to the Elevation filter.


|
|
An array of scalars is required. The value must be a field array name.
|-
|'''ComputeNormals''' (ComputeNormals)
|
Select whether to compute normals.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''ComputeGradients''' (ComputeGradients)
|
Select whether to compute gradients.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''ComputeScalars''' (ComputeScalars)
|
Select whether to compute scalars.
|
0
|
|
The selected object must be the result of the following: sources (includes readers), filters.
Accepts boolean values (0 or 1).
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
 
 
|-
|-
| '''Low Point'''<br>''(LowPoint)''
|'''Isosurfaces''' (ContourValues)
|
This property specifies the values at which to compute
isosurfaces/isolines and also the number of such
values.
|
|
This property defines one end of the direction vector (small scalar values).


| 0 0 0
|
|
The coordinate must lie within the bounding box of the dataset. It will default to the minimum in each dimension.
The value must lie within the range of the selected data array.
 
 
|-
|-
| '''Scalar Range'''<br>''(ScalarRange)''
|'''Point Merge Method''' (Locator)
|
This property specifies an incremental point locator for
merging duplicate / coincident points.
|
|
This property determines the range into which scalars will be mapped.


| 0 1
|
|
|}
The value can be one of the following:
* MergePoints (incremental_point_locators)
 
* IncrementalOctreeMergePoints (incremental_point_locators)


* NonMergingPointLocator (incremental_point_locators)
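The Elevation filter described above projects each point onto the line from the Low Point to the High Point and maps the projection parameter into the scalar range. A minimal sketch of that mapping, with the parameter clamped to the line segment; names are illustrative:

```python
def elevation(point, low, high, scalar_range=(0.0, 1.0)):
    """Scalar for `point`, from its projection onto the low-high line."""
    axis = [h - l for h, l in zip(high, low)]
    length2 = sum(a * a for a in axis)
    # Parametric position of the projection, clamped to [0, 1] so
    # points beyond the endpoints get the endpoint scalars.
    t = sum((p - l) * a for p, l, a in zip(point, low, axis)) / length2
    t = min(1.0, max(0.0, t))
    lo, hi = scalar_range
    return lo + t * (hi - lo)
```

With Low Point (0,0,0) and High Point (0,0,1), a point at z = 0.5 receives the midpoint of the scalar range regardless of its x and y coordinates.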


==Extract AMR Blocks==


|}


This filter extracts a list of datasets from hierarchical datasets.
==Convert AMR dataset to Multi-block==


This filter extracts a list of datasets from hierarchical datasets.<br>
Convert AMR to Multiblock


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 1,969: Line 2,404:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
|
This property specifies the input to the Extract Datasets filter.
This property specifies the input for this
 
filter.
|
|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkHierarchicalBoxDataSet.
|-
| '''Selected Data Sets'''<br>''(SelectedDataSets)''
|
|
This property provides a list of datasets to extract.
Accepts input of following types:
* vtkOverlappingAMR


|
|
|}
|}


==ConvertSelection==


==Extract Block==
Converts a selection from one type to
 
another.
 
This filter extracts a range of blocks from a multiblock dataset.
 
This filter extracts a range of groups from a multiblock dataset<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 2,005: Line 2,429:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Block Indices'''<br>''(BlockIndices)''
|'''DataInput''' (DataInput)
|
Set the vtkDataObject input used to convert the
selection.
|
|
This property lists the ids of the blocks to extract
from the input multiblock dataset.


|
|
Accepts input of following types:
* vtkDataObject
|-
|'''Input''' (Input)
|
Set the selection to convert.
|
|
|
Accepts input of following types:
* vtkSelection
|-
|-
| '''Input'''<br>''(Input)''
|'''OutputType''' (OutputType)
|
|
This property specifies the input to the Extract Group filter.
Set the ContentType for the output.
 
|
|
5
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The value(s) is an enumeration of the following:
 
* SELECTIONS (0)
 
* GLOBALIDs (1)
The selected dataset must be one of the following types (or a subclass of one of them): vtkMultiBlockDataSet.
* PEDIGREEIDS (2)
 
* VALUES (3)
 
* INDICES (4)
* FRUSTUM (5)
* LOCATION (6)
* THRESHOLDS (7)
|-
|-
| '''Maintain Structure'''<br>''(MaintainStructure)''
|'''ArrayNames''' (ArrayNames)
|
|
This is used only when PruneOutput is ON. By default, when pruning the
output i.e. remove empty blocks, if node has only 1 non-null child
block, then that node is removed. To preserve these parent nodes, set
this flag to true.


| 0
|
|
Only the values 0 and 1 are accepted.


|


|-
|-
| '''Prune Output'''<br>''(PruneOutput)''
|'''MatchAnyValues''' (MatchAnyValues)
|
|
When set, the output multiblock dataset will be pruned to remove empty
nodes. On by default.


| 1
|
|
Only the values 0 and 1 are accepted.
0
 
|
Accepts boolean values (0 or 1).


|}
|}


==Crop==


==Extract CTH Parts==
Efficiently extract an area/volume of interest from a 2-d image or 3-d volume.The Crop filter
 
extracts an area/volume of interest from a 2D image or a
 
3D volume by allowing the user to specify the minimum and
Create a surface from a CTH volume fraction.
maximum extents of each dimension of the data. Both the
 
input and output of this filter are uniform rectilinear
Extract CTH Parts is a specialized filter for visualizing the data from a CTH simulation. It first converts the selected cell-centered arrays to point-centered ones. It then contours each array at a value of 0.5. The user has the option of clipping the resulting surface(s) with a plane. This filter only operates on unstructured data. It produces polygonal output.<br>
data.
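Cropping keeps only the samples whose indices fall inside the requested minimum/maximum extent in each dimension. A minimal sketch on a nested-list volume, mirroring the inclusive OutputWholeExtent convention; the function name is invented for the example:

```python
def crop(volume, extent):
    """Extract the subvolume given by extent = (imin, imax, jmin,
    jmax, kmin, kmax), with inclusive index bounds per dimension."""
    imin, imax, jmin, jmax, kmin, kmax = extent
    return [[row[kmin:kmax + 1]
             for row in plane[jmin:jmax + 1]]
            for plane in volume[imin:imax + 1]]
```

Cropping a 4x4x4 volume with extent (1, 2, 1, 2, 1, 2) yields a 2x2x2 subvolume whose first sample is the input's sample at index (1, 1, 1).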


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 2,066: Line 2,499:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Double Volume Arrays'''<br>''(AddDoubleVolumeArrayName)''
|'''Input''' (Input)
|
This property specifies the input to the Crop
filter.
|
|
This property specifies the name(s) of the volume fraction array(s) for generating parts.


|
|
|
Accepts input of following types:
An array of scalars is required.
* vtkImageData
 
 
|-
|-
| '''Float Volume Arrays'''<br>''(AddFloatVolumeArrayName)''
|'''OutputWholeExtent''' (OutputWholeExtent)
|
|
This property specifies the name(s) of the volume fraction array(s) for generating parts.
This property gives the minimum and maximum point index
 
(extent) in each dimension for the output dataset.
|
|
0 0 0 0 0 0
|
|
An array of scalars is required.
The value(s) must lie within the structured-extents of the input dataset.


|}


|-
==Curvature==
| '''Unsigned Character Volume Arrays'''<br>''(AddUnsignedCharVolumeArrayName)''
|
This property specifies the name(s) of the volume fraction array(s) for generating parts.


|
This filter will compute the Gaussian or mean curvature of the mesh at each point.The
|
Curvature filter computes the curvature at each point in a
An array of scalars is required.
polygonal data set. This filter supports both Gaussian and
mean curvatures; the type can be selected from the
Curvature type menu button.
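For triangle meshes, Gaussian curvature at a vertex is commonly estimated from the angle defect: 2&pi; minus the sum of the incident triangle angles at that vertex. The sketch below illustrates that estimate (the filter's exact discretization may differ, and the function names are invented):

```python
import math

def corner_angle(apex, p, q):
    """Angle at `apex` in the triangle (apex, p, q)."""
    u = [a - b for a, b in zip(p, apex)]
    v = [a - b for a, b in zip(q, apex)]
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    return math.acos(dot / (nu * nv))

def angle_defect(vertex, incident_triangles):
    """Angle-defect estimate of Gaussian curvature (unnormalized).
    incident_triangles lists, per triangle, the (p, q) pair of
    vertices opposite `vertex`."""
    return 2.0 * math.pi - sum(corner_angle(vertex, p, q)
                               for p, q in incident_triangles)
```

A flat vertex surrounded by four right-angle corners has defect 0; a cube corner (three right-angle corners) has defect &pi;/2, reflecting its concentrated positive curvature.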


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|-
| '''Clip Type'''<br>''(ClipPlane)''
|'''Input''' (Input)
|
This property specifies the input to the Curvature
filter.
|
|
This property specifies whether to clip the dataset, and if so, it also specifies the parameters of the plane with which to clip.


|
|
|
Accepts input of following types:
The value must be set to one of the following: None, Plane, Box, Sphere.
* vtkPolyData
 
 
|-
|-
| '''Input'''<br>''(Input)''
|'''InvertMeanCurvature''' (InvertMeanCurvature)
|
|
This property specifies the input to the Extract CTH Parts filter.
If this property is set to 1, the mean curvature
 
calculation will be inverted. This is useful for meshes with
inward-pointing normals.
|
|
0
|
|
The selected object must be the result of the following: sources (includes readers), filters.
Accepts boolean values (0 or 1).
 
 
The dataset must contain a cell array with 1 components.
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
 
 
|-
|-
| '''Volume Fraction Value'''<br>''(VolumeFractionSurfaceValue)''
|'''CurvatureType''' (CurvatureType)
|
This property specifies which type of curvature to
compute.
|
|
The value of this property is the volume fraction value for the surface.
0
 
| 0.1
|
|
The value must be greater than or equal to 0 and less than or equal to 1.
The value(s) is an enumeration of the following:
 
* Gaussian (0)
* Mean (1)


|}
|}


==D3==


==Extract Cells By Region==
Repartition a data set into load-balanced spatially convex regions. Create ghost cells if requested.The D3 filter is
 
available when ParaView is run in parallel. It operates on
 
any type of data set to evenly divide it across the
This filter extracts cells that are inside/outside a region or at a region boundary.
processors into spatially contiguous regions. The output
 
of this filter is of type unstructured
This filter extracts from its input dataset all cells that are either completely inside or outside of a specified region (implicit function). On output, the filter generates an unstructured grid.<br>
grid.
To use this filter you must specify a region (implicit function). You must also specify whether to extract cells lying inside or outside of the region. An option exists to extract cells that are neither inside nor outside (i.e., on the boundary).<br>
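Inside/outside extraction evaluates the implicit function at a cell's points and keeps the cell when every sign agrees. A minimal sketch with a plane as the region, following the usual VTK convention that the implicit function is negative on the inside; names are illustrative:

```python
def plane(origin, normal):
    """Implicit plane f(p) = normal . (p - origin); negative inside."""
    def f(p):
        return sum(n * (a - o) for n, a, o in zip(normal, p, origin))
    return f

def classify_cell(points, f):
    """'inside' if every point has f < 0, 'outside' if every point
    has f > 0, otherwise 'boundary' (the cell intersects the region
    surface)."""
    signs = [f(p) for p in points]
    if all(s < 0 for s in signs):
        return "inside"
    if all(s > 0 for s in signs):
        return "outside"
    return "boundary"
```

A cell entirely below the z = 0 plane classifies as inside, one entirely above as outside, and one straddling it as boundary.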


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 2,149: Line 2,586:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
| '''Extract intersected'''<br>''(Extract intersected)''
|
This parameter controls whether to extract cells that are on the boundary of the region.
| 0
|
Only the values 0 and 1 are accepted.


|-
|-
| '''Extract only intersected'''<br>''(Extract only intersected)''
|'''Input''' (Input)
|
|
This parameter controls whether to extract only cells that are on the boundary of the region. If this parameter is set, the Extraction Side parameter is ignored. If Extract Intersected is off, this parameter has no effect.
This property specifies the input to the D3
 
filter.
| 0
|
|
Only the values 0 and 1 are accepted.


|-
| '''Extraction Side'''<br>''(ExtractInside)''
|
This parameter controls whether to extract cells that are inside or outside the region.
| 1
|
|
The value must be one of the following: outside (0), inside (1).
Accepts input of following types:
 
* vtkDataSet
 
|-
|-
| '''Intersect With'''<br>''(ImplicitFunction)''
|'''BoundaryMode''' (BoundaryMode)
|
|
This property sets the region used to extract cells.
This property determines how cells that lie on processor
 
boundaries are handled. The "Assign cells uniquely" option assigns each
boundary cell to exactly one process, which is useful for isosurfacing.
Selecting "Duplicate cells" causes the cells on the boundaries to be
copied to each process that shares that boundary. The "Divide cells"
option breaks cells across process boundary lines so that pieces of the
cell lie in different processes. This option is useful for volume
rendering.
|
|
0
|
|
The value must be set to one of the following: Plane, Box, Sphere.
The value(s) is an enumeration of the following:
 
* Assign cells uniquely (0)
 
* Duplicate cells (1)
* Divide cells (2)
|-
|-
| '''Input'''<br>''(Input)''
|'''Minimal Memory''' (UseMinimalMemory)
|
|
This property specifies the input to the Slice filter.
If this property is set to 1, the D3 filter requires
 
communication routines to use less memory than they would without this
restriction.
|
|
0
|
|
The selected object must be the result of the following: sources (includes readers), filters.
Accepts boolean values (0 or 1).
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
 


|}
|}


==Decimate==


==Extract Edges==
Simplify a polygonal model using an adaptive edge collapse algorithm. This filter works with triangles only.
 
The Decimate filter reduces the number of triangles in a
 
polygonal data set. Because this filter only operates on
Extract edges of 2D and 3D cells as lines.
triangles, first run the Triangulate filter on a dataset
 
that contains polygons other than
The Extract Edges filter produces a wireframe version of the input dataset by extracting all the edges of the dataset's cells as lines. This filter operates on any type of data set and produces polygonal output.<br>
triangles.


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 2,218: Line 2,643:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
This property specifies the input to the Decimate
filter.
|
|
This property specifies the input to the Extract Edges filter.


|
|
Accepts input of following types:
* vtkPolyData
|-
|'''TargetReduction''' (TargetReduction)
|
This property specifies the desired reduction in the
total number of polygons in the output dataset. For example, if the
TargetReduction value is 0.9, the Decimate filter will attempt to
produce an output dataset that is 10% the size of the
input.
|
0.9
|
|
The selected object must be the result of the following: sources (includes readers), filters.


|-
|'''PreserveTopology''' (PreserveTopology)
|
If this property is set to 1, decimation will not split
the dataset or produce holes, but it may keep the filter from reaching
the reduction target. If it is set to 0, better reduction can occur
(reaching the reduction target), but holes in the model may be
produced.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''FeatureAngle''' (FeatureAngle)
|
The value of this property is used in determining where
the data set may be split. If the angle between two adjacent triangles
is greater than or equal to the FeatureAngle value, then their boundary
is considered a feature edge where the dataset can be
split.
|
15.0
|


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
|-
 
|'''BoundaryVertexDeletion''' (BoundaryVertexDeletion)
|
If this property is set to 1, then vertices on the
boundary of the dataset can be removed. Setting the value of this
property to 0 preserves the boundary of the dataset, but it may cause
the filter not to reach its reduction target.
|
1
|
Accepts boolean values (0 or 1).


|}
|}
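The TargetReduction property above is a goal, not a guarantee. A minimal plain-Python sketch of the arithmetic it implies (the function name is illustrative, not part of ParaView's API):

```python
def decimate_target_size(input_polygons: int, target_reduction: float) -> int:
    """Approximate polygon count Decimate aims for: a TargetReduction of
    0.9 means the output should be about 10% of the input size."""
    if not 0.0 <= target_reduction <= 1.0:
        raise ValueError("target_reduction must be in [0, 1]")
    return round(input_polygons * (1.0 - target_reduction))

# With the default TargetReduction of 0.9, a 100,000-triangle mesh is
# reduced to roughly 10,000 triangles. PreserveTopology=1 or
# BoundaryVertexDeletion=0 may stop the filter before it gets there.
print(decimate_target_size(100000, 0.9))  # → 10000
```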


==Extract Generic Dataset Surface==
Extract geometry from a higher-order dataset.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
Set the input to the Generic Geometry Filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkGenericDataSet.

|-
| '''Pass Through Cell Ids'''<br>''(PassThroughCellIds)''
|
Select whether to forward original ids.
| 1
|
Only the values 0 and 1 are accepted.

|}

==Delaunay 2D==
Create 2D Delaunay triangulation of input points. It expects a vtkPointSet as input and produces vtkPolyData as output. The points are expected to be in a mostly planar distribution.

Delaunay2D is a filter that constructs a 2D Delaunay triangulation from a list of input points. These points may be represented by any dataset of type vtkPointSet and subclasses. The output of the filter is a polygonal dataset containing a triangle mesh. The 2D Delaunay triangulation is defined as the triangulation that satisfies the Delaunay criterion for n-dimensional simplexes (in this case n=2 and the simplexes are triangles). This criterion states that a circumsphere of each simplex in a triangulation contains only the n+1 defining points of the simplex. In two dimensions, this translates into an optimal triangulation. That is, the maximum interior angle of any triangle is less than or equal to that of any possible triangulation. Delaunay triangulations are used to build topological structures from unorganized (or unstructured) points. The input to this filter is a list of points specified in 3D, even though the triangulation is 2D. Thus the triangulation is constructed in the x-y plane, and the z coordinate is ignored (although carried through to the output). You can use the option ProjectionPlaneMode to compute the best-fitting plane to the set of points, project the points onto that plane, and then perform the triangulation using their projected positions; that plane then serves as the plane in which the triangulation is performed. The Delaunay triangulation can be numerically sensitive in some cases. To prevent problems, try to avoid injecting points that will result in triangles with bad aspect ratios (1000:1 or greater). In practice this means inserting points that are "widely dispersed", which enables a smooth transition of triangle sizes throughout the mesh. (You may even want to add extra points to create a better point distribution.) If numerical problems are present, you will see a warning message to this effect at the end of the triangulation process. Warning: points arranged on a regular lattice (termed degenerate cases) can be triangulated in more than one way (at least according to the Delaunay criterion). The choice of triangulation (as implemented by this algorithm) depends on the order of the input points. The first three points will form a triangle; other degenerate points will not break this triangle. Points that are coincident (or nearly so) may be discarded by the algorithm. This is because the Delaunay triangulation requires unique input points. The output of the Delaunay triangulation is supposedly a convex hull. In certain cases this implementation may not generate the convex hull.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the input dataset to the Delaunay 2D filter.
|
|
Accepts input of following types:
* vtkPointSet
|-
|'''ProjectionPlaneMode''' (ProjectionPlaneMode)
|
This property determines the type of projection plane to use in performing the triangulation.
|
0
|
The value(s) is an enumeration of the following:
* XY Plane (0)
* Best-Fitting Plane (2)
|-
|'''Alpha''' (Alpha)
|
The value of this property controls the output of this filter. For a non-zero alpha value, only edges or triangles contained within a sphere centered at mesh vertices will be output. Otherwise, only triangles will be output.
|
0.0
|
|-
|'''Tolerance''' (Tolerance)
|
This property specifies a tolerance to control discarding of closely spaced points. This tolerance is specified as a fraction of the diagonal length of the bounding box of the points.
|
0.00001
|
|-
|'''Offset''' (Offset)
|
This property is a multiplier to control the size of the initial, bounding Delaunay triangulation.
|
1.0
|
|-
|'''BoundingTriangulation''' (BoundingTriangulation)
|
If this property is set to 1, bounding triangulation points (and associated triangles) are included in the output. These are introduced as an initial triangulation to begin the triangulation process. This feature is nice for debugging output.
|
0
|
Accepts boolean values (0 or 1).
|}
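The Delaunay criterion described above (no point inside any triangle's circumcircle) can be checked numerically with the standard in-circle determinant test. A minimal plain-Python sketch, not ParaView code:

```python
def in_circumcircle(a, b, c, d):
    """Return True if point d lies strictly inside the circumcircle of
    triangle (a, b, c). Assumes a, b, c are in counterclockwise order;
    a Delaunay triangulation admits no such point d for any triangle."""
    m = []
    for (px, py) in (a, b, c):
        # Rows of the classic 3x3 in-circle matrix, translated so d is
        # the origin.
        m.append([px - d[0], py - d[1], (px - d[0]) ** 2 + (py - d[1]) ** 2])
    det = (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
           - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
           + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    return det > 0

tri = ((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))  # counterclockwise
print(in_circumcircle(*tri, (0.5, 0.5)))  # → True (circumcenter itself)
print(in_circumcircle(*tri, (2.0, 2.0)))  # → False (well outside)
```

Near-degenerate inputs (the regular-lattice warning above) make this determinant hover around zero, which is exactly where the filter's numerical sensitivity comes from.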


==Delaunay 3D==
Create a 3D Delaunay triangulation of input points. It expects a vtkPointSet as input and produces vtkUnstructuredGrid as output.

Delaunay3D is a filter that constructs a 3D Delaunay triangulation from a list of input points. These points may be represented by any dataset of type vtkPointSet and subclasses. The output of the filter is an unstructured grid dataset. Usually the output is a tetrahedral mesh, but if a non-zero alpha distance value is specified (called the "alpha" value), then only tetrahedra, triangles, edges, and vertices lying within the alpha radius are output. In other words, non-zero alpha values may result in arbitrary combinations of tetrahedra, triangles, lines, and vertices. (The notion of alpha value is derived from Edelsbrunner's work on "alpha shapes".) The 3D Delaunay triangulation is defined as the triangulation that satisfies the Delaunay criterion for n-dimensional simplexes (in this case n=3 and the simplexes are tetrahedra). This criterion states that a circumsphere of each simplex in a triangulation contains only the n+1 defining points of the simplex. (See text for more information.) While in two dimensions this translates into an "optimal" triangulation, this is not true in 3D, since a measurement for optimality in 3D is not agreed on. Delaunay triangulations are used to build topological structures from unorganized (or unstructured) points. The input to this filter is a list of points specified in 3D. (If you wish to create 2D triangulations see Delaunay2D.) The output is an unstructured grid. The Delaunay triangulation can be numerically sensitive. To prevent problems, try to avoid injecting points that will result in triangles with bad aspect ratios (1000:1 or greater). In practice this means inserting points that are "widely dispersed", which enables a smooth transition of triangle sizes throughout the mesh. (You may even want to add extra points to create a better point distribution.) If numerical problems are present, you will see a warning message to this effect at the end of the triangulation process. Warning: points arranged on a regular lattice (termed degenerate cases) can be triangulated in more than one way (at least according to the Delaunay criterion). The choice of triangulation (as implemented by this algorithm) depends on the order of the input points. The first four points will form a tetrahedron; other degenerate points (relative to this initial tetrahedron) will not break it. Points that are coincident (or nearly so) may be discarded by the algorithm. This is because the Delaunay triangulation requires unique input points. You can control the definition of coincidence with the "Tolerance" instance variable. The output of the Delaunay triangulation is supposedly a convex hull. In certain cases this implementation may not generate the convex hull. This behavior can be controlled by the Offset instance variable. Offset is a multiplier used to control the size of the initial triangulation. The larger the offset value, the more likely you will generate a convex hull; and the more likely you are to see numerical problems. The implementation of this algorithm varies from the 2D Delaunay algorithm (i.e., Delaunay2D) in an important way. When points are injected into the triangulation, the search for the enclosing tetrahedron is quite different. In the 3D case, the closest previously inserted point is found, and then the connected tetrahedra are searched to find the containing one. (In 2D, a "walk" towards the enclosing triangle is performed.) If the triangulation is Delaunay, then an enclosing tetrahedron will be found. However, in degenerate cases an enclosing tetrahedron may not be found and the point will be rejected.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the input dataset to the Delaunay 3D filter.
|
|
Accepts input of following types:
* vtkPointSet
|-
|'''Alpha''' (Alpha)
|
This property specifies the alpha (or distance) value to control the output of this filter. For a non-zero alpha value, only edges, faces, or tetra contained within the circumsphere (of radius alpha) will be output. Otherwise, only tetrahedra will be output.
|
0.0
|
|-
|'''Tolerance''' (Tolerance)
|
This property specifies a tolerance to control discarding of closely spaced points. This tolerance is specified as a fraction of the diagonal length of the bounding box of the points.
|
0.001
|
|-
|'''Offset''' (Offset)
|
This property specifies a multiplier to control the size of the initial, bounding Delaunay triangulation.
|
2.5
|
|-
|'''BoundingTriangulation''' (BoundingTriangulation)
|
This boolean controls whether bounding triangulation points (and associated triangles) are included in the output. (These are introduced as an initial triangulation to begin the triangulation process. This feature is nice for debugging output.)
|
0
|
Accepts boolean values (0 or 1).
|-
|'''AlphaTets''' (AlphaTets)
|
This boolean controls whether tetrahedra which satisfy the alpha criterion are output when alpha is non-zero.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''AlphaTris''' (AlphaTris)
|
This boolean controls whether triangles which satisfy the alpha criterion are output when alpha is non-zero.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''AlphaLines''' (AlphaLines)
|
This boolean controls whether lines which satisfy the alpha criterion are output when alpha is non-zero.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''AlphaVerts''' (AlphaVerts)
|
This boolean controls whether vertices which satisfy the alpha criterion are output when alpha is non-zero.
|
0
|
Accepts boolean values (0 or 1).
|}

==Extract Level==
This filter extracts a range of groups from a hierarchical dataset.

This filter extracts a range of levels from a hierarchical dataset.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Extract Group filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkHierarchicalBoxDataSet.

|-
| '''Levels'''<br>''(Levels)''
|
This property lists the levels to extract from the input hierarchical dataset.
|
|
|}

==Extract Selection==
Extract different type of selections.

This filter extracts a set of cells/points given a selection.<br>
The selection can be obtained from a rubber-band selection<br>
(either cell, visible or in a frustum) or threshold selection<br>
and passed to the filter or specified by providing an ID list.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input from which the selection is extracted.
|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet, vtkTable.

|-
| '''Preserve Topology'''<br>''(PreserveTopology)''
|
If this property is set to 1 the output preserves the topology of its input and adds an insidedness array to mark which cells are inside or out. If 0 then the output is an unstructured grid which contains only the subset of cells that are inside.
| 0
|
Only the values 0 and 1 are accepted.

|-
| '''Selection'''<br>''(Selection)''
|
The input that provides the selection object.
|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkSelection.

|-
| '''Show Bounds'''<br>''(ShowBounds)''
|
For frustum selection, if this property is set to 1 the output is the outline of the frustum instead of the contents of the input that lie within the frustum.
| 0
|
Only the values 0 and 1 are accepted.

|}
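The ID-list and PreserveTopology behaviors described above can be sketched in a few lines of plain Python (the function and the 0/1 insidedness encoding are illustrative only, not ParaView's implementation):

```python
def extract_by_ids(cells, selected_ids, preserve_topology=False):
    """Mimic Extract Selection driven by an ID list.

    With preserve_topology=False the output keeps only the selected
    cells; with preserve_topology=True every cell is kept and a flag
    (here 1 for inside, 0 for outside) marks the selection, standing in
    for the filter's insidedness array."""
    ids = set(selected_ids)
    if preserve_topology:
        return [(cid, 1 if cid in ids else 0) for cid in cells]
    return [cid for cid in cells if cid in ids]

print(extract_by_ids([0, 1, 2, 3, 4], [1, 3]))  # → [1, 3]
print(extract_by_ids([0, 1, 2], [1], preserve_topology=True))
# → [(0, 0), (1, 1), (2, 0)]
```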


==Extract Subset==
Extract a subgrid from a structured grid with the option of setting subsample strides.

The Extract Grid filter returns a subgrid of a structured input data set (uniform rectilinear, curvilinear, or nonuniform rectilinear). The output data set type of this filter is the same as the input type.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Include Boundary'''<br>''(IncludeBoundary)''
|
If the value of this property is 1, then if the sample rate in any dimension is greater than 1, the boundary indices of the input dataset will be passed to the output even if the boundary extent is not an even multiple of the sample rate in a given dimension.
| 0
|
Only the values 0 and 1 are accepted.

|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Extract Grid filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkImageData, vtkRectilinearGrid, vtkStructuredPoints, vtkStructuredGrid.

|-
| '''Sample Rate I'''<br>''(SampleRateI)''
|
This property indicates the sampling rate in the I dimension. A value greater than 1 results in subsampling; every nth index will be included in the output.
| 1
|
The value must be greater than or equal to 1.

|-
| '''Sample Rate J'''<br>''(SampleRateJ)''
|
This property indicates the sampling rate in the J dimension. A value greater than 1 results in subsampling; every nth index will be included in the output.
| 1
|
The value must be greater than or equal to 1.

|-
| '''Sample Rate K'''<br>''(SampleRateK)''
|
This property indicates the sampling rate in the K dimension. A value greater than 1 results in subsampling; every nth index will be included in the output.
| 1
|
The value must be greater than or equal to 1.

|-
| '''VOI'''<br>''(VOI)''
|
This property specifies the minimum and maximum point indices along each of the I, J, and K axes; these values indicate the volume of interest (VOI). The output will have the (I,J,K) extent specified here.
| 0 0 0 0 0 0
|
The values must lie within the extent of the input dataset.

|}

==Descriptive Statistics==
Compute a statistical model of a dataset and/or assess the dataset with a statistical model.

This filter either computes a statistical model of a dataset or takes such a model as its second input. Then, the model (however it is obtained) may optionally be used to assess the input dataset.

This filter computes the min, max, mean, raw moments M2 through M4, standard deviation, skewness, and kurtosis for each array you select. The model is simply a univariate Gaussian distribution with the mean and standard deviation provided. Data is assessed using this model by detrending the data (i.e., subtracting the mean) and then dividing by the standard deviation. Thus the assessment is an array whose entries are the number of standard deviations from the mean that each input point lies.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
The input to the filter. Arrays from this dataset will be used for computing statistics and/or assessed by a statistical model.
|
|
Accepts input of following types:
* vtkImageData
* vtkStructuredGrid
* vtkPolyData
* vtkUnstructuredGrid
* vtkTable
* vtkGraph
The dataset must contain a field array ()
|-
|'''ModelInput''' (ModelInput)
|
A previously-calculated model with which to assess a separate dataset. This input is optional.
|
|
Accepts input of following types:
* vtkTable
* vtkMultiBlockDataSet
|-
|'''AttributeMode''' (AttributeMode)
|
Specify which type of field data the arrays will be drawn from.
|
0
|
The value must be a field array name.
|-
|'''Variables of Interest''' (SelectArrays)
|
Choose arrays whose entries will be used to form observations for statistical analysis.
|
|
|-
|'''Task''' (Task)
|
Specify the task to be performed: modeling and/or assessment. <ol> <li> "Detailed model of input data," creates a set of output tables containing a calculated statistical model of the <b>entire</b> input dataset;</li> <li> "Model a subset of the data," creates an output table (or tables) summarizing a <b>randomly-chosen subset</b> of the input dataset;</li> <li> "Assess the data with a model," adds attributes to the first input dataset using a model provided on the second input port; and</li> <li> "Model and assess the same data," is really just operations 2 and 3 above applied to the same input dataset. The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.</li> </ol> When the task includes creating a model (i.e., tasks 2 and 4), you may adjust the fraction of the input dataset used for training. You should avoid using a large fraction of the input data for training as you will then not be able to detect overfitting. The <i>Training fraction</i> setting will be ignored for tasks 1 and 3.
|
3
|
The value(s) is an enumeration of the following:
* Detailed model of input data (0)
* Model a subset of the data (1)
* Assess the data with a model (2)
* Model and assess the same data (3)
|-
|'''TrainingFraction''' (TrainingFraction)
|
Specify the fraction of values from the input dataset to be used for model fitting. The exact set of values is chosen at random from the dataset.
|
0.1
|
|-
|'''Deviations should be''' (SignedDeviations)
|
Should the assessed values be signed deviations or unsigned?
|
0
|
The value(s) is an enumeration of the following:
* Unsigned (0)
* Signed (1)
|}
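The model/assess split described above amounts to fitting a mean and standard deviation, then reporting each value's (signed) distance from the mean in standard deviations. A minimal stdlib-only sketch (illustrative names; the population standard deviation is used here, and the filter's exact estimator may differ):

```python
from math import sqrt

def model(values):
    """Fit the univariate Gaussian model: mean and (population)
    standard deviation of the array."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    return mean, sqrt(var)

def assess(values, mean, stddev):
    """Assess data against the model: detrend (subtract the mean) and
    divide by the standard deviation, giving each point's distance
    from the mean in standard deviations."""
    return [(v - mean) / stddev for v in values]

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
mean, stddev = model(data)          # mean 5.0, stddev 2.0
print(assess(data, mean, stddev))   # first entry: (2 - 5) / 2 = -1.5
```

Choosing "Unsigned" for the Deviations setting corresponds to taking the absolute value of each assessed entry.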


==Extract Surface==
Extract a 2D boundary surface using neighbor relations to eliminate internal faces.

The Extract Surface filter extracts the polygons forming the outer surface of the input dataset. This filter operates on any type of data and produces polygonal data as output.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Extract Surface filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.

|-
| '''Nonlinear Subdivision Level'''<br>''(NonlinearSubdivisionLevel)''
|
If the input is an unstructured grid with nonlinear faces, this parameter determines how many times the face is subdivided into linear faces. If 0, the output is the equivalent of its linear counterpart (and the midpoints determining the nonlinear interpolation are discarded). If 1, the nonlinear face is triangulated based on the midpoints. If greater than 1, the triangulated pieces are recursively subdivided to reach the desired subdivision. Setting the value to greater than 1 may cause some point data to not be passed even if no quadratic faces exist. This option has no effect if the input is not an unstructured grid.
| 1
|
The value must be greater than or equal to 0 and less than or equal to 4.

|-
| '''Piece Invariant'''<br>''(PieceInvariant)''
|
If the value of this property is set to 1, internal surfaces along process boundaries will be removed. NOTE: Enabling this option might cause multiple executions of the data source because more information is needed to remove internal surfaces.
| 1
|
Only the values 0 and 1 are accepted.

|}

==Elevation==
Create point attribute array by projecting points onto an elevation vector.

The Elevation filter generates point scalar values for an input dataset along a specified direction vector. The Input menu allows the user to select the data set to which this filter will be applied. Use the Scalar range entry boxes to specify the minimum and maximum scalar value to be generated. The Low Point and High Point define a line onto which each point of the data set is projected. The minimum scalar value is associated with the Low Point, and the maximum scalar value is associated with the High Point. The scalar value for each point in the data set is determined by the location along the line to which that point projects.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the input dataset to the Elevation filter.
|
|
Accepts input of following types:
* vtkDataSet
|-
|'''ScalarRange''' (ScalarRange)
|
This property determines the range into which scalars will be mapped.
|
0 1
|
|-
|'''Low Point''' (LowPoint)
|
This property defines one end of the direction vector (small scalar values).
|
0 0 0
|
The value must lie within the bounding box of the dataset.

It will default to the min in each dimension.

|-
|'''High Point''' (HighPoint)
|
This property defines the other end of the direction vector (large scalar values).
|
0 0 1
|
The value must lie within the bounding box of the dataset.

It will default to the max in each dimension.

|}
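The projection the Elevation filter applies can be sketched in plain Python (illustrative function, not ParaView's implementation): project each point onto the Low Point to High Point line, clamp the normalized position, and map it into the scalar range.

```python
def elevation_scalar(point, low_point, high_point, scalar_range=(0.0, 1.0)):
    """Project `point` onto the Low Point -> High Point line and map the
    clamped, normalized position into the scalar range."""
    direction = [h - l for l, h in zip(low_point, high_point)]
    length2 = sum(d * d for d in direction)
    t = sum((p - l) * d
            for p, l, d in zip(point, low_point, direction)) / length2
    t = min(1.0, max(0.0, t))          # points beyond the ends saturate
    lo, hi = scalar_range
    return lo + t * (hi - lo)

# With the default line (0,0,0) -> (0,0,1), the scalar is the clamped z:
print(elevation_scalar((0.3, 0.7, 0.25), (0, 0, 0), (0, 0, 1)))  # → 0.25
```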




==FFT Of Selection Over Time==


Extracts selection over time and plots the FFT.

Extracts the data of a selection (e.g. points or cells) over time,<br>
takes the FFT of them, and plots them.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
The input from which the selection is extracted.
|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet, vtkTable, vtkCompositeDataSet.

|-
| '''Selection'''<br>''(Selection)''
|
The input that provides the selection object.
|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkSelection.

|}

==Environment Annotation==
Allows annotation of user name, date/time, OS, and possibly filename.

Apply to any source. The GUI allows manual selection of the desired annotation options. If the source is a file, the filter can display the filename.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
Set the input of the filter.
|
|
Accepts input of following types:
* vtkDataObject
|-
|'''DisplayUserName''' (DisplayUserName)
|
Toggle User Name Visibility.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''DisplaySystemName''' (DisplaySystemName)
|
Toggle System Name Visibility.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''DisplayDate''' (DisplayDate)
|
Toggle Date/Time Visibility.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''DisplayFileName''' (DisplayFileName)
|
Toggle File Name Visibility.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''FileName''' (FileName)
|
Annotation of file name.
|
|
|}
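The FFT Of Selection Over Time filter above boils down to a discrete Fourier transform of a quantity sampled at each timestep. A minimal stdlib-only sketch of that transform (illustrative, not the filter's implementation, which operates on VTK arrays):

```python
import cmath

def dft_magnitudes(samples):
    """Magnitudes of the plain O(n^2) discrete Fourier transform of a
    real-valued time series (one value per timestep)."""
    n = len(samples)
    return [abs(sum(samples[k] * cmath.exp(-2j * cmath.pi * j * k / n)
                    for k in range(n)))
            for j in range(n)]

# A point value sampled over 8 timesteps, oscillating twice per window:
series = [0.0, 1.0, 0.0, -1.0, 0.0, 1.0, 0.0, -1.0]
mags = dft_magnitudes(series)
# The spectrum peaks at frequency bin 2, matching the two oscillations.
```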


==Extract AMR Blocks==


==FOF/SOD Halo Finder==
This filter extracts a list of datasets from hierarchical datasets.This filter extracts a list
 
of datasets from hierarchical datasets.
 
Sorry, no help is currently available.
 


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 2,573: Line 3,256:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''bb (linking length)'''<br>''(BB)''
|'''Input''' (Input)
|
This property specifies the input to the Extract
Datasets filter.
|
|
Linking length measured in units of interparticle spacing and is dimensionless.  Used to link particles into halos for the friends-of-friends (FOF) algorithm.


| 0.2
|
|
The value must be greater than or equal to 0.
Accepts input of following types:
 
* vtkUniformGridAMR
 
|-
|-
| '''Compute the most bound particle'''<br>''(ComputeMostBoundParticle)''
|'''SelectedDataSets''' (SelectedDataSets)
|
This property provides a list of datasets to
extract.
|
|
If checked, the most bound particle for an FOF halo will be calculated.  WARNING: This can be very slow.


| 0
|
|
Only the values 0 and 1 are accepted.




|-
|}
| '''Compute the most connected particle'''<br>''(ComputeMostConnectedParticle)''
 
|
==Extract Attributes==
If checked, the most connected particle for an FOF halo will be calculated.  WARNING: This can be very slow.


| 0
Extract attribute data as a table.This is a
|
filter that produces a vtkTable from the chosen attribute
Only the values 0 and 1 are accepted.
in the input dataobject. This filter can accept composite
datasets. If the input is a composite dataset, the output
is a multiblock with vtkTable leaves.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|-
| '''Compute spherical overdensity (SOD) halos'''<br>''(ComputeSOD)''
|'''Input''' (Input)
|
This property specifies the input of the
filter.
|
|
If checked, spherical overdensity (SOD) halos will be calculated in addition to friends-of-friends (FOF) halos.


| 0
|
|
Only the values 0 and 1 are accepted.
Accepts input of following types:
 
* vtkDataObject
 
|-
|-
| '''Copy FOF halo catalog to original particles'''<br>''(CopyHaloDataToParticles)''
|'''FieldAssociation''' (FieldAssociation)
|
|
If checked, the friends-of-friends (FOF) halo catalog information will be copied to the original particles as well.
Select the attribute data to pass.
 
|
| 0
0
|
|
Only the values 0 and 1 are accepted.
The value(s) is an enumeration of the following:
 
* Points (0)
 
* Cells (1)
* Field Data (2)
* Vertices (4)
* Edges (5)
* Rows (6)
|-
|-
| '''Input'''<br>''(Input)''
|'''AddMetaData''' (AddMetaData)
| This property specifies the input of the filter.
|
It is possible for this filter to add additional
meta-data to the field data such as point coordinates (when point
attributes are selected and input is pointset) or structured
coordinates etc. To enable this addition of extra information, turn
this flag on. Off by default.
|
|
0
|
|
The selected object must be the result of the following: sources (includes readers), filters.
Accepts boolean values (0 or 1).


|}


The selected dataset must be one of the following types (or a subclass of one of them): vtkUnstructuredGrid.
==Extract Bag Plots==


Extract Bag Plots.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''maximum radius factor'''<br>''(MaxRadiusFactor)''
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input to the
filter.
|
|
Maximum radius factor for SOD finding.


| 2
|
|
The value must be greater than or equal to 0.
Accepts input of following types:
* vtkTable
The dataset must contain a field array (row)


with 1 component(s).


|-
|-
| '''minimum FOF mass'''<br>''(MinFOFMass)''
|'''Variables of Interest''' (SelectArrays)
|
|
Minimum FOF mass to calculate an SOD halo.
Choose arrays whose entries will be used to form
 
observations for statistical analysis.
| 5e+12
|
|
|-
| '''minimum FOF size'''<br>''(MinFOFSize)''
|
Minimum FOF halo size to calculate an SOD halo.


| 1000
|
|
The value must be greater than or equal to 0.


|-
|-
| '''minimum radius factor'''<br>''(MinRadiusFactor)''
|'''Process the transposed of the input table''' (TransposeTable)
|
This flag indicates if the input table must
be transposed first.
|
|
Minimum radius factor for SOD finding.
1
 
| 0.5
|
|
The value must be greater than or equal to 0.
Accepts boolean values (0 or 1).
 
 
|-
|-
| '''np (number of seeded particles in one dimension, i.e., total particles = np^3)'''<br>''(NP)''
|'''RobustPCA''' (RobustPCA)
|
|
Number of seeded particles in one dimension.  Therefore, total simulation particles is np^3 (cubed).
This flag indicates if the PCA should be run
 
in robust mode or not.
| 256
|
0
|
|
The value must be greater than or equal to 0.
Accepts boolean values (0 or 1).
 
 
|-
|-
| '''overlap (shared point/ghost cell gap distance)'''<br>''(Overlap)''
|'''HDR smoothing parameter''' (Sigma)
|
Specify the smoothing parameter of the
HDR.
|
|
The space (in rL units) to extend processor particle ownership for ghost particles/cells.  Needed for correct halo calculation when halos cross processor boundaries in parallel computation.
1
 
| 5
|
|
The value must be greater than or equal to 0.


|-
|-
| '''pmin (minimum particle threshold for an FOF halo)'''<br>''(PMin)''
|'''GridSize''' (GridSize)
|
Minimum number of particles (threshold) needed before a group is called a friends-of-friends (FOF) halo.
 
| 100
|
|
The value must be greater than or equal to 1.


Width and height of the grid image to perform the PCA on.


|-
| '''rL (physical box side length)'''<br>''(RL)''
|
|
The box side length used to wrap particles around if they exceed rL (or less than 0) in any dimension (only positive positions are allowed in the input, or they are wrapped around).
100
 
| 100
|
|
The value must be greater than or equal to 0.




|}
==Extract Block==
This filter extracts a range of blocks from a multiblock dataset.This filter extracts a range
of groups from a multiblock dataset
{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''rho_c'''<br>''(RhoC)''
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input to the Extract Group
filter.
|
|
rho_c (critical density) for SOD halo finding.


| 2.77537e+11
|
|
Accepts input of following types:
* vtkMultiBlockDataSet
|-
|-
| '''number of bins'''<br>''(SODBins)''
|'''BlockIndices''' (BlockIndices)
|
This property lists the ids of the blocks to extract
from the input multiblock dataset.
|
|
Number of bins for SOD finding.


| 20
|
|
The value must be greater than or equal to 1.


|-
|-
| '''initial SOD center'''<br>''(SODCenterType)''
|'''PruneOutput''' (PruneOutput)
|
|
The initial friends-of-friends (FOF) center used for calculating a spherical overdensity (SOD) halo. WARNING: Using MBP or MCP can be very slow.
When set, the output mutliblock dataset will be pruned
 
to remove empty nodes. On by default.
| 0
|
1
|
|
The value must be one of the following: Center of mass (0), Average position (1), Most bound particle (2), Most connected particle (3).
Accepts boolean values (0 or 1).
 
 
|-
|-
| '''initial SOD mass'''<br>''(SODMass)''
|'''MaintainStructure''' (MaintainStructure)
|
|
The initial SOD mass.
This is used only when PruneOutput is ON. By default,
 
when pruning the output (i.e., removing empty blocks), if a node has only one
| 1e+14
non-null child block, then that node is removed. To preserve these
parent nodes, set this flag to true.
|
0
|
|
The value must be greater than or equal to 0.
Accepts boolean values (0 or 1).
 


|}
|}
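The interaction between '''PruneOutput''' and '''MaintainStructure''' can be sketched on a toy block tree (plain Python, not ParaView's implementation; `prune` and the list-of-lists tree are illustrative):

```python
def prune(node, maintain_structure=False):
    """Prune empty blocks from a nested multiblock tree.

    Leaves are dataset names (strings); None marks an empty block;
    lists are intermediate nodes.
    """
    if not isinstance(node, list):
        return node
    children = [prune(c, maintain_structure) for c in node]
    children = [c for c in children if c is not None]
    if not children:
        return None          # the whole node became empty
    if len(children) == 1 and not maintain_structure:
        return children[0]   # collapse nodes with a single non-null child
    return children

tree = [["blockA", None], None, ["blockB", "blockC"]]
print(prune(tree))                            # single-child node collapsed
print(prune(tree, maintain_structure=True))   # parent node preserved
```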


==Extract CTH Parts==


==Feature Edges==
Create a surface from a CTH volume fraction.Extract
 
CTH Parts is a specialized filter for visualizing the data
 
from a CTH simulation. It first converts the selected
This filter will extract edges along sharp edges of surfaces or boundaries of surfaces.
cell-centered arrays to point-centered ones. It then
 
contours each array at a value of 0.5. The user has the
The Feature Edges filter extracts various subsets of edges from the input data set. This filter operates on polygonal data and produces polygonal output.<br>
option of clipping the resulting surface(s) with a plane.
This filter only operates on unstructured data. It
produces polygonal output.


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Boundary Edges'''<br>''(BoundaryEdges)''
|'''Input''' (Input)
|
This property specifies the input to the Extract CTH
Parts filter.
|
|
If the value of this property is set to 1, boundary edges will be extracted. Boundary edges are defined as line cells or edges that are used by only one polygon.


| 1
|
|
Only the values 0 and 1 are accepted.
Accepts input of following types:
* vtkDataSet
The dataset must contain a field array (cell)


with 1 component(s).


|-
|-
| '''Coloring'''<br>''(Coloring)''
|'''Clip Type''' (ClipPlane)
|
This property specifies whether to clip the dataset, and
if so, it also specifies the parameters of the plane with which to
clip.
|
|
If the value of this property is set to 1, then the extracted edges are assigned a scalar value based on the type of the edge.


| 0
|
|
Only the values 0 and 1 are accepted.
The value can be one of the following:
* None (implicit_functions)


* Plane (implicit_functions)


|-
|-
| '''Feature Angle'''<br>''(FeatureAngle)''
|'''Volume Arrays''' (VolumeArrays)
|
This property specifies the name(s) of the volume
fraction array(s) for generating parts.
|
|
The value of this property is used to define a feature edge. If the angle between the surface normals of two adjacent triangles is at least as large as this Feature Angle, a feature edge exists. (See the FeatureEdges property.)


| 30
|
|
The value must be greater than or equal to 0 and less than or equal to 180.
An array of scalars is required.
 
 
|-
|-
| '''Feature Edges'''<br>''(FeatureEdges)''
|'''Volume Fraction Value''' (VolumeFractionSurfaceValue)
|
The value of this property is the volume fraction value
for the surface.
|
0.1
|
|
If the value of this property is set to 1, feature edges will be extracted. Feature edges are defined as edges that are used by two polygons whose dihedral angle is greater than the feature angle. (See the FeatureAngle property.)
Toggle whether to extract feature edges.


| 1
|-
|'''CapSurfaces''' (CapSurfaces)
|
|
Only the values 0 and 1 are accepted.


When enabled, volume surfaces are capped to produce visually closed
surface.


|
1
|
Accepts boolean values (0 or 1).
|-
|-
| '''Input'''<br>''(Input)''
|'''RemoveGhostCells''' (RemoveGhostCells)
|
|
This property specifies the input to the Feature Edges filter.
 
When set to false, the output surfaces will not hide contours
extracted from ghost cells. This results in overlapping contours but
overcomes holes. Default is set to true.


|
|
1
|
|
The selected object must be the result of the following: sources (includes readers), filters.
Accepts boolean values (0 or 1).
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.
 
 
|-
|-
| '''Manifold Edges'''<br>''(ManifoldEdges)''
|'''GenerateTriangles''' (GenerateTriangles)
|
If the value of this property is set to 1, manifold edges will be extracted. Manifold edges are defined as edges that are used by exactly two polygons.
 
| 0
|
|
Only the values 0 and 1 are accepted.


Triangulate results. When set to false, the internal cut and contour filters
are told not to triangulate results if possible.


|-
| '''Non-Manifold Edges'''<br>''(NonManifoldEdges)''
|
|
If the value of this property is set to 1, non-manifold edges will be extracted. Non-manifold edges are defined as edges that are used by three or more polygons.
0
 
| 1
|
|
Only the values 0 and 1 are accepted.
Accepts boolean values (0 or 1).
 


|}
|}
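The feature-edge test above reduces to comparing the angle between the unit normals of the two polygons sharing an edge against the Feature Angle. A minimal sketch (plain Python; `is_feature_edge` is an illustrative name, not VTK API):

```python
import math

def is_feature_edge(n1, n2, feature_angle_deg=30.0):
    """An edge shared by two polygons is a feature edge when the angle
    between their unit normals meets or exceeds the feature angle."""
    dot = sum(a * b for a, b in zip(n1, n2))
    dot = max(-1.0, min(1.0, dot))  # guard against rounding outside [-1, 1]
    return math.degrees(math.acos(dot)) >= feature_angle_deg

# Coplanar polygons (angle 0) are not a feature edge; a 90-degree crease is.
print(is_feature_edge((0, 0, 1), (0, 0, 1)))  # False
print(is_feature_edge((0, 0, 1), (1, 0, 0)))  # True
```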


==Extract Cells By Region==


==Gaussian Resampling==
This filter extracts cells that are inside/outside a region or at a region boundary.
 
This filter extracts from its input dataset all cells that are either
 
completely inside or outside of a specified region (implicit function).
Splat points into a volume with an elliptical, Gaussian distribution.
On output, the filter generates an unstructured grid. To use this filter
 
you must specify a region (implicit function). You must also specify
vtkGaussianSplatter is a filter that injects input points into a<br>
whether to extract cells lying inside or outside of the region. An
structured points (volume) dataset. As each point is injected, it "splats"<br>
option exists to extract cells that are neither inside nor outside (i.e.,
or distributes values to nearby voxels. Data is distributed using an<br>
boundary).
elliptical, Gaussian distribution function. The distribution function is<br>
modified using scalar values (expands distribution) or normals<br>
(creates ellipsoidal distribution rather than spherical).<br><br><br>
Warning: results may be incorrect in parallel as points can't splat<br>
into other processor's cells.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Splat Accumulation Mode'''<br>''(Accumulation Mode)''
|'''Input''' (Input)
|
This property specifies the input to the Slice
filter.
|
|
Specify the scalar accumulation mode. This mode expresses how scalar values are combined when splats overlap. The Max mode acts like a set union operation and is the most commonly used; the Min mode acts like a set intersection; the Sum mode simply adds the contributions.


| 1
|
|
The value must be one of the following: Min (0), Max (1), Sum (2).
Accepts input of following types:
 
* vtkDataSet
 
|-
|-
| '''Fill Value'''<br>''(CapValue)''
|'''Intersect With''' (ImplicitFunction)
|
This property sets the region used to extract
cells.
|
|
Specify the cap value to use. (This instance variable only has effect if the ivar Capping is on.)


| 0
|
|
The value can be one of the following:
* Plane (implicit_functions)
* Box (implicit_functions)
* Sphere (implicit_functions)
|-
|-
| '''Fill Volume Boundary'''<br>''(Capping)''
|'''InputBounds''' (InputBounds)
|
|
Turn on/off the capping of the outer boundary of the volume to a specified cap value. This can be used to close surfaces (after iso-surfacing) and create other effects.


| 1
|
|
Only the values 0 and 1 are accepted.


|


|-
|-
| '''Elliptical Eccentricity'''<br>''(Eccentricity)''
|'''Extraction Side''' (ExtractInside)
|
|
Control the shape of elliptical splatting. Eccentricity is the ratio of the major axis (aligned along the normal) to the minor axes (aligned along the other two axes). An Eccentricity greater than 1 creates needles with the long axis in the direction of the normal; an Eccentricity less than 1 creates pancakes perpendicular to the normal vector.
This parameter controls whether to extract cells that
 
are inside or outside the region.
| 2.5
|
1
|
|
The value(s) is an enumeration of the following:
* outside (0)
* inside (1)
|-
|-
| '''Gaussian Exponent Factor'''<br>''(ExponentFactor)''
|'''Extract only intersected''' (Extract only intersected)
|
|
Set / get the sharpness of decay of the splats. This is the exponent constant in the Gaussian equation. Normally this is a negative value.
This parameter controls whether to extract only cells
 
that are on the boundary of the region. If this parameter is set, the
| -5
Extraction Side parameter is ignored. If Extract Intersected is off,
this parameter has no effect.
|
0
|
|
The value must be less than or equal to 0.
Accepts boolean values (0 or 1).
 
 
|-
|-
| '''Input'''<br>''(Input)''
|'''Extract intersected''' (Extract intersected)
|
|
This property specifies the input to the filter.
This parameter controls whether to extract cells that
 
are on the boundary of the region.
|
|
0
|
|
The selected object must be the result of the following: sources (includes readers), filters.
Accepts boolean values (0 or 1).


|}


The dataset must contain a point array with 1 component.
==Extract Component==


This filter extracts a component of a multi-component attribute array.


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
{| class="PropertiesTable" border="1" cellpadding="5"
 
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|-
| '''Extent to Resample'''<br>''(ModelBounds)''
|'''Input''' (Input)
|
|
Set / get the (xmin,xmax, ymin,ymax, zmin,zmax) bounding box in which the sampling is performed. If any of the (min,max) bounds values are min >= max, then the bounds will be computed automatically from the input data. Otherwise, the user-specified bounds will be used.


| 0 0 0 0 0 0
This property specifies the input of the Extract Component filter.
 
|
|
|-
 
| '''Elliptical Splats'''<br>''(NormalWarping)''
|
|
Turn on/off the generation of elliptical splats. If normal warping is on, then the input normals affect the distribution of the splat. This boolean is used in combination with the Eccentricity ivar.
Accepts input of following types:
* vtkDataSet
The dataset must contain a field array ()


| 1
|-
|'''Input Array''' (SelectInputArray)
|
|
Only the values 0 and 1 are accepted.


This property indicates the name of the array to be extracted.


|-
| '''Empty Cell Value'''<br>''(NullValue)''
|
|
Set the Null value for output points not receiving a contribution from the input points. (This is the initial value of the voxel samples.)


| 0
|
|
The value must be field array name.
|-
|-
| '''Gaussian Splat Radius'''<br>''(Radius)''
|'''Component''' (Component)
|
|
Set / get the radius of propagation of the splat. This value is expressed as a percentage of the length of the longest side of the sampling volume. Smaller numbers greatly reduce execution time.


| 0.1
This property indicates the component of the array to be extracted.
|
|-
| '''Resampling Grid'''<br>''(SampleDimensions)''
|
Set / get the dimensions of the sampling structured point set. Higher values produce better results but are much slower.


| 50 50 50
|
|-
| '''Scale Splats'''<br>''(ScalarWarping)''
|
|
Turn on/off the scaling of splats by scalar value.
0
 
| 1
|
|
Only the values 0 and 1 are accepted.


|-
|-
| '''Scale Factor'''<br>''(ScaleFactor)''
|'''Output Array Name''' (OutputArrayName)
|
|
Multiply Gaussian splat distribution by this value. If ScalarWarping is on, then the Scalar value will be multiplied by the ScaleFactor times the Gaussian function.


| 1
This property indicates the name of the output scalar array.
|
|-
| '''Resample Field'''<br>''(SelectInputScalars)''
|
Choose a scalar array to splat into the output cells. If ignore arrays is chosen, point density will be counted instead.


|
|
Result
|
|
An array of scalars is required.




Valid array names will be chosen from point and cell data.
|}
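Extracting a component, as the Extract Component filter does, is just selecting one entry from each tuple of a multi-component array. A minimal sketch (plain Python; `extract_component` is an illustrative name):

```python
def extract_component(array, component):
    """Pull one component out of a multi-component (tuple-per-point) array."""
    return [tup[component] for tup in array]

velocity = [(1.0, 2.0, 3.0), (4.0, 5.0, 6.0)]
print(extract_component(velocity, 1))  # the Y components: [2.0, 5.0]
```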


==Extract Edges==


|}
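The Gaussian splat shape and the accumulation modes described above can be sketched as follows (plain Python under stated assumptions: defaults match the documented Radius, ExponentFactor, and ScaleFactor; the function names are illustrative, not VTK API):

```python
import math

def splat_value(r, radius=0.1, exponent_factor=-5.0, scale_factor=1.0):
    """Gaussian splat contribution at distance r from the injected point."""
    return scale_factor * math.exp(exponent_factor * (r / radius) ** 2)

def accumulate(existing, contribution, mode="Max"):
    """Combine overlapping splat contributions per the accumulation mode."""
    return {"Min": min(existing, contribution),
            "Max": max(existing, contribution),
            "Sum": existing + contribution}[mode]

print(splat_value(0.0))               # peak value at the splat center: 1.0
print(accumulate(0.4, 0.7, "Max"))    # Max keeps the larger contribution
```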
Extract edges of 2D and 3D cells as lines. The Extract Edges
 
filter produces a wireframe version of the input dataset
 
by extracting all the edges of the dataset's cells as
==Generate Ids==
lines. This filter operates on any type of data set and
 
produces polygonal output.
 
Generate scalars from point and cell ids.
 
This filter generates scalars using cell and point ids. That is, the point attribute data scalars are generated from the point ids, and the cell attribute data scalars or field data are generated from the cell ids.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Array Name'''<br>''(ArrayName)''
|'''Input''' (Input)
|
The name of the array that will contain ids.
 
| Ids
|
|
|-
This property specifies the input to the Extract Edges
| '''Input'''<br>''(Input)''
filter.
|
|
This property specifies the input to the Generate Ids filter.


|
|
|
Accepts input of following types:
The selected object must be the result of the following: sources (includes readers), filters.
* vtkDataSet
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
 


|}
|}
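The Generate Ids behavior described above is simply an id-valued scalar array per point and per cell. A minimal sketch (plain Python; `generate_ids` and the dict-based attribute layout are illustrative, not ParaView's data model):

```python
def generate_ids(points, cells, array_name="Ids"):
    """Attach id scalars: one array over points, one over cells."""
    point_data = {array_name: list(range(len(points)))}
    cell_data = {array_name: list(range(len(cells)))}
    return point_data, cell_data

pts = [(0, 0), (1, 0), (0, 1)]
cls = [(0, 1, 2)]  # one triangle
print(generate_ids(pts, cls))  # ({'Ids': [0, 1, 2]}, {'Ids': [0]})
```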


==Extract Generic Dataset Surface==


==Generate Quadrature Points==
Extract geometry from a higher-order dataset
 
Extract geometry from a higher-order
 
dataset.
Create a point set with data at quadrature points.
 
"Create a point set with data at quadrature points."<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
| This property specifies the input of the filter.
|
|
Set the input to the Generic Geometry
Filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The dataset must contain a cell array.
The selected dataset must be one of the following types (or a subclass of one of them): vtkUnstructuredGrid.


|
Accepts input of following types:
* vtkGenericDataSet
|-
|-
| '''Select Source Array'''<br>''(SelectSourceArray)''
|'''PassThroughCellIds''' (PassThroughCellIds)
|
|
Specifies the offset array from which we generate quadrature points.
Select whether to forward original ids.
 
|
|
1
|
|
An array of scalars is required.
Accepts boolean values (0 or 1).
 


|}
|}


==Extract Level==


==Generate Quadrature Scheme Dictionary==
This filter extracts a range of levels from a hierarchical dataset.
 
Generate quadrature scheme dictionaries in data sets that do not have them.
 
Generate quadrature scheme dictionaries in data sets that do not have them.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
| This property specifies the input of the filter.
|
|
This property specifies the input to the Extract Group
filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.


|
Accepts input of following types:
* vtkUniformGridAMR
|-
|'''Levels''' (Levels)
|
This property lists the levels to extract from the input
hierarchical dataset.
|


The selected dataset must be one of the following types (or a subclass of one of them): vtkUnstructuredGrid.
|




|}
|}


==Extract Location==


==Generate Surface Normals==
Sample or extract cells at a point.
 
This filter allows you to specify a location and then either interpolate
 
the data attributes from the input dataset at that location or extract the
This filter will produce surface normals used for smooth shading. Splitting is used to avoid smoothing across feature edges.
cell(s) at the location.


This filter generates surface normals at the points of the input polygonal dataset to provide smooth shading of the dataset. The resulting dataset is also polygonal. The filter works by calculating a normal vector for each polygon in the dataset and then averaging the normals at the shared points.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Compute Cell Normals'''<br>''(ComputeCellNormals)''
|'''Input''' (Input)
|
Set the input dataset producer
|
|
This filter computes the normals at the points in the data set. In the process of doing this it computes polygon normals too. If you want these normals to be passed to the output of this filter, set the value of this property to 1.


| 0
|
|
Only the values 0 and 1 are accepted.
Accepts input of following types:
 
* vtkDataSet
* vtkCompositeDataSet
The dataset must contain a field array ()


|-
|-
| '''Consistency'''<br>''(Consistency)''
|'''Mode''' (Mode)
|
|
The value of this property controls whether consistent polygon ordering is enforced. Generally the normals for a data set should either all point inward or all point outward. If the value of this property is 1, then this filter will reorder the points of cells whose normal vectors are oriented in the opposite direction from the rest of those in the data set.
| 1
|
Only the values 0 and 1 are accepted.


Select whether to interpolate (probe) data attributes at the specified
location, or to extract cell(s) containing the specified location.


|-
| '''Feature Angle'''<br>''(FeatureAngle)''
|
|
The value of this property defines a feature edge. If the angle between the surface normals of two adjacent triangles is at least as large as this Feature Angle, a feature edge exists. If Splitting is on, points are duplicated along these feature edges. (See the Splitting property.)
1
 
| 30
|
|
The value must be greater than or equal to 0 and less than or equal to 180.
The value(s) is an enumeration of the following:
 
* Interpolate At Location (0)
 
* Extract Cell At Location (1)
|-
|-
| '''Flip Normals'''<br>''(FlipNormals)''
|'''Location''' (Location)
|
If the value of this property is 1, this filter will reverse the normal direction (and reorder the points accordingly) for all polygons in the data set; this changes front-facing polygons to back-facing ones, and vice versa. You might want to do this if your viewing position will be inside the data set instead of outside of it.
 
| 0
|
Only the values 0 and 1 are accepted.
 
 
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Normals Generation filter.
 
|
|
Select the location of interest in 3D space.
|
|
The selected object must be the result of the following: sources (includes readers), filters.
0.0 0.0 0.0
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.
 
 
|-
| '''Non-Manifold Traversal'''<br>''(NonManifoldTraversal)''
|
|
Turn on/off traversal across non-manifold edges. Not traversing non-manifold edges will prevent problems where the consistency of polygonal ordering is corrupted due to topological loops.


| 1
The value must lie within the bounding box of the dataset.
|
Only the values 0 and 1 are accepted.


 
It will default to the midpoint in each dimension.
|-
| '''Piece Invariant'''<br>''(PieceInvariant)''
|
Turn this option on to produce the same results regardless of the number of processors used (i.e., avoid seams along processor boundaries). Turn this off if you do want to process ghost levels and do not mind seams.
 
| 1
|
Only the values 0 and 1 are accepted.
 
 
|-
| '''Splitting'''<br>''(Splitting)''
|
This property controls the splitting of sharp edges. If sharp edges are split (property value = 1), then points are duplicated along these edges, and separate normals are computed for both sets of points to give crisp (rendered) surface definition.
 
| 1
|
Only the values 0 and 1 are accepted.




|}
|}
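The "compute per-polygon normals, then average them at shared points" scheme described for Generate Surface Normals can be sketched directly (plain Python with triangles only; the function names are illustrative, not VTK API, and splitting along feature edges is omitted):

```python
import math

def triangle_normal(p0, p1, p2):
    """Unit normal of a triangle from the cross product of two edge vectors."""
    u = [p1[i] - p0[i] for i in range(3)]
    v = [p2[i] - p0[i] for i in range(3)]
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    length = math.sqrt(sum(c * c for c in n))
    return [c / length for c in n]

def point_normals(points, triangles):
    """Average the normals of all triangles sharing each point, then renormalize."""
    sums = [[0.0, 0.0, 0.0] for _ in points]
    for tri in triangles:
        n = triangle_normal(*(points[i] for i in tri))
        for i in tri:
            for k in range(3):
                sums[i][k] += n[k]
    result = []
    for s in sums:
        length = math.sqrt(sum(c * c for c in s)) or 1.0  # unused points stay zero
        result.append([c / length for c in s])
    return result

# One triangle in the z=0 plane: every point normal is +Z.
print(point_normals([(0, 0, 0), (1, 0, 0), (0, 1, 0)], [(0, 1, 2)]))
```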


==Extract Region Surface==


==Glyph==
Extract a 2D boundary surface using neighbor relations to eliminate internal faces. The Extract
 
Surface filter extracts the polygons forming the outer
 
surface of the input dataset. This filter operates on any
This filter generates an arrow, cone, cube, cylinder, line, sphere, or 2D glyph at each point of the input data set.  The glyphs can be oriented and scaled by point attributes of the input dataset.
type of data and produces polygonal data as
 
output.
The Glyph filter generates a glyph (i.e., an arrow, cone, cube, cylinder, line, sphere, or 2D glyph) at each point in the input dataset. The glyphs can be oriented and scaled by the input point-centered scalars and vectors. The Glyph filter operates on any type of data set. Its output is polygonal. This filter is available on the Toolbar.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Glyph Transform'''<br>''(GlyphTransform)''
|'''Input''' (Input)
|
This property specifies the input to the Extract Surface
filter.
|
|
The values in this property allow you to specify the transform
(translation, rotation, and scaling) to apply to the glyph source.


|
|
Accepts input of following types:
* vtkDataSet
|-
|'''PieceInvariant''' (PieceInvariant)
|
If the value of this property is set to 1, internal
surfaces along process boundaries will be removed. NOTE: Enabling this
option might cause multiple executions of the data source because more
information is needed to remove internal surfaces.
|
1
|
|
The value must be set to one of the following: Transform2.
Accepts boolean values (0 or 1).
 
 
|-
|-
| '''Input'''<br>''(Input)''
|'''NonlinearSubdivisionLevel''' (NonlinearSubdivisionLevel)
|
|
This property specifies the input to the Glyph filter. This is the dataset to which the glyphs will be applied.
If the input is an unstructured grid with nonlinear
 
faces, this parameter determines how many times the face is subdivided
into linear faces. If 0, the output is the equivalent of its linear
couterpart (and the midpoints determining the nonlinear interpolation
are discarded). If 1, the nonlinear face is triangulated based on the
midpoints. If greater than 1, the triangulated pieces are recursively
subdivided to reach the desired subdivision. Setting the value to
greater than 1 may cause some point data to not be passed even if no
quadratic faces exist. This option has no effect if the input is not an
unstructured grid.
|
|
1
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.


|-
|-
| '''Maximum Number of Points'''<br>''(MaximumNumberOfPoints)''
|'''RegionArrayName''' (RegionArrayName)
|
|
The value of this property specifies the maximum number of glyphs that should appear in the output dataset if the value of the UseMaskPoints property is 1. (See the UseMaskPoints property.)
This property specifies the name of the material
 
array for generating parts.
| 5000
|
material
|
|
The value must be greater than or equal to 0.


|-
|-
| '''Random Mode'''<br>''(RandomMode)''
|'''SingleSided''' (SingleSided)
|
If the value of this property is set to 1 (the default),
surfaces along the boundary are 1 layer thick. Otherwise there is
a surface for the material on each side.
|
|
If the value of this property is 1, then the points to glyph are chosen randomly. Otherwise the point ids chosen are evenly spaced.
1
 
| 1
|
|
Only the values 0 and 1 are accepted.
Accepts boolean values (0 or 1).
 
 
|-
|-
| '''Scalars'''<br>''(SelectInputScalars)''
|'''MaterialPropertiesName''' (MaterialPropertiesName)
|
|
This property indicates the name of the scalar array on which to operate. The indicated array may be used for scaling the glyphs. (See the SetScaleMode property.)
This is the name of the input material property field data array.
 
|
|
material_properties
|
|
An array of scalars is required.


|-
|-
| '''Vectors'''<br>''(SelectInputVectors)''
|'''MaterialIDsName''' (MaterialIDsName)
|
|
This property indicates the name of the vector array on which to operate. The indicated array may be used for scaling and/or orienting the glyphs. (See the SetScaleMode and SetOrient properties.)
This is the name of the input and output material id field data array.
 
|
| 1
material_ids
|
|
An array of vectors is required.


|-
|-
| '''Orient'''<br>''(SetOrient)''
|'''MaterialPIDsName''' (MaterialPIDsName)
|
This is the name of the output material ancestry id field data array.
|
|
If this property is set to 1, the glyphs will be oriented based on the selected vector array.
material_ancestors
 
| 1
|
|
Only the values 0 and 1 are accepted.


|-
|-
| '''Set Scale Factor'''<br>''(SetScaleFactor)''
|'''InterfaceIDsName''' (InterfaceIDsName)
|
This is the name of the input and output interface id field data array.
|
|
The value of this property will be used as a multiplier for scaling the glyphs before adding them to the output.
interface_ids
 
| 1
|
|
The value must be less than the largest dimension of the dataset multiplied by a scale factor of 0.1.




The value must lie within the range of the selected data array.
|}


==Extract Selection==


The value must lie within the range of the selected data array.
Extract different types of selections. This
filter extracts a set of cells/points given a selection.
The selection can be obtained from a rubber-band selection
(either cell, visible or in a frustum) or threshold
selection and passed to the filter or specified by
providing an ID list.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|-
| '''Scale Mode'''<br>''(SetScaleMode)''
|'''Input''' (Input)
|
This property specifies the input from which the
selection is extracted.
|
|
The value of this property specifies how/if the glyphs should be scaled based on the point-centered scalars/vectors in the input dataset.


| 1
|
|
The value must be one of the following: scalar (0), vector (1), vector_components (2), off (3).
Accepts input of following types:
 
* vtkDataSet
* vtkTable
|-
|'''Selection''' (Selection)
|
The input that provides the selection
object.
|


|
Accepts input of following types:
* vtkSelection
|-
|-
| '''Glyph Type'''<br>''(Source)''
|'''PreserveTopology''' (PreserveTopology)
|
|
This property determines which type of glyph will be placed at the points in the input dataset.
If this property is set to 1 the output preserves the
 
topology of its input and adds an insidedness array to mark which cells
are inside or out. If 0 then the output is an unstructured grid which
contains only the subset of cells that are inside.
|
|
0
|
|
The selected object must be the result of the following: sources (includes readers), glyph_sources.
Accepts boolean values (0 or 1).
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.
 
 
The value must be set to one of the following: ArrowSource, ConeSource, CubeSource, CylinderSource, LineSource, SphereSource, GlyphSource2D.
 
 
|-
|-
| '''Mask Points'''<br>''(UseMaskPoints)''
|'''ShowBounds''' (ShowBounds)
|
For frustum selection, if this property is set to 1 the
output is the outline of the frustum instead of the contents of the
input that lie within the frustum.
|
|
If the value of this property is set to 1, limit the maximum number of glyphs to the value indicated by MaximumNumberOfPoints. (See the MaximumNumberOfPoints property.)
0
 
| 1
|
|
Only the values 0 and 1 are accepted.
Accepts boolean values (0 or 1).
 


|}
|}
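The Glyph masking options above (MaximumNumberOfPoints, RandomMode) boil down to choosing which input point ids receive a glyph. A minimal sketch (plain Python; `select_glyph_points` is an illustrative name, and the evenly-spaced stride rule is an assumption about the intent, not ParaView's exact implementation):

```python
import random

def select_glyph_points(num_points, max_points, random_mode=False, seed=0):
    """Choose which input point ids receive a glyph when masking is on."""
    if num_points <= max_points:
        return list(range(num_points))
    if random_mode:
        # Seeded for reproducibility in this sketch.
        return sorted(random.Random(seed).sample(range(num_points), max_points))
    stride = num_points / max_points
    return [int(i * stride) for i in range(max_points)]  # evenly spaced ids

print(select_glyph_points(10, 4))  # [0, 2, 5, 7]
```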


 
==Extract Selection (internal)==
==Glyph With Custom Source==




This filter generates a glyph at each point of the input data set. The glyphs can be oriented and scaled by point attributes of the input dataset.
This filter extracts a given set of cells or points given
a selection. The selection can be obtained from a rubber-band selection
(either point, cell, visible or in a frustum) and passed to the filter or
specified by providing an ID list. This is an internal filter, use
"ExtractSelection" instead.


The Glyph filter generates a glyph at each point in the input dataset. The glyphs can be oriented and scaled by the input point-centered scalars and vectors. The Glyph filter operates on any type of data set. Its output is polygonal. This filter is available on the Toolbar.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
 
The input from which the selection is
extracted.
 
|
|
This property specifies the input to the Glyph filter. This is the dataset to which the glyphs will be applied.


|
|
Accepts input of following types:
* vtkDataSet
|-
|'''Selection''' (Selection)
|
|
The selected object must be the result of the following: sources (includes readers), filters.


The input that provides the selection
object.


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
|
 
|
Accepts input of following types:
* vtkSelection


|}
==Extract Subset==
Extract a subgrid from a structured grid with the option of setting subsample strides. The Extract
Grid filter returns a subgrid of a structured input data
set (uniform rectilinear, curvilinear, or nonuniform
rectilinear). The output data set type of this filter is
the same as the input type.
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|-
| '''Maximum Number of Points'''<br>''(MaximumNumberOfPoints)''
|'''Input''' (Input)
|
This property specifies the input to the Extract Grid
filter.
|
|
The value of this property specifies the maximum number of glyphs that should appear in the output dataset if the value of the UseMaskPoints property is 1. (See the UseMaskPoints property.)


| 5000
|
|
The value must be greater than or equal to 0.
Accepts input of following types:
 
* vtkImageData
 
* vtkRectilinearGrid
* vtkStructuredPoints
* vtkStructuredGrid
|-
|-
| '''Random Mode'''<br>''(RandomMode)''
|'''VOI''' (VOI)
|
|
If the value of this property is 1, then the points to glyph are chosen randomly. Otherwise the point ids chosen are evenly spaced.
This property specifies the minimum and maximum point
 
indices along each of the I, J, and K axes; these values indicate the
| 1
volume of interest (VOI). The output will have the (I,J,K) extent
specified here.
|
0 0 0 0 0 0
|
|
Only the values 0 and 1 are accepted.
The value(s) must lie within the structured-extents of the input dataset.
 
 
|-
|-
| '''Scalars'''<br>''(SelectInputScalars)''
|'''SampleRateI''' (SampleRateI)
|
|
This property indicates the name of the scalar array on which to operate. The indicated array may be used for scaling the glyphs. (See the SetScaleMode property.)
This property indicates the sampling rate in the I
 
dimension. A value greater than 1 results in subsampling; every nth
index will be included in the output.
|
|
1
|
|
An array of scalars is required.


|-
|-
| '''Vectors'''<br>''(SelectInputVectors)''
|'''SampleRateJ''' (SampleRateJ)
|
|
This property indicates the name of the vector array on which to operate. The indicated array may be used for scaling and/or orienting the glyphs. (See the SetScaleMode and SetOrient properties.)
This property indicates the sampling rate in the J
 
dimension. A value greater than 1 results in subsampling; every nth
| 1
index will be included in the output.
|
1
|
|
An array of vectors is required.


|-
|-
| '''Orient'''<br>''(SetOrient)''
|'''SampleRateK''' (SampleRateK)
|
This property indicates the sampling rate in the K
dimension. A value greater than 1 results in subsampling; every nth
index will be included in the output.
|
|
If this property is set to 1, the glyphs will be oriented based on the selected vector array.
1
 
| 1
|
|
Only the values 0 and 1 are accepted.


|-
|-
| '''Set Scale Factor'''<br>''(SetScaleFactor)''
|'''IncludeBoundary''' (IncludeBoundary)
|
|
The value of this property will be used as a multiplier for scaling the glyphs before adding them to the output.
If the value of this property is 1, then if the sample
 
rate in any dimension is greater than 1, the boundary indices of the
| 1
input dataset will be passed to the output even if the boundary extent
is not an even multiple of the sample rate in a given
dimension.
|
0
|
|
The value must be less than the largest dimension of the dataset multiplied by a scale factor of 0.1.
Accepts boolean values (0 or 1).


|}
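The interaction of the sample-rate and IncludeBoundary properties can be sketched in one dimension (plain Python; the function name is illustrative, not part of the ParaView API):

```python
def subsampled_indices(lo, hi, rate, include_boundary=False):
    """Indices kept along one axis of a VOI for a given sample rate.

    Every `rate`-th index starting at `lo` survives; with
    include_boundary=True the final index `hi` is kept even when it
    does not fall on the sampling lattice.
    """
    kept = list(range(lo, hi + 1, rate))
    if include_boundary and kept[-1] != hi:
        kept.append(hi)
    return kept
```

For example, with extent 0..10 and a sample rate of 3, indices 0, 3, 6, 9 are kept, and IncludeBoundary additionally keeps index 10.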


The value must lie within the range of the selected data array.
==Extract Surface==


Extract a 2D boundary surface using neighbor relations to eliminate internal faces.The Extract
Surface filter extracts the polygons forming the outer
surface of the input dataset. This filter operates on any
type of data and produces polygonal data as
output.


The value must lie within the range of the selected data array.
{| class="PropertiesTable" border="1" cellpadding="5"
 
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|-
| '''Scale Mode'''<br>''(SetScaleMode)''
|'''Input''' (Input)
|
This property specifies the input to the Extract Surface
filter.
|
|
The value of this property specifies how/if the glyphs should be scaled based on the point-centered scalars/vectors in the input dataset.


| 1
|
|
The value must be one of the following: scalar (0), vector (1), vector_components (2), off (3).
Accepts input of following types:
 
* vtkDataSet
 
|-
|-
| '''Glyph Type'''<br>''(Source)''
|'''PieceInvariant''' (PieceInvariant)
|
|
This property determines which type of glyph will be placed at the points in the input dataset.
If the value of this property is set to 1, internal
 
surfaces along process boundaries will be removed. NOTE: Enabling this
option might cause multiple executions of the data source because more
information is needed to remove internal surfaces.
|
|
1
|
|
The selected object must be the result of the following: sources (includes readers), glyph_sources.
Accepts boolean values (0 or 1).
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.
 
 
|-
|-
| '''Mask Points'''<br>''(UseMaskPoints)''
|'''NonlinearSubdivisionLevel''' (NonlinearSubdivisionLevel)
|
|
If the value of this property is set to 1, limit the maximum number of glyphs to the value indicated by MaximumNumberOfPoints. (See the MaximumNumberOfPoints property.)
If the input is an unstructured grid with nonlinear
 
faces, this parameter determines how many times the face is subdivided
| 1
into linear faces. If 0, the output is the equivalent of its linear
counterpart (and the midpoints determining the nonlinear interpolation
are discarded). If 1, the nonlinear face is triangulated based on the
midpoints. If greater than 1, the triangulated pieces are recursively
subdivided to reach the desired subdivision. Setting the value to
greater than 1 may cause some point data to not be passed even if no
quadratic faces exist. This option has no effect if the input is not an
unstructured grid.
|
1
|
|
Only the values 0 and 1 are accepted.




|}
|}
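The neighbor-relation idea behind Extract Surface can be illustrated with a minimal sketch: a face shared by two cells is internal, while a face used by exactly one cell lies on the outer boundary (the data layout here is hypothetical, chosen only to show the counting technique):

```python
from collections import Counter

def external_faces(cells):
    """Return the faces used by exactly one cell.

    `cells` maps a cell id to its faces, each face given as a tuple of
    point ids.  Faces are canonicalized by sorting their point ids so
    that the same face found from two neighboring cells counts once.
    """
    counts = Counter(tuple(sorted(face))
                     for faces in cells.values()
                     for face in faces)
    return [face for face, n in counts.items() if n == 1]
```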


==FFT Of Selection Over Time==
Extracts selection over time and plots the FFT
Extracts the data of a selection (e.g. points or cells)
over time, takes the FFT of them, and plots
them.
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


==Gradient==


|}


This filter computes gradient vectors for an image/volume.
==Feature Edges==


The Gradient filter computes the gradient vector at each point in an image or volume. This filter uses central differences to compute the gradients. The Gradient filter operates on uniform rectilinear (image) data and produces image data output.<br>
This filter will extract edges along sharp edges of surfaces or boundaries of surfaces.
The Feature Edges filter extracts various subsets of edges
from the input data set. This filter operates on polygonal
data and produces polygonal output.


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 3,477: Line 4,237:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Dimensionality'''<br>''(Dimensionality)''
|'''Input''' (Input)
|
This property specifies the input to the Feature Edges
filter.
|
|
This property indicates whether to compute the gradient in two dimensions or in three. If the gradient is being computed in two dimensions, the X and Y dimensions are used.


| 3
|
|
The value must be one of the following: Two (2), Three (3).
Accepts input of following types:
 
* vtkPolyData
 
|-
|'''BoundaryEdges''' (BoundaryEdges)
|
If the value of this property is set to 1, boundary
edges will be extracted. Boundary edges are defined as lines cells or
edges that are used by only one polygon.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''FeatureEdges''' (FeatureEdges)
|
If the value of this property is set to 1, feature edges
will be extracted. Feature edges are defined as edges that are used by
two polygons whose dihedral angle is greater than the feature angle.
(See the FeatureAngle property.) Toggle whether to extract feature
edges.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''Non-Manifold Edges''' (NonManifoldEdges)
|
If the value of this property is set to 1, non-manifold
edges will be extracted. Non-manifold edges are defined as edges that
are used by three or more polygons.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''ManifoldEdges''' (ManifoldEdges)
|
If the value of this property is set to 1, manifold
edges will be extracted. Manifold edges are defined as edges that are
used by exactly two polygons.
|
0
|
Accepts boolean values (0 or 1).
|-
|-
| '''Input'''<br>''(Input)''
|'''Coloring''' (Coloring)
|
|
This property specifies the input to the Gradient filter.
If the value of this property is set to 1, then the
 
extracted edges are assigned a scalar value based on the type of the
edge.
|
|
0
|
|
The selected object must be the result of the following: sources (includes readers), filters.
Accepts boolean values (0 or 1).
 
 
The dataset must contain a point array with 1 components.
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkImageData.
 
 
|-
|-
| '''Select Input Scalars'''<br>''(SelectInputScalars)''
|'''FeatureAngle''' (FeatureAngle)
|
|
This property lists the name of the array from which to compute the gradient.
The value of this property is used to define a feature
 
edge. If the surface normal between two adjacent triangles is at least
as large as this Feature Angle, a feature edge exists. (See the
FeatureEdges property.)
|
|
30.0
|
|
An array of scalars is required.




|}
|}
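The central-differences scheme that the Gradient filter applies to image data can be sketched in one dimension (an illustrative plain-Python function; the one-sided boundary treatment shown here is one reasonable choice, not necessarily the filter's exact behavior):

```python
def gradient_1d(samples, spacing=1.0):
    """Central-difference gradient of a 1-D scalar array.

    Interior points use (f[i+1] - f[i-1]) / (2 * spacing); the two
    boundary points fall back to one-sided differences.
    """
    n = len(samples)
    grad = [0.0] * n
    for i in range(1, n - 1):
        grad[i] = (samples[i + 1] - samples[i - 1]) / (2.0 * spacing)
    grad[0] = (samples[1] - samples[0]) / spacing
    grad[-1] = (samples[-1] - samples[-2]) / spacing
    return grad
```

Applied to samples of f(x) = x², the interior entries recover the exact derivative 2x because central differences are exact for quadratics.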


==FlattenFilter==


==Gradient Of Unstructured DataSet==


Estimate the gradient for each point or cell in any type of dataset.
The Gradient (Unstructured) filter estimates the gradient vector at each point or cell. It operates on any type of vtkDataSet, and the output is the same type as the input. If the dataset is a vtkImageData, use the Gradient filter instead; it will be more efficient for this type of dataset.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 3,529: Line 4,324:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Compute QCriterion'''<br>''(ComputeQCriterion)''
|'''Input''' (Input)
|
Set the input to the Flatten Filter.
|
|
When this flag is on, the gradient filter will compute the
Q-criterion of a 3 component array.


| 0
|
|
Only the values 0 and 1 are accepted.
Accepts input of following types:
* vtkPointSet
* vtkGraph
* vtkCompositeDataSet


|}
==Gaussian Resampling==
Splat points into a volume with an elliptical, Gaussian distribution.vtkGaussianSplatter
is a filter that injects input points into a structured
points (volume) dataset. As each point is injected, it
"splats" or distributes values to nearby voxels. Data is
distributed using an elliptical, Gaussian distribution
function. The distribution function is modified using
scalar values (expands distribution) or normals (creates
ellipsoidal distribution rather than spherical). Warning:
results may be incorrect in parallel as points can't splat
into other processors' cells.
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|-
| '''Compute Vorticity'''<br>''(ComputeVorticity)''
|'''Input''' (Input)
|
This property specifies the input to the
filter.
|
|
When this flag is on, the gradient filter will compute the
vorticity/curl of a 3 component array.


| 0
|
|
Only the values 0 and 1 are accepted.
Accepts input of following types:
* vtkDataSet
The dataset must contain a field array (point)


with 1 component(s).


|-
|-
| '''Faster Approximation'''<br>''(FasterApproximation)''
|'''Resample Field''' (SelectInputScalars)
|
Choose a scalar array to splat into the output cells. If
ignore arrays is chosen, point density will be counted
instead.
|
|
When this flag is on, the gradient filter will provide a less
accurate (but close) algorithm that performs fewer derivative
calculations (and is therefore faster).  The error contains some
smoothing of the output data and some possible errors on the
boundary.  This parameter has no effect when performing the
gradient of cell data.


| 0
|
|
Only the values 0 and 1 are accepted.
An array of scalars is required. The value must be a field array name.
|-
|'''Resampling Grid''' (SampleDimensions)
|
Set / get the dimensions of the sampling structured
point set. Higher values produce better results but are much
slower.
|
50 50 50
|


|-
|'''Extent to Resample''' (ModelBounds)
|
Set / get the (xmin,xmax, ymin,ymax, zmin,zmax) bounding
box in which the sampling is performed. If any of the (min,max) bounds
values are min &gt;= max, then the bounds will be computed
automatically from the input data. Otherwise, the user-specified bounds
will be used.
|
0.0 0.0 0.0 0.0 0.0 0.0
|


|-
|-
| '''Input'''<br>''(Input)''
|'''Gaussian Splat Radius''' (Radius)
|
Set / get the radius of propagation of the splat. This
value is expressed as a percentage of the length of the longest side of
the sampling volume. Smaller numbers greatly reduce execution
time.
|
0.1
|
|
This property specifies the input to the Gradient (Unstructured) filter.


|-
|'''Gaussian Exponent Factor''' (ExponentFactor)
|
Set / get the sharpness of decay of the splats. This is
the exponent constant in the Gaussian equation. Normally this is a
negative value.
|
|
-5.0
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The dataset must contain a point or cell array.
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.


|-
|'''Scale Splats''' (ScalarWarping)
|
Turn on/off the scaling of splats by scalar
value.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''Scale Factor''' (ScaleFactor)
|
Multiply Gaussian splat distribution by this value. If
ScalarWarping is on, then the Scalar value will be multiplied by the
ScaleFactor times the Gaussian function.
|
1.0
|


|-
|-
| '''Result Array Name'''<br>''(ResultArrayName)''
|'''Elliptical Splats''' (NormalWarping)
|
Turn on/off the generation of elliptical splats. If
normal warping is on, then the input normals affect the distribution of
the splat. This boolean is used in combination with the Eccentricity
ivar.
|
|
This property provides a name for the output array containing the gradient vectors.
1
 
| Gradients
|
|
Accepts boolean values (0 or 1).
|-
|-
| '''Scalar Array'''<br>''(SelectInputScalars)''
|'''Elliptical Eccentricity''' (Eccentricity)
|
|
This property lists the name of the scalar array from which to compute the gradient.
Control the shape of elliptical splatting. Eccentricity
 
is the ratio of the major axis (aligned along the normal) to the minor
axes (aligned along the other two axes). An Eccentricity greater than 1
creates needles with the long axis in the direction of the normal; an
Eccentricity less than 1 creates pancakes perpendicular to the normal
vector.
|
|
2.5
|
|
An array of scalars is required.
Valid array names will be chosen from point and cell data.
|}
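The vorticity and Q-criterion options of the gradient filter both derive from the 3×3 velocity-gradient tensor. A minimal sketch of those two derived quantities (the function and tensor layout are illustrative, not the filter's internal API):

```python
def vorticity_and_q(J):
    """Vorticity vector and Q-criterion from a 3x3 velocity gradient.

    J[i][j] = d(velocity_i)/d(x_j).  Vorticity is the curl of the
    velocity field; Q = 0.5 * (|Omega|^2 - |S|^2), where S and Omega
    are the symmetric and antisymmetric parts of J.
    """
    wx = J[2][1] - J[1][2]
    wy = J[0][2] - J[2][0]
    wz = J[1][0] - J[0][1]
    s2 = o2 = 0.0
    for i in range(3):
        for j in range(3):
            s = 0.5 * (J[i][j] + J[j][i])  # strain-rate part
            o = 0.5 * (J[i][j] - J[j][i])  # rotation part
            s2 += s * s
            o2 += o * o
    return (wx, wy, wz), 0.5 * (o2 - s2)
```

For a rigid rotation u = (-y, x, 0) the vorticity is (0, 0, 2) and Q is positive, marking the rotation-dominated region.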


|-
|'''Fill Volume Boundary''' (Capping)
|
Turn on/off the capping of the outer boundary of the
volume to a specified cap value. This can be used to close surfaces
(after iso-surfacing) and create other effects.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''Fill Value''' (CapValue)
|
Specify the cap value to use. (This instance variable
only has effect if the ivar Capping is on.)
|
0.0
|
|-
|'''Splat Accumulation Mode''' (Accumulation Mode)
|
Specify the scalar accumulation mode. This mode
expresses how scalar values are combined when splats are overlapped.
The Max mode acts like a set union operation and is the most commonly
used; the Min mode acts like a set intersection, and the sum is just
weird.
|
1
|
The value(s) is an enumeration of the following:
* Min (0)
* Max (1)
* Sum (2)
|-
|'''Empty Cell Value''' (NullValue)
|
Set the Null value for output points not receiving a
contribution from the input points. (This is the initial value of the
voxel samples.)
|
0.0
|


==Grid Connectivity==


|}
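The per-voxel contribution described by the Gaussian Resampling properties above (splat radius, exponent factor) can be sketched as a simple weight function (names are illustrative; eccentricity scaling is omitted for brevity):

```python
import math

def splat_weight(dist2, radius2, exponent=-5.0):
    """Gaussian splat contribution of a point to a voxel.

    dist2 is the squared distance from the splat center to the voxel,
    radius2 the squared radius of propagation; beyond the radius the
    contribution is zero.  `exponent` is the (normally negative)
    sharpness-of-decay constant.
    """
    if dist2 > radius2:
        return 0.0
    return math.exp(exponent * dist2 / radius2)
```

The weight is 1 at the splat center and decays toward the radius of propagation, which is why smaller radii greatly reduce execution time: fewer voxels receive a nonzero contribution.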


Mass properties of connected fragments for unstructured grids.
==Generate Ids==


This filter works on multiblock unstructured grid inputs and also works in<br>
parallel.  It ignores any cells with a cell data Status value of 0.<br>
It performs connectivity to distinct fragments separately.  It then integrates<br>
attributes of the fragments.<br>
Generate scalars from point and cell ids.
This filter generates scalars using cell and point ids.
That is, the point attribute data scalars are generated
from the point ids, and the cell attribute data scalars or
field data are generated from the cell
ids.


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 3,621: Line 4,531:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
| This property specifies the input of the filter.
|
|
This property specifies the input to the Cell Data to
Point Data filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.


 
|
The selected dataset must be one of the following types (or a subclass of one of them): vtkUnstructuredGrid, vtkCompositeDataSet.
Accepts input of following types:
* vtkDataSet
|-
|'''ArrayName''' (ArrayName)
|
The name of the array that will contain
ids.
|
Ids
|




|}
|}


==Generate Quadrature Points==


==Group Datasets==
Create a point set with data at quadrature points.
 
Group data sets.
 
Groups multiple datasets to create a multiblock dataset<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 3,648: Line 4,566:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
This property specifies the input of the filter.
|
|
This property indicates the inputs to the Group Datasets filter.


|
|
Accepts input of following types:
* vtkUnstructuredGrid
The dataset must contain a field array (cell)
|-
|'''Quadrature Scheme Def''' (QuadratureSchemeDefinition)
|
Specifies the offset array from which we generate
quadrature points.
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataObject.


|
An array of scalars is required.


|}
|}


==Generate Quadrature Scheme Dictionary==


==Histogram==
Generate quadrature scheme dictionaries in data sets that do not have them.
 
Extract a histogram from field data.
 


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 3,676: Line 4,602:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Bin Count'''<br>''(BinCount)''
|'''Input''' (Input)
|
This property specifies the input of the
filter.
|
|
The value of this property specifies the number of bins for the histogram.


| 10
|
|
The value must be greater than or equal to 1 and less than or equal to 256.
Accepts input of following types:
* vtkUnstructuredGrid
 
|}
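The binning performed by the Histogram filter's BinCount property can be sketched with a small plain-Python function (illustrative only; the real filter also handles custom bin ranges and component selection):

```python
def histogram(values, bin_count, lo, hi):
    """Count values into `bin_count` equal-width bins over [lo, hi].

    Values outside the range are ignored; values on the upper edge
    land in the last bin, as is conventional for inclusive-range
    histograms.
    """
    width = (hi - lo) / bin_count
    counts = [0] * bin_count
    for v in values:
        if lo <= v <= hi:
            i = min(int((v - lo) / width), bin_count - 1)
            counts[i] += 1
    return counts
```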


==Generate Surface Normals==


This filter will produce surface normals used for smooth shading. Splitting is used to avoid smoothing across feature edges.This filter
generates surface normals at the points of the input
polygonal dataset to provide smooth shading of the
dataset. The resulting dataset is also polygonal. The
filter works by calculating a normal vector for each
polygon in the dataset and then averaging the normals at
the shared points.
{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Calculate Averages'''<br>''(CalculateAverages)''
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input to the Normals
Generation filter.
|
|
This option controls whether the algorithm calculates averages
of variables other than the primary variable that fall into each
bin.


| 1
|
|
Only the values 0 and 1 are accepted.
Accepts input of following types:
 
* vtkPolyData
|-
|'''FeatureAngle''' (FeatureAngle)
|
The value of this property defines a feature edge. If
the surface normal between two adjacent triangles is at least as large
as this Feature Angle, a feature edge exists. If Splitting is on,
points are duplicated along these feature edges. (See the Splitting
property.)
|
30
|


|-
|-
| '''Component'''<br>''(Component)''
|'''Splitting''' (Splitting)
|
This property controls the splitting of sharp edges. If
sharp edges are split (property value = 1), then points are duplicated
along these edges, and separate normals are computed for both sets of
points to give crisp (rendered) surface definition.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''Consistency''' (Consistency)
|
The value of this property controls whether consistent
polygon ordering is enforced. Generally the normals for a data set
should either all point inward or all point outward. If the value of
this property is 1, then this filter will reorder the points of cells
whose normal vectors are oriented in the opposite direction from the
rest of those in the data set.
|
|
The value of this property specifies the array component from which the histogram should be computed.
1
 
| 0
|
|
Accepts boolean values (0 or 1).
|-
|-
| '''Custom Bin Ranges'''<br>''(CustomBinRanges)''
|'''FlipNormals''' (FlipNormals)
|
|
Set custom bin ranges to use. These are used only when
If the value of this property is 1, this filter will
UseCustomBinRanges is set to true.
reverse the normal direction (and reorder the points accordingly) for
 
all polygons in the data set; this changes front-facing polygons to
| 0 100
back-facing ones, and vice versa. You might want to do this if your
viewing position will be inside the data set instead of outside of
it.
|
0
|
|
The value must lie within the range of the selected data array.
Accepts boolean values (0 or 1).
 
 
|-
|-
| '''Input'''<br>''(Input)''
|'''Non-Manifold Traversal''' (NonManifoldTraversal)
|
|
This property specifies the input to the Histogram filter.
Turn on/off traversal across non-manifold edges. Not
 
traversing non-manifold edges will prevent problems where the
consistency of polygonal ordering is corrupted due to topological
loops.
|
|
1
|
|
The selected object must be the result of the following: sources (includes readers), filters.
Accepts boolean values (0 or 1).
 
 
The dataset must contain a point or cell array.
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
 
 
|-
|-
| '''Select Input Array'''<br>''(SelectInputArray)''
|'''ComputeCellNormals''' (ComputeCellNormals)
|
|
This property indicates the name of the array from which to compute the histogram.
This filter computes the normals at the points in the
 
data set. In the process of doing this it computes polygon normals too.
If you want these normals to be passed to the output of this filter,
set the value of this property to 1.
|
|
0
|
|
An array of scalars is required.
Accepts boolean values (0 or 1).
 
 
Valid array names will be chosen from point and cell data.
 
 
|-
|-
| '''Use Custom Bin Ranges'''<br>''(UseCustomBinRanges)''
|'''PieceInvariant''' (PieceInvariant)
|
Turn this option on to produce the same results
regardless of the number of processors used (i.e., avoid seams along
processor boundaries). Turn this off if you do want to process ghost
levels and do not mind seams.
|
|
When set to true, CustomBinRanges will  be used instead of using the
1
full range for the selected array. By default, set to false.
 
| 0
|
|
Only the values 0 and 1 are accepted.
Accepts boolean values (0 or 1).
 


|}
|}
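The normal-averaging step described for the Normals Generation filter (compute a normal per polygon, then average at shared points) can be sketched as follows; feature-edge splitting is deliberately omitted from this sketch:

```python
import math

def point_normals(points, triangles):
    """Average unit face normals at shared points for smooth shading.

    Each triangle contributes its (unnormalized) face normal to its
    three corner points; the per-point sums are then renormalized.
    """
    normals = [[0.0, 0.0, 0.0] for _ in points]
    for a, b, c in triangles:
        ux, uy, uz = (points[b][k] - points[a][k] for k in range(3))
        vx, vy, vz = (points[c][k] - points[a][k] for k in range(3))
        n = (uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx)
        for pid in (a, b, c):
            for k in range(3):
                normals[pid][k] += n[k]
    out = []
    for n in normals:
        length = math.sqrt(sum(x * x for x in n)) or 1.0
        out.append(tuple(x / length for x in n))
    return out
```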


 
==GeometryFilter==
==Integrate Variables==




This filter integrates cell and point attributes.
The Integrate Attributes filter integrates point and cell data over lines and surfaces.  It also computes length of lines, area of surface, or volume.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 3,772: Line 4,738:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
Set the input to the Geometry Filter.
|
 
|
|
This property specifies the input to the Integrate Attributes filter.


|-
|'''UseStrips''' (UseStrips)
|
Toggle whether to generate faces containing triangle
strips. This should render faster and use less memory, but no cell data
is copied.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''ForceStrips''' (ForceStrips)
|
This makes UseStrips call Modified() after changing its
setting to ensure that the filter's output is immediately
changed.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''UseOutline''' (UseOutline)
|
Toggle whether to generate an outline or a
surface.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''NonlinearSubdivisionLevel''' (NonlinearSubdivisionLevel)
|
|
Nonlinear faces are approximated with flat polygons.
This parameter controls how many times to subdivide nonlinear surface
cells. Higher subdivisions generate closer approximations but take more
memory and rendering time. Subdivision is recursive, so the number of
output polygons can grow exponentially with this
parameter.
|
1
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.


|-
|'''PassThroughIds''' (PassThroughIds)
|
If on, the output polygonal dataset will have a celldata
array that holds the cell index of the original 3D cell that produced
each output cell. This is useful for cell picking.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''PassThroughPointIds''' (PassThroughPointIds)
|
If on, the output polygonal dataset will have a
pointdata array that holds the point index of the original 3D vertex
that produced each output vertex. This is useful for
picking.
|
1
|
Accepts boolean values (0 or 1).


|}
|}
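The length-weighted integration that Integrate Attributes performs over line cells can be sketched in plain Python (an illustrative simplification; the real filter also handles surface area and volume):

```python
import math

def integrate_over_lines(points, lines, values):
    """Integrate a point scalar over a set of line cells.

    Each segment contributes its length times the average of the
    scalar at its two endpoints; the total length is returned as well,
    matching the filter's habit of reporting length/area/volume.
    """
    total_len = integral = 0.0
    for a, b in lines:
        seg = math.dist(points[a], points[b])
        total_len += seg
        integral += seg * 0.5 * (values[a] + values[b])
    return integral, total_len
```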


==Ghost Cells Generator==


==Interpolate to Quadrature Points==
Generate ghost cells for unstructured grids.
The GhostCellGenerator filter is available when ParaView is
run in parallel (ie. with MPI). It operates on unstructured
grids only.
This filter does not redistribute the input data, it only
generates ghost cells at processor boundary by fetching
topological and geometrical information of those cells on
neighbor ranks. The filter can take benefit of global point
ids if they are available - if so it will perform faster,
otherwise point coordinates will be exchanged and processed.


Create scalar/vector data arrays interpolated to quadrature points.


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 3,801: Line 4,833:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
| This property specifies the input of the filter.
|
|
This property specifies the input to the ghost cells
generator.
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected dataset must be one of the following types (or a subclass of one of them): vtkUnstructuredGrid.


|
Accepts input of following types:
* vtkUnstructuredGrid
|-
|'''BuildIfRequired''' (BuildIfRequired)
|
Specify if the filter must generate the ghost cells only
if required by the pipeline downstream.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''UseGlobalIds''' (UseGlobalIds)
|
Specify if the filter must take benefit of global point
ids if they exist or if point coordinates should be used instead.


|
1
|
Accepts boolean values (0 or 1).
|-
|-
| '''Select Source Array'''<br>''(SelectSourceArray)''
|'''GlobalPointIdsArrayName''' (GlobalPointIdsArrayName)
|
|
Specifies the offset array from which we interpolate values to quadrature points.
This property provides the name for the input array
 
containing the global point ids if the GlobalIds array of the point
data if not set. Default is GlobalNodeIds.
|
|
GlobalNodeIds
|
|
An array of scalars is required.




|}
|}


==Glyph==


==Intersect Fragments==
This filter produces a glyph at each point of in input data set. The glyphs can be oriented and scaled by point attributes of the input dataset.
The Glyph filter generates a glyph (i.e., an arrow, cone, cube,
cylinder, line, sphere, or 2D glyph) at each point in the input
dataset. The glyphs can be oriented and scaled by the input
point-centered scalars and vectors. The Glyph filter operates on any
type of data set. Its output is polygonal


To use this filter, you first select the arrays to use for as the
**Scalars** and **Vectors**, if any. To orient the glyphs using the
selected **Vectors**, use **Orient** property. To scale the glyphs using
the selected **Scalars** or **Vectors**, use the **Scale Mode** property.


The Intersect Fragments filter performs geometric intersections on sets of fragments.
The **Glyph Mode** property controls which points in the input dataset
are selected for glyphing (since in most cases, glyphing all points in
the input dataset can be both a performance impediment and visually
cluttered).


The Intersect Fragments filter performs geometric intersections on sets of<br>
fragments. The filter takes two inputs, the first containing fragment<br>
geometry and the second containing fragment centers. The filter has two<br>
outputs. The first is geometry that results from the intersection. The<br>
second is a set of points that is an approximation of the center of where<br>
each fragment has been intersected.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 3,843: Line 4,902:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Slice Type'''<br>''(CutFunction)''
|'''Input''' (Input)
|
|
This property sets the type of intersecting geometry, and
associated parameters.
|
|
The value must be set to one of the following: Plane, Box, Sphere.


This property specifies the input to the Glyph filter. This is the
dataset from which the points are selected to be glyphed.


|-
| '''Input'''<br>''(Input)''
|
|
This input must contain fragment geometry.


|
|
|
Accepts input of following types:
The selected object must be the result of the following: sources (includes readers), filters.
* vtkDataSet
The dataset must contain a field array ()


with 1 component(s).


The selected dataset must be one of the following types (or a subclass of one of them): vtkMultiBlockDataSet.
The dataset must contain a field array ()


with 3 component(s).


|-
|-
| '''Source'''<br>''(Source)''
|'''Glyph Type''' (Source)
|
This property determines which type of glyph will be
placed at the points in the input dataset.
|
|
This input must contain fragment centers.


|
|
|
Accepts input of following types:
The selected object must be the result of the following: sources (includes readers), filters.
* vtkPolyData
The value can be one of the following:
* ArrowSource (sources)
 
* ConeSource (sources)


* CubeSource (sources)


The selected dataset must be one of the following types (or a subclass of one of them): vtkMultiBlockDataSet.
* CylinderSource (sources)


* LineSource (sources)


|}
* SphereSource (sources)


* GlyphSource2D (sources)


==Iso Volume==
|-
|'''Scalars''' (Scalars)
|


Select the input array to be treated as the active **Scalars**. You
can scale the glyphs using the selected scalars by setting the **Scale
Mode** property to **scalar**.


This filter extracts cells by clipping cells that have point scalars not in the specified range.
|
0
|
An array of scalars is required. The value must be a field array name.
|-
|'''Vectors''' (Vectors)
|


This filter clips away the cells using lower and upper thresholds.<br>
Select the input array to be treated as the active **Vectors**. You can
scale the glyphs using the selected vectors by setting the **Scale
Mode** property to **vector** or **vector_components**. You can orient the
glyphs using the selected vectors by checking the **Orient** property.


{| class="PropertiesTable" border="1" cellpadding="5"
|
1
|
An array of vectors is required. The value must be a field array name.
|-
|-
| '''Property'''
|'''Orient''' (Orient)
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
|
This property specifies the input to the Threshold filter.
 
If this property is set to 1, the glyphs will be oriented based on the
vectors selected using the **Vectors** property.


|
|
1
|
Accepts boolean values (0 or 1).
|-
|'''ScaleMode''' (ScaleMode)
|
|
The selected object must be the result of the following: sources (includes readers), filters.


Select how to scale the glyphs. Set to **off** to disable scaling
entirely. Set to **scalar** to scale the glyphs using the array selected
using the **Scalars** property. Set to **vector** to scale the glyphs
using the magnitude of the array selected using the **Vectors**
property. Set to **vector_components** to scale using the **Vectors**,
scaling each component individually.


The dataset must contain a point or cell array with 1 components.
|
3
|
The value(s) is an enumeration of the following:
* scalar (0)
* vector (1)
* vector_components (2)
* off (3)
|-
|'''ScaleFactor''' (ScaleFactor)
|
Specify the constant multiplier to use to scale the glyphs.


|
1.0
|
The value must lie within the range of the selected data array.
The value must be less than the largest dimension of the
dataset multiplied by a scale factor of
0.1.


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
|-
|'''GlyphMode''' (GlyphMode)
|


This property indicates the mode that will be used to generate
glyphs from the dataset.


|
2
|
The value(s) is an enumeration of the following:
* All Points (0)
* Every Nth Point (1)
* Uniform Spatial Distribution (2)
|-
| '''Input Scalars'''<br>''(SelectInputScalars)''
|'''MaximumNumberOfSamplePoints''' (MaximumNumberOfSamplePoints)
|
|
The value of this property contains the name of the scalar array from which to perform thresholding.
 
This property specifies the maximum number of sample points to use
when sampling the space when Uniform Spatial Distribution is used.


|
|
5000
|
|
An array of scalars is required.


|-
|'''Seed''' (Seed)
|


Valid array names will be chosen from point and cell data.
This property specifies the seed that will be used for generating a
uniform distribution of glyph points when a Uniform Spatial
Distribution is used.


|
10339
|


|-
| '''Threshold Range'''<br>''(ThresholdBetween)''
|'''Stride''' (Stride)
|
|
The values of this property specify the upper and lower bounds of the thresholding operation.


| 0 0
This property specifies the stride that will be used when glyphing by
Every Nth Point.
 
|
1
|
|
The value must lie within the range of the selected data array.


|-
|'''GlyphTransform''' (GlyphTransform)
|


|}
The values in this property allow you to specify the transform
(translation, rotation, and scaling) to apply to the glyph
source.
|


|
The value can be one of the following:
* Transform2 (extended_sources)


==K Means==


|}


Compute a statistical model of a dataset and/or assess the dataset with a statistical model.
==Glyph With Custom Source==
 
This filter either computes a statistical model of a dataset or takes such a model as its second input.  Then, the model (however it is obtained) may optionally be used to assess the input dataset.
<br>
This filter iteratively computes the center of k clusters in a space whose coordinates are specified by the arrays you select. The clusters are chosen as local minima of the sum of square Euclidean distances from each point to its nearest cluster center. The model is then a set of cluster centers. Data is assessed by assigning a cluster center and distance to the cluster to each point in the input data set.<br>
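The iteration described above can be sketched in plain Python (a toy stand-in for the filter's internals, not its implementation; deterministic first-k seeding is an assumption made for the example):

```python
def kmeans(points, k, max_iterations=50, tolerance=1e-4):
    # Seed the k cluster centers with the first k points (deterministic toy choice).
    centers = [list(p) for p in points[:k]]
    for _ in range(max_iterations):
        # Assignment: each point goes to its nearest center (squared Euclidean distance).
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[nearest].append(p)
        # Update: move each center to the mean of its cluster; track the largest move.
        moved = 0.0
        for i, cluster in enumerate(clusters):
            if not cluster:
                continue
            mean = [sum(axis) / len(cluster) for axis in zip(*cluster)]
            moved = max(moved, sum((a - b) ** 2 for a, b in zip(mean, centers[i])))
            centers[i] = mean
        if moved < tolerance:  # early termination, analogous to the Tolerance property
            break
    return centers

print(kmeans([(0, 0), (0, 1), (10, 10), (10, 11)], k=2))  # [[0.0, 0.5], [10.0, 10.5]]
```

Assessment then amounts to assigning each input point the index of (and distance to) its nearest resulting center.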


This filter generates a glyph at each point of the input data set. The glyphs can be oriented and scaled by point attributes of the input dataset.
The Glyph filter generates a glyph at each point in the input dataset.
The glyphs can be oriented and scaled by the input point-centered scalars
and vectors. The Glyph filter operates on any type of data set. Its
output is polygonal. This filter is available on the
Toolbar.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Attribute Mode'''<br>''(AttributeMode)''
|'''Input''' (Input)
|
|
Specify which type of field data the arrays will be drawn from.
This property specifies the input to the Glyph filter.
This is the dataset from which the points are selected to be glyphed.


| 0
|
|
Valid array names will be chosen from point and cell data.


|
Accepts input of following types:
* vtkDataSet
The dataset must contain a field array (point)
with 1 component(s).
The dataset must contain a field array (point)
with 3 component(s).


|-
| '''Input'''<br>''(Input)''
|'''Glyph Type''' (Source)
|
This property determines which type of glyph will be
placed at the points in the input dataset.
|
|
The input to the filter.  Arrays from this dataset will be used for computing statistics and/or assessed by a statistical model.


|
|
Accepts input of following types:
* vtkPolyData
|-
|'''Scalars''' (Scalars)
|
|
The selected object must be the result of the following: sources (includes readers), filters.
Select the input array to be treated as the active "Scalars".
You can scale the glyphs using the selected scalars by setting the
"Scale Mode" property to "scalar"


|


The dataset must contain a point or cell array.
|
 
An array of scalars is required.
 
|-
The selected dataset must be one of the following types (or a subclass of one of them): vtkImageData, vtkStructuredGrid, vtkPolyData, vtkUnstructuredGrid, vtkTable, vtkGraph.
|'''Vectors''' (Vectors)
 
|
Select the input array to be treated as the active "Vectors".
You can scale the glyphs using the selected vectors by setting the "Scale Mode"
property to "vector" or "vector_components". You can orient the glyphs using the
selected vectors by checking the "Orient" property.


|
1
|
An array of vectors is required.
|-
| '''k'''<br>''(K)''
|'''Orient''' (Orient)
|
|
Specify the number of clusters.
If this property is set to 1, the glyphs will be
oriented based on the vectors selected using the "Vectors" property.


| 5
|
|
The value must be greater than or equal to 1.
1
 
|
 
Accepts boolean values (0 or 1).
|-
| '''Max Iterations'''<br>''(MaxNumIterations)''
|'''ScaleMode''' (ScaleMode)
|
|
Specify the maximum number of iterations in which cluster centers are moved before the algorithm terminates.
Select how to scale the glyphs. Set to "off" to disable
scaling entirely. Set to "scalar" to scale the glyphs using the
array selected using the "Scalars" property. Set to "vector" to scale the
glyphs using the magnitude of the array selected using the "Vectors" property.
Set to "vector_components" to scale using the "Vectors", scaling each component
individually.


| 50
|
|
The value must be greater than or equal to 1.
3
 
|
 
The value(s) is an enumeration of the following:
* scalar (0)
* vector (1)
* vector_components (2)
* off (3)
|-
| '''Model Input'''<br>''(ModelInput)''
|'''ScaleFactor''' (ScaleFactor)
|
|
A previously-calculated model with which to assess a separate dataset. This input is optional.
Specify the constant multiplier to use to scale the glyphs.


|
|
1.0
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The value must lie within the range of the selected data array.
The value must be less than the largest dimension of the
dataset multiplied by a scale factor of
0.1.


|-
|'''GlyphMode''' (GlyphMode)
|


The selected dataset must be one of the following types (or a subclass of one of them): vtkTable, vtkMultiBlockDataSet.
This property indicates the mode that will be used to generate
 
glyphs from the dataset.


|
2
|
The value(s) is an enumeration of the following:
* All Points (0)
* Every Nth Point (1)
* Uniform Spatial Distribution (2)
|-
| '''Variables of Interest'''<br>''(SelectArrays)''
|'''MaximumNumberOfSamplePoints''' (MaximumNumberOfSamplePoints)
|
|
Choose arrays whose entries will be used to form observations for statistical analysis.
 
This property specifies the maximum number of sample points to use
when sampling the space when Uniform Spatial Distribution is used.


|
|
5000
|
|
An array of scalars is required.


|-
| '''Task'''<br>''(Task)''
|'''Seed''' (Seed)
|
|
Specify the task to be performed: modeling and/or assessment.
#  "Detailed model of input data," creates a set of output tables containing a calculated statistical model of the '''entire''' input dataset;
#  "Model a subset of the data," creates an output table (or tables) summarizing a '''randomly-chosen subset''' of the input dataset;
#  "Assess the data with a model," adds attributes to the first input dataset using a model provided on the second input port; and
#  "Model and assess the same data," is really just operations 2 and 3 above applied to the same input dataset.  The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.


When the task includes creating a model (i.e., tasks 2 and 4), you may adjust the fraction of the input dataset used for training.  You should avoid using a large fraction of the input data for training as you will then not be able to detect overfitting.  The ''Training fraction'' setting will be ignored for tasks 1 and 3.
This property specifies the seed that will be used for generating
a uniform distribution of glyph points when a Uniform Spatial
Distribution is used.


| 3
|
|
The value must be one of the following: Detailed model of input data (0), Model a subset of the data (1), Assess the data with a model (2), Model and assess the same data (3).
10339
 
|


|-
| '''Tolerance'''<br>''(Tolerance)''
|'''Stride''' (Stride)
|
|
Specify the relative tolerance that will cause early termination.


| 0.01
This property specifies the stride that will be used when glyphing
by Every Nth Point.
 
|
1
|
|
The value must be greater than or equal to 0 and less than or equal to 1.


|-
| '''Training Fraction'''<br>''(TrainingFraction)''
|'''GlyphTransform''' (GlyphTransform)
|
The values in this property allow you to specify the
transform (translation, rotation, and scaling) to apply to the glyph
source.
|
|
Specify the fraction of values from the input dataset to be used for model fitting. The exact set of values is chosen at random from the dataset.


| 0.1
|
|
The value must be greater than or equal to 0 and less than or equal to 1.
The value can be one of the following:
* Transform2 (extended_sources)




|}
|}


==Gradient==


==Level Scalars==
This filter computes gradient vectors for an image/volume. The Gradient filter
computes the gradient vector at each point in an image or
volume. This filter uses central differences to compute
the gradients. The Gradient filter operates on uniform
rectilinear (image) data and produces image data
output.

The Level Scalars filter uses colors to show levels of a hierarchical dataset.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
This property specifies the input to the Gradient
filter.
|
|
This property specifies the input to the Level Scalars filter.


|
|
|
Accepts input of following types:
The selected object must be the result of the following: sources (includes readers), filters.
* vtkImageData
The dataset must contain a field array (point)


with 1 component(s).


The selected dataset must be one of the following types (or a subclass of one of them): vtkHierarchicalBoxDataSet.
|-
|'''SelectInputScalars''' (SelectInputScalars)
|
This property lists the name of the array from which to
compute the gradient.
|


|
An array of scalars is required.
|-
|'''Dimensionality''' (Dimensionality)
|
This property indicates whether to compute the gradient
in two dimensions or in three. If the gradient is being computed in two
dimensions, the X and Y dimensions are used.
|
3
|
The value(s) is an enumeration of the following:
* Two (2)
* Three (3)


|}
|}


==Gradient Magnitude==


==Linear Extrusion==
Compute the magnitude of the gradient vectors for an image/volume. The Gradient
Magnitude filter computes the magnitude of the gradient
vector at each point in an image or volume. This filter
operates on uniform rectilinear (image) data and produces
image data output.

This filter creates a swept surface defined by translating the input along a vector.

The Linear Extrusion filter creates a swept surface by translating the input dataset along a specified vector. This filter is intended to operate on 2D polygonal data. This filter operates on polygonal data and produces polygonal data output.<br>
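The point-level effect of the sweep can be sketched as a translation (plain Python; the real filter also builds the side quads connecting the base to the top, which this toy omits):

```python
def linear_extrude_points(points, vector, scale_factor=1.0, capping=True):
    # Translate every input point along vector * scale_factor to form the "top".
    v = [c * scale_factor for c in vector]
    top = [tuple(p[i] + v[i] for i in range(3)) for p in points]
    # With capping on, both the base copy and the translated copy are kept.
    return list(points) + top if capping else top

square = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
print(len(linear_extrude_points(square, (0, 0, 1), scale_factor=2.0)))  # 8
```

This also shows why a scale factor of 0.5 moves the dataset half the vector's length: the translation is simply `vector * scale_factor`.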


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Capping'''<br>''(Capping)''
|'''Input''' (Input)
|
This property specifies the input to the Gradient
Magnitude filter.
|
|
The value of this property indicates whether to cap the ends of the swept surface. Capping works by placing a copy of the input dataset on either end of the swept surface, so it behaves properly if the input is a 2D surface composed of filled polygons. If the input dataset is a closed solid (e.g., a sphere), then if capping is on (i.e., this property is set to 1), two copies of the data set will be displayed on output (the second translated from the first one along the specified vector). If instead capping is off (i.e., this property is set to 0), then an input closed solid will produce no output.


| 1
|
|
Only the values 0 and 1 are accepted.
Accepts input of following types:
* vtkImageData
The dataset must contain a field array (point)


with 1 component(s).


|-
| '''Input'''<br>''(Input)''
|'''Dimensionality''' (Dimensionality)
|
|
This property specifies the input to the Linear Extrusion filter.
This property indicates whether to compute the gradient
 
magnitude in two or three dimensions. If computing the gradient
magnitude in 2D, the gradients in X and Y are used for computing the
gradient magnitude.
|
|
3
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The value(s) is an enumeration of the following:
* Two (2)
* Three (3)


|}


The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.
==Gradient Of Unstructured DataSet==


Estimate the gradient for each point or cell in any type of dataset.
The Gradient (Unstructured) filter estimates the gradient
vector at each point or cell. It operates on any type of
vtkDataSet, and the output is the same type as the input.
If the dataset is a vtkImageData, use the Gradient filter
instead; it will be more efficient for this type of
dataset.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Piece Invariant'''<br>''(PieceInvariant)''
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input to the Gradient
(Unstructured) filter.
|
|
The value of this property determines whether the output will be the same regardless of the number of processors used to compute the result. The difference is whether there are internal polygonal faces on the processor boundaries. A value of 1 will keep the results the same; a value of 0 will allow internal faces on processor boundaries.


| 0
|
|
Only the values 0 and 1 are accepted.
Accepts input of following types:
 
* vtkDataSet
The dataset must contain a field array ()


|-
| '''Scale Factor'''<br>''(ScaleFactor)''
|'''Scalar Array''' (SelectInputScalars)
|
This property lists the name of the scalar array from
which to compute the gradient.
|
|
The value of this property determines the distance along the vector the dataset will be translated. (A scale factor of 0.5 will move the dataset half the length of the vector, and a scale factor of 2 will move it twice the vector's length.)


| 1
|
|
An array of scalars is required. The value must be a field array name.
|-
| '''Vector'''<br>''(Vector)''
|'''ComputeGradient''' (ComputeGradient)
|
|
The value of this property indicates the X, Y, and Z components of the vector along which to sweep the input dataset.
When this flag is on, the gradient filter will compute
 
the gradient of the input array.
| 0 0 1
|
1
|
|
|}
Accepts boolean values (0 or 1).
 
 
==Loop Subdivision==
 
 
This filter iteratively divides each triangle into four triangles.  New points are placed so the output surface is smooth.
 
The Loop Subdivision filter increases the granularity of a polygonal mesh. It works by dividing each triangle in the input into four new triangles. It is named for Charles Loop, the person who devised this subdivision scheme. This filter only operates on triangles, so a data set that contains other types of polygons should be passed through the Triangulate filter before applying this filter to it. This filter only operates on polygonal data (specifically triangle meshes), and it produces polygonal output.<br>
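For a closed triangle mesh the growth per pass is easy to predict (a counting sketch; E = 3T/2 holds because every edge of a closed triangle mesh is shared by exactly two triangles):

```python
def subdivision_counts(triangles, points, iterations):
    # One-to-four subdivision: each pass adds one midpoint per edge
    # and replaces every triangle with four smaller ones.
    t, p = triangles, points
    for _ in range(iterations):
        edges = 3 * t // 2  # closed triangle mesh: E = 3T/2
        p += edges
        t *= 4
    return t, p

# An octahedron (8 triangles, 6 points) becomes 32 triangles, 18 points.
print(subdivision_counts(8, 6, 1))  # (32, 18)
```

The fourfold growth per pass is why the number of subdivisions is capped at 4 below.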
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|'''ResultArrayName''' (ResultArrayName)
|
|
This property specifies the input to the Loop Subdivision filter.
This property provides a name for the output array
 
containing the gradient vectors.
|
|
Gradients
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.


|-
| '''Number of Subdivisions'''<br>''(NumberOfSubdivisions)''
|'''FasterApproximation''' (FasterApproximation)
|
When this flag is on, the gradient filter will provide a
less accurate (but close) algorithm that performs fewer derivative
calculations (and is therefore faster). The error contains some
smoothing of the output data and some possible errors on the boundary.
This parameter has no effect when performing the gradient of cell
data or when the input grid is not a vtkUnstructuredGrid.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''ComputeDivergence''' (ComputeDivergence)
|
When this flag is on, the gradient filter will compute
the divergence of a 3 component array.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''DivergenceArrayName''' (DivergenceArrayName)
|
This property provides a name for the output array
containing the divergence vector.
|
Divergence
|
 
|-
|'''ComputeVorticity''' (ComputeVorticity)
|
When this flag is on, the gradient filter will compute
the vorticity/curl of a 3 component array.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''VorticityArrayName''' (VorticityArrayName)
|
This property provides a name for the output array
containing the vorticity vector.
|
Vorticity
|
|
Set the number of subdivision iterations to perform. Each subdivision divides single triangles into four new triangles.


| 1
|-
|'''ComputeQCriterion''' (ComputeQCriterion)
|
When this flag is on, the gradient filter will compute
the Q-criterion of a 3 component array.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''QCriterionArrayName''' (QCriterionArrayName)
|
This property provides a name for the output array
containing Q criterion.
|
Q-criterion
|
|
The value must be greater than or equal to 1 and less than or equal to 4.




|}
|}


==Grid Connectivity==


==Mask Points==
Mass properties of connected fragments for unstructured grids. This
filter works on multiblock unstructured grid inputs and
also works in parallel. It ignores any cells with a cell
data Status value of 0. It performs connectivity to
distinct fragments separately. It then integrates
attributes of the fragments.

Reduce the number of points. This filter is often used before glyphing. Generating vertices is an option.

The Mask Points filter reduces the number of points in the dataset. It operates on any type of dataset, but produces only points / vertices as output.<br>
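In its non-random mode the Mask Points selection reduces to strided slicing; a plain-Python sketch of how the Offset, On Ratio, and Maximum Number of Points properties interact:

```python
def mask_points(point_ids, on_ratio=2, offset=0, maximum=5000):
    # Keep every on_ratio-th id starting at offset, capped at maximum ids.
    return point_ids[offset::on_ratio][:maximum]

print(mask_points(list(range(10)), on_ratio=3))  # [0, 3, 6, 9]
```

Random Sampling replaces the regular stride with a randomized selection, but the cap on the number of retained points works the same way.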


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Generate Vertices'''<br>''(GenerateVertices)''
|'''Input''' (Input)
|
|
This property specifies whether to generate vertex cells as the topography of the output. If set to 1, the geometry (vertices) will be displayed in the rendering window; otherwise no geometry will be displayed.
This property specifies the input of the
 
filter.
| 0
|
|
Only the values 0 and 1 are accepted.


|-
| '''Input'''<br>''(Input)''
|
|
This property specifies the input to the Mask Points filter.
Accepts input of following types:
* vtkUnstructuredGrid
* vtkCompositeDataSet


|
|}
|
The selected object must be the result of the following: sources (includes readers), filters.


==Group Datasets==


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
Group data sets.
Groups multiple datasets to create a multiblock
dataset


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
| '''Maximum Number of Points'''<br>''(MaximumNumberOfPoints)''
|'''Input''' (Input)
|
|
The value of this property indicates the maximum number of points in the output dataset.
This property indicates the inputs to the Group
 
Datasets filter.
| 5000
|
|
The value must be greater than or equal to 0.


|-
| '''Offset'''<br>''(Offset)''
|
|
The value of this property indicates the starting point id in the ordered list of input points from which to start masking.
Accepts input of following types:
* vtkDataObject


| 0
|}
|
The value must be greater than or equal to 0.


==Histogram==
Extract a histogram from field data.
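The binning the filter performs can be sketched in plain Python (equal-width bins over the data range or a custom range; clamping the top edge into the last bin is a common convention assumed here):

```python
def histogram(values, bin_count=10, custom_range=None):
    lo, hi = custom_range if custom_range else (min(values), max(values))
    width = (hi - lo) / bin_count
    counts = [0] * bin_count
    for v in values:
        index = int((v - lo) / width) if width else 0
        counts[min(max(index, 0), bin_count - 1)] += 1  # clamp edges into end bins
    return counts

print(histogram(list(range(10)), bin_count=5))  # [2, 2, 2, 2, 2]
```

With UseCustomBinRanges enabled, `custom_range` replaces the data's own min/max, so values outside the range pile up in the first and last bins.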


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''On Ratio'''<br>''(OnRatio)''
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input to the Histogram
filter.
|
|
The value of this property specifies that every On Ratio-th point will be retained in the output when not using Random sampling (the skip or stride size for point ids). For example, if the on ratio is 3, the output will contain every 3rd point, up to the maximum number of points.


| 2
|
|
The value must be greater than or equal to 1.
Accepts input of following types:
* vtkDataSet
The dataset must contain a field array ()


|-
|'''SelectInputArray''' (SelectInputArray)
|
This property indicates the name of the array from which
to compute the histogram.
|


|
An array of scalars is required. The value must be a field array name.
|-
| '''Proportionally Distribute Maximum Number Of Points'''<br>''(ProportionalMaximumNumberOfPoints)''
|'''BinCount''' (BinCount)
|
The value of this property specifies the number of bins
for the histogram.
|
10
|
|
When this is off, the maximum number of points is taken per processor when running in parallel (total number of points = number of processors * maximum number of points).  When this is on, the maximum number of points is proportionally distributed across processors depending on the number of points per processor (total number of points = maximum number of points; maximum number of points per processor = number of points on a processor * maximum number of points / total number of points across all processors).


| 0
|-
|'''Component''' (Component)
|
The value of this property specifies the array component
from which the histogram should be computed.
|
0
|
|
Only the values 0 and 1 are accepted.


|-
| '''Random Sampling'''<br>''(RandomMode)''
|'''CalculateAverages''' (CalculateAverages)
|
|
If the value of this property is set to true, then the points in the output will be randomly selected from the input in various ways set by Random Mode; otherwise this filter will subsample point ids regularly.
This option controls whether the algorithm calculates
 
averages of variables other than the primary variable that fall into
| 0
each bin.
|
0
|
|
Only the values 0 and 1 are accepted.
Accepts boolean values (0 or 1).
 
 
|-
| '''Random Sampling Mode'''<br>''(RandomModeType)''
|'''UseCustomBinRanges''' (UseCustomBinRanges)
|
|
Randomized Id Strides picks points with random id increments starting at Offset (the output probably isn't a statistically random sample).  Random Sampling generates a statistically random sample of the input, ignoring Offset (fast - O(sample size)).  Spatially Stratified Random Sampling is a variant of random sampling that splits the points into equal sized spatial strata before randomly sampling (slow - O(N log N)).
When set to true, CustomBinRanges will be used instead
 
of using the full range for the selected array. By default, set to
| 0
false.
|
0
|
|
The value must be one of the following: Randomized Id Strides (0), Random Sampling (1), Spatially Stratified Random Sampling (2).
Accepts boolean values (0 or 1).
 
 
|-
| '''Single Vertex Per Cell'''<br>''(SingleVertexPerCell)''
|'''CustomBinRanges''' (CustomBinRanges)
|
|
Tell filter to only generate one vertex per cell instead of multiple vertices in one cell.
Set custom bin ranges to use. These are used only when
 
UseCustomBinRanges is set to true.
| 0
|
0.0 100.0
|
|
Only the values 0 and 1 are accepted.
The value must lie within the range of the selected data array.
 


|}
|}


==Image Data To AMR==


==Material Interface Filter==
Converts certain images to AMR.
 
 
The Material Interface filter finds volumes in the input data containing material above a certain material fraction.
 
The Material Interface filter finds voxels inside of which a material<br>
fraction (or normalized amount of material) is higher than a given<br>
threshold. As these voxels are identified, surfaces enclosing adjacent<br>
voxels above the threshold are generated. The resulting volume and its<br>
surface are what we call a fragment. The filter has the ability to<br>
compute various volumetric attributes such as fragment volume, mass,<br>
and center of mass, as well as volume- and mass-weighted averages for<br>
any of the fields present. Any field selected for such computation will<br>
also be copied into the fragment surface's point data for visualization.<br>
The filter also has the ability to generate Oriented Bounding Boxes<br>
(OBB) for each fragment.<br><br><br>
The data generated by the filter is organized in three outputs. The<br>
"geometry" output contains the fragment surfaces. The "statistics"<br>
output contains a point set of the centers of mass. The "obb<br>
representation" output contains OBB representations (poly data). All<br>
computed attributes are copied into the statistics and geometry outputs.<br>
The obb representation output is used for validation and debugging<br>
purposes and is turned off by default.<br><br><br>
To measure the size of craters, the filter can invert a volume fraction<br>
and clip the volume fraction with a sphere and/or a plane.<br>
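The voxel-selection rule, including the crater-measuring inversion, can be sketched as follows (plain Python; `select_voxels` is a hypothetical name for illustration, not part of the filter's API):

```python
def select_voxels(fractions, threshold=0.5, invert=False):
    # A voxel passes when its (possibly inverted) material fraction
    # exceeds the Material Fraction Threshold.
    def effective(f):
        return 1.0 - f if invert else f  # inversion analyzes the material's "negative"
    return [i for i, f in enumerate(fractions) if effective(f) > threshold]

print(select_voxels([0.1, 0.6, 0.9, 0.4]))  # [1, 2]
```

Fragments are then the connected components of the selected voxels; the filter's surface, mass, and center-of-mass computations run per component.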


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Clip Type'''<br>''(ClipFunction)''
|'''Input''' (Input)
|
 
This property specifies the input to the Cell Data to
Point Data filter.
 
|
|
This property sets the type of clip geometry, and
associated parameters.


|
|
Accepts input of following types:
* vtkImageData
|-
|'''Number of levels''' (NumberOfLevels)
|
|
The value must be set to one of the following: None, Plane, Sphere.


This property specifies the number of levels in the amr data structure.


|-
| '''Compute OBB'''<br>''(ComputeOBB)''
|
|
Compute Object Oriented Bounding boxes (OBB). When active the result of
2
this computation is copied into the statistics output. In the case
that the filter is built in its validation mode, the OBB's are
rendered.
 
| 0
|
|
Only the values 0 and 1 are accepted.


|-
| '''Input'''<br>''(Input)''
|'''Maximum Number of Blocks''' (MaximumNumberOfLevels)
|
|
Input to the filter can be a hierarchical box data set containing image
 
data or a multi-block of rectilinear grids.
This property specifies the maximum number of blocks in the output
amr data structure.


|
|
100
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The dataset must contain a cell array.
The selected dataset must be one of the following types (or a subclass of one of them): vtkHierarchicalBoxDataSet.


|-
| '''Invert Volume Fraction'''<br>''(InvertVolumeFraction)''
|'''Refinement Ratio''' (RefinementRatio)
|
|
Inverting the volume fraction generates the negative of the material.
It is useful for analyzing craters.


| 0
This property specifies the refinement ratio between levels.
|
Only the values 0 and 1 are accepted.


|-
| '''Material Fraction Threshold'''<br>''(MaterialFractionThreshold)''
|
|
Material fraction is defined as normalized amount of material per
2
voxel. Any voxel in the input data set with a material fraction greater
than this value is included in the output data set.
 
| 0.5
|
|
The value must be greater than or equal to 0.08 and less than or equal to 1.




|-
|}
| '''Output Base Name'''<br>''(OutputBaseName)''
|
This property specifies the base name, including the path, of where to write the
statistics and geometry output text files. It follows the pattern
"/path/to/folder/and/file", where the file has no extension, as the filter
will generate a unique extension.


|
==Image Data To Uniform Grid==
|
|-
| '''Select Mass Arrays'''<br>''(SelectMassArray)''
|
Mass arrays are paired with material fraction arrays. This means that
the first selected material fraction array is paired with the first
selected mass array, and so on sequentially. As the filter identifies
voxels meeting the minimum material fraction threshold, these voxel's
mass will be used in fragment center of mass and mass calculation.


A warning is generated if no mass array is selected for an individual
material fraction array. However, in that case the filter will run
without issue because the statistics output can be generated using
fragments' centers computed from axis-aligned bounding boxes.

Create a uniform grid from an image data by specified blanking arrays.
Create a vtkUniformGrid from a vtkImageData by passing in arrays to be used
for point and/or cell blanking. By default, values of 0 in the specified
array will result in a point or cell being blanked. Use Reverse to switch this.


|
|
An array of scalars is required.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
| '''Compute mass weighted average over:'''<br>''(SelectMassWtdAvgArray)''
|'''Input''' (Input)
|
|
For arrays selected a mass weighted average is computed. These arrays
are also copied into fragment geometry cell data as the fragment
surfaces are generated.


|
|
|
|
An array of scalars is required.
Accepts input of following types:
* vtkImageData
The dataset must contain a field array ()


with 1 component(s).


|-
| '''Select Material Fraction Arrays'''<br>''(SelectMaterialArray)''
|'''SelectInputScalars''' (SelectInputScalars)
|
Specify the array to use for blanking.
|
|
Material fraction is defined as normalized amount of material per
voxel. It is expected that arrays containing material fraction data have
been down-converted to an unsigned char.


|
|
|
An array of scalars is required. The value must be a field array name.
An array of scalars is required.
 
 
|-
| '''Compute volume weighted average over:'''<br>''(SelectVolumeWtdAvgArray)''
|'''Reverse''' (Reverse)
|
|
For arrays selected a volume weighted average is computed. The values
Reverse the array value to whether or not a point or cell is blanked.
of these arrays are also copied into fragment geometry cell data as
the fragment surfaces are generated.
 
|
|
0
|
|
An array of scalars is required.
Accepts boolean values (0 or 1).


|}


|-
==Image Data to Point Set==
| '''Write Geometry Output'''<br>''(WriteGeometryOutput)''
|
If this property is set, then the geometry output is written to a text
file. The file name will be constructed using the path in the "Output
Base Name" widget.


| 0
Converts an Image Data to a Point Set. The Image
|
Data to Point Set filter takes an image data (uniform
Only the values 0 and 1 are accepted.
rectilinear grid) object and outputs an equivalent structured
grid (which is a type of point set). This brings the data to a
broader category of data storage but only adds a small amount of
overhead. This filter can be helpful in applying filters that
expect or manipulate point coordinates.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|-
| '''Write Statistics Output'''<br>''(WriteStatisticsOutput)''
|'''Input''' (Input)
|
|
If this property is set, then the statistics output is written to a
text file. The file name will be constructed using the path in the
"Output Base Name" widget.


| 0
|
|
Only the values 0 and 1 are accepted.


|
Accepts input of following types:
* vtkImageData


|}
|}


==Image Shrink==


==Median==
Reduce the size of an image/volume by subsampling. The Image Shrink
 
filter reduces the size of an image/volume dataset by
 
subsampling it (i.e., extracting every nth pixel/voxel in
Compute the median scalar values in a specified neighborhood for image/volume datasets.
integer multiples). The subsampling rate can be set
 
separately for each dimension of the
The Median filter operates on uniform rectilinear (image or volume) data and produces uniform rectilinear output. It replaces the scalar value at each pixel / voxel with the median scalar value in the specified surrounding neighborhood. Since the median operation removes outliers, this filter is useful for removing high-intensity, low-probability noise (shot noise).<br>
image/volume.
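The subsampling arithmetic the Image Shrink filter performs can be sketched in plain Python (an illustration of extracting every nth sample per axis, not the VTK implementation; scalars are a flat list in x-fastest order):

```python
def shrink(scalars, dims, factors):
    """Subsample a flat scalar list on an nx-by-ny-by-nz grid by taking
    every fx-th/fy-th/fz-th sample along each axis (x varies fastest).
    Returns the subsampled values and the new grid dimensions."""
    nx, ny, nz = dims
    fx, fy, fz = factors
    out = [scalars[k * nx * ny + j * nx + i]
           for k in range(0, nz, fz)
           for j in range(0, ny, fy)
           for i in range(0, nx, fx)]
    new_dims = ((nx + fx - 1) // fx,
                (ny + fy - 1) // fy,
                (nz + fz - 1) // fz)
    return out, new_dims

# A 4x4x4 volume shrunk by a factor of 2 along every axis becomes 2x2x2.
values, new_dims = shrink(list(range(64)), (4, 4, 4), (2, 2, 2))
```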


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 4,503: Line 5,762:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
This property specifies the input to the Image Shrink
filter.
|
|
This property specifies the input to the Median filter.


|
|
Accepts input of following types:
* vtkImageData
|-
|'''ShrinkFactors''' (ShrinkFactors)
|
The value of this property indicates the amount by which
to shrink along each axis.
|
|
The selected object must be the result of the following: sources (includes readers), filters.
1 1 1
 
 
The dataset must contain a point array with 1 component.
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkImageData.
 
 
|-
| '''Kernel Size'''<br>''(KernelSize)''
|
|
The value of this property specifies the number of pixels/voxels in each dimension to use in computing the median to assign to each pixel/voxel. If the kernel size in a particular dimension is 1, then the median will not be computed in that direction.


| 1 1 1
|
|-
|-
| '''Select Input Scalars'''<br>''(SelectInputScalars)''
|'''Averaging''' (Averaging)
|
|
The value of this property lists the name of the scalar array to use in computing the median.
If the value of this property is 1, an average of
 
neighborhood scalar values will be used as the output scalar value for
each output point. If its value is 0, only subsampling will be
performed, and the original scalar values at the points will be
retained.
|
|
1
|
|
An array of scalars is required.
Accepts boolean values (0 or 1).
 


|}
|}
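The median operation the filter applies per pixel/voxel can be sketched in one dimension with plain Python (an illustration of the neighborhood rule, not the VTK implementation; the window is clipped at the boundaries):

```python
from statistics import median

def median_filter_1d(values, kernel_size):
    """Replace each sample with the median of its kernel_size-wide
    neighborhood. A kernel size of 1 leaves the data unchanged, as the
    Kernel Size property documents for each dimension."""
    half = kernel_size // 2
    n = len(values)
    return [median(values[max(0, i - half):min(n, i + half + 1)])
            for i in range(n)]

# The shot-noise spike (100) is removed while the ramp is preserved:
filtered = median_filter_1d([1, 2, 100, 4, 5], 3)
```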


==ImageResampling==


==Merge Blocks==
Sample data attributes using a 3D image as probing mesh.
 
 
vtkCompositeDataToUnstructuredGridFilter appends all vtkDataSet<br>
leaves of the input composite dataset to a single unstructured grid. The<br>
subtree to be combined can be chosen using the SubTreeCompositeIndex. If<br>
the SubTreeCompositeIndex is a leaf node, then no appending is required.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 4,553: Line 5,807:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
This property specifies the dataset whose data will
be probed
|
|
Set the input composite dataset.


|
|
Accepts input of following types:
* vtkDataSet
* vtkCompositeDataSet
|-
|'''SamplingDimension''' (SamplingDimension)
|
|
The selected dataset must be one of the following types (or a subclass of one of them): vtkCompositeDataSet.


The number of linear resampling points along each axis
|
10 10 10
|


|-
|-
| '''Merge Points'''<br>''(MergePoints)''
|'''UseInputBounds''' (UseInputBounds)
|
|
| 1
 
Do we use input bounds or custom ones?
 
|
1
|
Accepts boolean values (0 or 1).
|-
|'''CustomSamplingBounds''' (CustomSamplingBounds)
|
|
Only the values 0 and 1 are accepted.


Custom probing bounds if needed


|}
|
0 1 0 1 0 1
|




==Mesh Quality==
|}


==InSituParticlePath==


This filter creates a new cell array containing a geometric measure of each cell's fitness. Different quality measures can be chosen for different cell shapes.
Trace Particle Paths through time in a vector field.
 
The Particle Trace filter generates pathlines in a vector
This filter creates a new cell array containing a geometric measure of each cell's fitness. Different quality measures can be chosen for different cell shapes. Supported shapes include triangles, quadrilaterals, tetrahedra, and hexahedra. For other shapes, a value of 0 is assigned.<br>
field from a collection of seed points. The vector field
used is selected from the Vectors menu, so the input data
set is required to have point-centered vectors. The Seed
portion of the interface allows you to select whether the
seed points for this integration lie in a point cloud or
along a line. Depending on which is selected, the
appropriate 3D widget (point or line widget) is displayed
along with traditional user interface controls for
positioning the point cloud or line within the data set.
Instructions for using the 3D widgets and the
corresponding manual controls can be found in section 7.4.
This filter operates on any type of data set, provided it
has point-centered vectors. The output is polygonal data
containing polylines. This filter is available on the
Toolbar.


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 4,587: Line 5,878:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Hex Quality Measure'''<br>''(HexQualityMeasure)''
|'''Restart Source''' (RestartSource)
|
Specify the restart dataset. This is optional and
can be used to have particle histories that were computed
previously be included in this filter's computation.
|
|
This property indicates which quality measure will be used to evaluate hexahedral quality.


| 5
|
|
The value must be one of the following: Diagonal (21), Dimension (22), Distortion (15), Edge Ratio (0), Jacobian (25), Maximum Edge Ratio (16), Maximum Aspect Frobenius (5), Mean Aspect Frobenius (4), Oddy (23), Relative Size Squared (12), Scaled Jacobian (10), Shape (13), Shape and Size (14), Shear (11), Shear and Size (24), Skew (17), Stretch (20), Taper (18), Volume (19).
Accepts input of following types:
 
* vtkDataSet
 
|-
|-
| '''Input'''<br>''(Input)''
|'''ClearCache''' (ClearCache)
|
|
This property specifies the input to the Mesh Quality filter.
Clear the particle cache from previous time steps.


|
|
0
|
|
The selected object must be the result of the following: sources (includes readers), filters.
Accepts boolean values (0 or 1).
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
 
 
|-
|-
| '''Quad Quality Measure'''<br>''(QuadQualityMeasure)''
|'''FirstTimeStep''' (FirstTimeStep)
|
|
This property indicates which quality measure will be used to evaluate quadrilateral quality.
Set the first time step. Default is 0.


| 0
|
|
The value must be one of the following: Area (28), Aspect Ratio (1), Condition (9), Distortion (15), Edge Ratio (0), Jacobian (25), Maximum Aspect Frobenius (5), Maximum Edge Ratio (16), Mean Aspect Frobenius (4), Minimum Angle (6), Oddy (23), Radius Ratio (2), Relative Size Squared (12), Scaled Jacobian (10), Shape (13), Shape and Size (14), Shear (11), Shear and Size (24), Skew (17), Stretch (20), Taper (18), Warpage (26).
0
 
|


|-
|-
| '''Tet Quality Measure'''<br>''(TetQualityMeasure)''
|'''RestartedSimulation''' (RestartedSimulation)
|
Specify whether or not this is a restarted simulation.
|
|
This property indicates which quality measure will be used to evaluate tetrahedral quality. The radius ratio is the size of a sphere circumscribed by a tetrahedron's 4 vertices divided by the size of a sphere tangent to a tetrahedron's 4 faces. The edge ratio is the ratio of the longest edge length to the shortest edge length. The collapse ratio is the minimum ratio of the height of a vertex above the triangle opposite it divided by the longest edge of the opposing triangle across all vertex/triangle pairs.
0
 
| 2
|
|
The value must be one of the following: Edge Ratio (0), Aspect Beta (29), Aspect Gamma (27), Aspect Frobenius (3), Aspect Ratio (1), Collapse Ratio (7), Condition (9), Distortion (15), Jacobian (25), Minimum Dihedral Angle (6), Radius Ratio (2), Relative Size Squared (12), Scaled Jacobian (10), Shape (13), Shape and Size (14), Volume (19).
Accepts boolean values (0 or 1).
 
 
|-
|-
| '''Triangle Quality Measure'''<br>''(TriangleQualityMeasure)''
|'''DisableResetCache''' (DisableResetCache)
|
|
This property indicates which quality measure will be used to evaluate triangle quality. The radius ratio is the size of a circle circumscribed by a triangle's 3 vertices divided by the size of a circle tangent to a triangle's 3 edges. The edge ratio is the ratio of the longest edge length to the shortest edge length.
Prevents cache from getting reset so that new computation
 
always start from previous results.
| 2
|
0
|
|
The value must be one of the following: Area (28), Aspect Ratio (1), Aspect Frobenius (3), Condition (9), Distortion (15), Edge Ratio (0), Maximum Angle (8), Minimum Angle (6), Scaled Jacobian (10), Radius Ratio (2), Relative Size Squared (12), Shape (13), Shape and Size (14).
Accepts boolean values (0 or 1).
 


|}
|}
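The triangle measures defined above can be computed directly: for side lengths a, b, c, the circumradius is R = abc/(4A) and the inradius is r = A/s with s the semi-perimeter. The sketch below reports the radius ratio as R/(2r), scaled so an equilateral triangle scores 1; that normalization is an assumption about the convention used (Verdict, the library behind these measures, defines the exact scaling):

```python
import math

def triangle_quality(p0, p1, p2):
    """Edge ratio (longest/shortest edge) and radius ratio
    (circumradius over inradius, scaled so an equilateral triangle
    scores 1) for a triangle given as 2D points."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    a, b, c = dist(p1, p2), dist(p0, p2), dist(p0, p1)
    s = (a + b + c) / 2.0                               # semi-perimeter
    area = math.sqrt(s * (s - a) * (s - b) * (s - c))   # Heron's formula
    circum_r = a * b * c / (4.0 * area)
    in_r = area / s
    return max(a, b, c) / min(a, b, c), circum_r / (2.0 * in_r)

# An equilateral triangle scores 1 on both measures (up to rounding):
edge_ratio, radius_ratio = triangle_quality((0, 0), (1, 0), (0.5, math.sqrt(3) / 2))
```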


==Integrate Variables==


==Multicorrelative Statistics==
This filter integrates cell and point attributes.
The Integrate Attributes filter integrates point and cell
data over lines and surfaces. It also computes the length of
lines, the area of surfaces, or the volume.
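For a polyline, the integration reduces to summing each segment's length times the average of its endpoint values. The per-cell averaging rule below is an assumption for illustration (VTK uses cell-appropriate quadrature), not the filter's code:

```python
import math

def integrate_over_polyline(points, values):
    """Integrate a point-centered scalar over a polyline: each segment
    contributes its length times the mean of its endpoint values.
    Returns (total_length, integral)."""
    length = 0.0
    integral = 0.0
    for (p, q), (u, v) in zip(zip(points, points[1:]), zip(values, values[1:])):
        seg = math.dist(p, q)
        length += seg
        integral += seg * (u + v) / 2.0
    return length, integral

# Integral of f(x) = x over the unit segment, sampled at x = 0, 0.5, 1:
length, integral = integrate_over_polyline([(0, 0), (0.5, 0), (1, 0)], [0, 0.5, 1])
```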


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


Compute a statistical model of a dataset and/or assess the dataset with a statistical model.
|-
|'''Input''' (Input)
|
This property specifies the input to the Integrate
Attributes filter.
|


This filter either computes a statistical model of a dataset or takes such a model as its second input. Then, the model (however it is obtained) may optionally be used to assess the input dataset.
|
<br>
Accepts input of following types:
This filter computes the covariance matrix for all the arrays you select plus the mean of each array. The model is thus a multivariate Gaussian distribution with the mean vector and variances provided. Data is assessed using this model by computing the Mahalanobis distance for each input point. This distance will always be positive.
* vtkDataSet


<br>
|}
The learned model output format is rather dense and can be confusing, so it is discussed here. The first filter output is a multiblock dataset consisting of 2 tables:
<br>
#  Raw covariance data.<br>
#  Covariance matrix and its Cholesky decomposition.
<br>
The raw covariance table has 3 meaningful columns: 2 titled "Column1" and "Column2" whose entries generally refer to the N arrays you selected when preparing the filter and 1 column titled "Entries" that contains numeric values. The first row will always contain the number of observations in the statistical analysis. The next N rows contain the mean for each of the N arrays you selected. The remaining rows contain covariances of pairs of arrays.
<br>
The second table (covariance matrix and Cholesky decomposition) contains information derived from the raw covariance data of the first table. The first N rows of the first column contain the name of one array you selected for analysis. These rows are followed by a single entry labeled "Cholesky" for a total of N+1 rows. The second column, Mean contains the mean of each variable in the first N entries and the number of observations processed in the final (N+1) row.


<br>
==Interpolate to Quadrature Points==
The remaining columns (there are N, one for each array) contain 2 matrices in triangular format. The upper right triangle contains the covariance matrix (which is symmetric, so its lower triangle may be inferred). The lower left triangle contains the Cholesky decomposition of the covariance matrix (which is triangular, so its upper triangle is zero). Because the diagonal must be stored for both matrices, an additional row is required — hence the N+1 rows and the final entry of the column named "Column".<br>
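The assessment step described above reduces to a standard computation: invert the covariance matrix and evaluate the quadratic form. A minimal two-variable sketch in plain Python (an illustration, not the filter's code; it returns the squared distance, which is always non-negative):

```python
def mahalanobis_sq_2d(x, mean, cov):
    """Squared Mahalanobis distance for a two-variable Gaussian model:
    invert the 2x2 covariance matrix and evaluate
    d^2 = (x - m)^T C^-1 (x - m)."""
    dx = (x[0] - mean[0], x[1] - mean[1])
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = ((d / det, -b / det), (-c / det, a / det))
    return (dx[0] * (inv[0][0] * dx[0] + inv[0][1] * dx[1])
            + dx[1] * (inv[1][0] * dx[0] + inv[1][1] * dx[1]))

# With an identity covariance this is just squared Euclidean distance;
# a larger variance along x shrinks distances in that direction.
d_identity = mahalanobis_sq_2d((1, 1), (0, 0), ((1, 0), (0, 1)))
d_stretched = mahalanobis_sq_2d((2, 0), (0, 0), ((4, 0), (0, 1)))
```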


Create scalar/vector data arrays interpolated to quadrature points.
"Create scalar/vector data arrays interpolated to quadrature
points."


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 4,672: Line 5,967:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Attribute Mode'''<br>''(AttributeMode)''
|'''Input''' (Input)
|
This property specifies the input of the filter.
|
|
Specify which type of field data the arrays will be drawn from.


| 0
|
|
Valid array names will be chosen from point and cell data.
Accepts input of following types:
 
* vtkUnstructuredGrid
The dataset must contain a field array (cell)


|-
|-
| '''Input'''<br>''(Input)''
|'''Quadrature Scheme Def''' (QuadratureSchemeDefinition)
|
Specifies the offset array from which we interpolate
values to quadrature points.
|
|
The input to the filter. Arrays from this dataset will be used for computing statistics and/or assessed by a statistical model.


|
|
|
An array of scalars is required.
The selected object must be the result of the following: sources (includes readers), filters.


|}


The dataset must contain a point or cell array.
==Intersect Fragments==


The Intersect Fragments filter performs geometric intersections on sets of fragments.
The Intersect Fragments filter performs geometric intersections on sets of
fragments. The filter takes two inputs, the first containing fragment
geometry and the second containing fragment centers. The filter has two
outputs. The first is geometry that results from the intersection. The
second is a set of points that is an approximation of the center of where
each fragment has been intersected.


The selected dataset must be one of the following types (or a subclass of one of them): vtkImageData, vtkStructuredGrid, vtkPolyData, vtkUnstructuredGrid, vtkTable, vtkGraph.
{| class="PropertiesTable" border="1" cellpadding="5"
 
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|-
| '''Model Input'''<br>''(ModelInput)''
|'''Input''' (Input)
|
This input must contain fragment
geometry.
|
|
A previously-calculated model with which to assess a separate dataset. This input is optional.


|
|
Accepts input of following types:
* vtkMultiBlockDataSet
|-
|'''Source''' (Source)
|
|
The selected object must be the result of the following: sources (includes readers), filters.
This input must contain fragment
 
centers.
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkTable, vtkMultiBlockDataSet.
 
 
|-
| '''Variables of Interest'''<br>''(SelectArrays)''
|
|
Choose arrays whose entries will be used to form observations for statistical analysis.


|
|
Accepts input of following types:
* vtkMultiBlockDataSet
|-
|'''Slice Type''' (CutFunction)
|
|
An array of scalars is required.
This property sets the type of intersecting geometry,
 
and associated parameters.
 
|-
| '''Task'''<br>''(Task)''
|
|
Specify the task to be performed: modeling and/or assessment.
#  "Detailed model of input data," creates a set of output tables containing a calculated statistical model of the '''entire''' input dataset;
#  "Model a subset of the data," creates an output table (or tables) summarizing a '''randomly-chosen subset''' of the input dataset;
#  "Assess the data with a model," adds attributes to the first input dataset using a model provided on the second input port; and
#  "Model and assess the same data," is really just operations 2 and 3 above applied to the same input dataset.  The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.


When the task includes creating a model (i.e., tasks 2, and 4), you may adjust the fraction of the input dataset used for training.  You should avoid using a large fraction of the input data for training as you will then not be able to detect overfitting.  The ''Training fraction'' setting will be ignored for tasks 1 and 3.
| 3
|
|
The value must be one of the following: Detailed model of input data (0), Model a subset of the data (1), Assess the data with a model (2), Model and assess the same data (3).
The value can be one of the following:
 
* Plane (implicit_functions)


|-
* Box (implicit_functions)
| '''Training Fraction'''<br>''(TrainingFraction)''
|
Specify the fraction of values from the input dataset to be used for model fitting. The exact set of values is chosen at random from the dataset.


| 0.1
* Sphere (implicit_functions)
|
The value must be greater than or equal to 0 and less than or equal to 1.




|}
|}


==Iso Volume==


==Normal Glyphs==
This filter extracts cells by clipping cells that have point scalars not in the specified range.
 
This filter clips away the cells using lower and upper
 
thresholds.
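The cell-selection rule can be sketched in plain Python. The all-points-inside flavor shown here is an assumption for illustration; Iso Volume additionally clips partially-inside cells rather than discarding them:

```python
def threshold_cells(cells, point_scalars, lower, upper):
    """Keep the cells (tuples of point ids) all of whose point scalars
    fall inside [lower, upper]; cells with any out-of-range point
    scalar are discarded."""
    return [cell for cell in cells
            if all(lower <= point_scalars[pid] <= upper for pid in cell)]

cells = [(0, 1, 2), (1, 2, 3)]
scalars = [0.2, 0.5, 0.9, 1.4]
kept = threshold_cells(cells, scalars, 0.0, 1.0)
# Only (0, 1, 2) survives; (1, 2, 3) touches the out-of-range point 3.
```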
Filter computing surface normals.
 
Filter computing surface normals.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 4,763: Line 6,058:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Consistency'''<br>''(Consistency)''
|'''Input''' (Input)
|
|
The value of this property controls whether consistent polygon ordering is enforced. Generally the normals for a data set should either all point inward or all point outward. If the value of this property is 1, then this filter will reorder the points of cells whose normal vectors are oriented in the opposite direction from the rest of those in the data set.
This property specifies the input to the Threshold
 
filter.
| 1
|
|
Only the values 0 and 1 are accepted.


|-
| '''Maximum Number of Points'''<br>''(Glyph Max. Points)''
|
The value of this property specifies the maximum number of glyphs that should appear in the output dataset if the value of the UseMaskPoints property is 1. (See the UseMaskPoints property.)
| 5000
|
|
The value must be greater than or equal to 0.
Accepts input of following types:
* vtkDataSet
The dataset must contain a field array ()


with 1 component(s).


|-
|-
| '''Random Mode'''<br>''(Glyph Random Mode)''
|'''Input Scalars''' (SelectInputScalars)
|
If the value of this property is 1, then the points to glyph are chosen randomly. Otherwise the point ids chosen are evenly spaced.
 
| 1
|
|
Only the values 0 and 1 are accepted.
The value of this property contains the name of the
 
scalar array from which to perform thresholding.
 
|-
| '''Set Scale Factor'''<br>''(Glyph Scale Factor)''
|
|
The value of this property will be used as a multiplier for scaling the glyphs before adding them to the output.


| 1
|
|
The value must be less than the largest dimension of the dataset multiplied by a scale factor of 0.1.
An array of scalars is required. The value must be a field array name.
 
 
The value must lie within the range of the selected data array.
 
 
The value must lie within the range of the selected data array.
 
 
|-
|-
| '''Input'''<br>''(Input)''
|'''Threshold Range''' (ThresholdBetween)
|
|
This property specifies the input to the Extract Surface filter.
The values of this property specify the upper and lower
 
bounds of the thresholding operation.
|
|
0 0
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The value must lie within the range of the selected data array.
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
 
 
|-
| '''Invert'''<br>''(InvertArrow)''
|
Inverts the arrow direction.
 
| 0
|
Only the values 0 and 1 are accepted.
 


|}
|}


==K Means==


==Octree Depth Limit==
Compute a statistical model of a dataset and/or assess the dataset with a statistical model.
 
This filter either computes a statistical model of a dataset or takes
 
such a model as its second input. Then, the model (however it is
This filter takes in an octree and produces a new octree which is no deeper than the maximum specified depth level.
obtained) may optionally be used to assess the input dataset.
 
This filter iteratively computes the center of k clusters in a space
The Octree Depth Limit filter takes in an octree and produces a new octree that is nowhere deeper than the maximum specified depth level. The attribute data of pruned leaf cells are integrated into their ancestors at the cut level.<br>
whose coordinates are specified by the arrays you select. The clusters
are chosen as local minima of the sum of square Euclidean distances from
each point to its nearest cluster center. The model is then a set of
cluster centers. Data is assessed by assigning a cluster center and
distance to the cluster to each point in the input data
set.


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 4,848: Line 6,114:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
The input to the filter. Arrays from this dataset will
be used for computing statistics and/or assessed by a statistical
model.
|
|
This property specifies the input to the Octree Depth Limit filter.


|
|
|
Accepts input of following types:
The selected object must be the result of the following: sources (includes readers), filters.
* vtkImageData
 
* vtkStructuredGrid
 
* vtkPolyData
The selected dataset must be one of the following types (or a subclass of one of them): vtkHyperOctree.
* vtkUnstructuredGrid
 
* vtkTable
* vtkGraph
The dataset must contain a field array ()


|-
|-
| '''Maximum Level'''<br>''(MaximumLevel)''
|'''ModelInput''' (ModelInput)
|
A previously-calculated model with which to assess a
separate dataset. This input is optional.
|
|
The value of this property specifies the maximum depth of the output octree.


| 4
|
|
The value must be greater than or equal to 3 and less than or equal to 255.
Accepts input of following types:
 
* vtkTable
 
* vtkMultiBlockDataSet
|}
|-
|'''AttributeMode''' (AttributeMode)
|
Specify which type of field data the arrays will be
drawn from.
|
0
|
The value must be field array name.
|-
|'''Variables of Interest''' (SelectArrays)
|
Choose arrays whose entries will be used to form
observations for statistical analysis.
|


|


==Octree Depth Scalars==
This filter adds a scalar to each leaf of the octree that represents the leaf's depth within the tree.
The vtkHyperOctreeDepth filter adds a scalar to each leaf of the octree that represents the leaf's depth within the tree.<br>
{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Property'''
|'''Task''' (Task)
| '''Description'''
|
| '''Default Value(s)'''
Specify the task to be performed: modeling and/or
| '''Restrictions'''
assessment.
#  "Detailed model of input data," creates a set of output tables containing a calculated statistical model of the '''entire''' input dataset;
#  "Model a subset of the data," creates an output table (or tables) summarizing a '''randomly-chosen subset''' of the input dataset;
#  "Assess the data with a model," adds attributes to the first input dataset using a model provided on the second input port; and
#  "Model and assess the same data," is really just operations 2 and 3 above applied to the same input dataset. The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.
When the task includes creating a model (i.e., tasks 2 and 4), you may adjust the fraction of the input dataset used for training. You should avoid using a large fraction of the input data for training as you will then not be able to detect overfitting. The ''Training fraction'' setting will be ignored for tasks 1 and 3.
|
3
|
The value(s) is an enumeration of the following:
* Detailed model of input data (0)
* Model a subset of the data (1)
* Assess the data with a model (2)
* Model and assess the same data (3)
|-
|-
| '''Input'''<br>''(Input)''
|'''TrainingFraction''' (TrainingFraction)
|
Specify the fraction of values from the input dataset to
be used for model fitting. The exact set of values is chosen at random
from the dataset.
|
0.1
|
|
This property specifies the input to the Octree Depth Scalars filter.


|-
|'''k''' (K)
|
|
Specify the number of clusters.
|
5
|
|
The selected object must be the result of the following: sources (includes readers), filters.


|-
|'''Max Iterations''' (MaxNumIterations)
|
Specify the maximum number of iterations in which
cluster centers are moved before the algorithm
terminates.
|
50
|


The selected dataset must be one of the following types (or a subclass of one of them): vtkHyperOctree.
|-
|'''Tolerance''' (Tolerance)
|
Specify the relative tolerance that will cause early
termination.
|
0.01
|




|}
|}
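The iteration described above (assign points to the nearest center, then move each center to the mean of its assigned points) can be sketched in plain Python. This is a single Lloyd step for illustration, not the filter's implementation; repeating it until the centers stop moving, within the Tolerance and Max Iterations limits, yields the cluster-center model:

```python
def kmeans_step(points, centers):
    """One k-means (Lloyd) iteration: assign each point to its nearest
    center by squared Euclidean distance, then move each center to the
    mean of its assigned points (empty clusters keep their center)."""
    def sq_dist(p, c):
        return sum((pi - ci) ** 2 for pi, ci in zip(p, c))
    clusters = [[] for _ in centers]
    for p in points:
        best = min(range(len(centers)), key=lambda i: sq_dist(p, centers[i]))
        clusters[best].append(p)
    return [tuple(sum(vals) / len(vals) for vals in zip(*cl)) if cl else c
            for cl, c in zip(clusters, centers)]

pts = [(0.0, 0.0), (0.0, 1.0), (10.0, 10.0), (10.0, 11.0)]
centers = kmeans_step(pts, [(0.0, 0.0), (9.0, 9.0)])
# The centers move to the means of the two obvious clusters.
```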


==Legacy Glyph==


==Outline==
This filter generates an arrow, cone, cube, cylinder, line, sphere, or 2D glyph at each point of the input data set. The glyphs can be oriented and scaled by point attributes of the input dataset.
 
This is the implementation of the Glyph filter available in ParaView version 4.1 and earlier.
 
This filter generates a bounding box representation of the input.


The Outline filter generates an axis-aligned bounding box for the input dataset. This filter operates on any type of dataset and produces polygonal output.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 4,916: Line 6,243:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
This property specifies the input to the Glyph filter.
This is the dataset to which the glyphs will be
applied.
|
|
This property specifies the input to the Outline filter.


|
|
|
Accepts input of following types:
The selected object must be the result of the following: sources (includes readers), filters.
* vtkDataSet
The dataset must contain a field array (point)


with 1 component(s).


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
The dataset must contain a field array (point)


with 3 component(s).


|}
|-
|'''Scalars''' (SelectInputScalars)
|
This property indicates the name of the scalar array on
which to operate. The indicated array may be used for scaling the
glyphs. (See the SetScaleMode property.)
|


 
|
==Outline Corners==
An array of scalars is required.
 
 
This filter generates a bounding box representation of the input. It only displays the corners of the bounding box.
 
The Outline Corners filter generates the corners of a bounding box for the input dataset. This filter operates on any type of dataset and produces polygonal output.<br>
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Property'''
|'''Vectors''' (SelectInputVectors)
| '''Description'''
|
| '''Default Value(s)'''
This property indicates the name of the vector array on
| '''Restrictions'''
which to operate. The indicated array may be used for scaling and/or
|-
orienting the glyphs. (See the SetScaleMode and SetOrient
| '''Corner Factor'''<br>''(CornerFactor)''
properties.)
|
|
The value of this property sets the size of the corners as a percentage of the length of the corresponding bounding box edge.
1
 
| 0.2
|
|
The value must be greater than or equal to 0.001 and less than or equal to 0.5.
An array of vectors is required.
 
 
|-
|-
| '''Input'''<br>''(Input)''
|'''Glyph Type''' (Source)
|
This property determines which type of glyph will be
placed at the points in the input dataset.
|
|
This property specifies the input to the Outline Corners filter.


|
|
|
Accepts input of following types:
The selected object must be the result of the following: sources (includes readers), filters.
* vtkPolyData
The value can be one of the following:
* ArrowSource (sources)


* ConeSource (sources)


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
* CubeSource (sources)


* CylinderSource (sources)


|}
* LineSource (sources)


* SphereSource (sources)


==Outline Curvilinear DataSet==
* GlyphSource2D (sources)


|-
|'''GlyphTransform''' (GlyphTransform)
|
The values in this property allow you to specify the
transform (translation, rotation, and scaling) to apply to the glyph
source.
|


This filter generates an outline representation of the input.
|
The value can be one of the following:
* Transform2 (extended_sources)


The Outline filter generates an outline of the outside edges of the input dataset, rather than the dataset's bounding box. This filter operates on structured grid datasets and produces polygonal output.<br>
{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Property'''
|'''Orient''' (SetOrient)
| '''Description'''
|
| '''Default Value(s)'''
If this property is set to 1, the glyphs will be
| '''Restrictions'''
oriented based on the selected vector array.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''Scale Mode''' (SetScaleMode)
|
The value of this property specifies how/if the glyphs
should be scaled based on the point-centered scalars/vectors in the
input dataset.
|
1
|
The value(s) is an enumeration of the following:
* scalar (0)
* vector (1)
* vector_components (2)
* off (3)
|-
|-
| '''Input'''<br>''(Input)''
|'''SetScaleFactor''' (SetScaleFactor)
|
|
This property specifies the input to the outline (curvilinear) filter.
The value of this property will be used as a multiplier
for scaling the glyphs before adding them to the
output.
|
1.0
|
The value must lie within the range of the selected data array.
The value must be less than the largest dimension of the
dataset multiplied by a scale factor of
0.1.


|-
|'''Maximum Number of Points''' (MaximumNumberOfPoints)
|
|
The value of this property specifies the maximum number
of glyphs that should appear in the output dataset if the value of the
UseMaskPoints property is 1. (See the UseMaskPoints
property.)
|
5000
|
|
The selected object must be the result of the following: sources (includes readers), filters.


|-
|'''Mask Points''' (UseMaskPoints)
|
If the value of this property is set to 1, limit the
maximum number of glyphs to the value indicated by
MaximumNumberOfPoints. (See the MaximumNumberOfPoints
property.)
|
1
|
Accepts boolean values (0 or 1).
|-
|'''RandomMode''' (RandomMode)
|
If the value of this property is 1, then the points to
glyph are chosen randomly. Otherwise the point ids chosen are evenly
spaced.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''KeepRandomPoints''' (KeepRandomPoints)
|
If the value of this property is 1 and RandomMode is
1, then the randomly chosen points to glyph are saved and reused for
other timesteps. This is only useful if the coordinates are the same
and in the same order between timesteps.


The selected dataset must be one of the following types (or a subclass of one of them): vtkStructuredGrid.
|
 
0
|
Accepts boolean values (0 or 1).


|}
|}
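The Scale Mode, SetScaleFactor, and array choices above combine into a per-glyph scale. This plain-Python sketch (a hypothetical helper for illustration, not the VTK implementation) shows the arithmetic for each enumeration value:

```python
import math

def glyph_scale(mode, scale_factor, scalar=None, vector=None):
    """Return the (sx, sy, sz) scale applied to one glyph.

    mode follows the Scale Mode enumeration in the table above:
    0 = scalar, 1 = vector (magnitude), 2 = vector_components, 3 = off.
    """
    if mode == 0:                       # scale uniformly by the scalar value
        s = scale_factor * scalar
        return (s, s, s)
    if mode == 1:                       # scale uniformly by the vector magnitude
        s = scale_factor * math.sqrt(sum(c * c for c in vector))
        return (s, s, s)
    if mode == 2:                       # scale each axis by one vector component
        return tuple(scale_factor * c for c in vector)
    return (scale_factor,) * 3          # off: only the constant SetScaleFactor

# Example: a (3, 4, 0) vector has magnitude 5, so with SetScaleFactor 0.5
# the glyph is scaled uniformly by 2.5.
print(glyph_scale(1, 0.5, vector=(3.0, 4.0, 0.0)))  # (2.5, 2.5, 2.5)
```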


==LegacyArbitrarySourceGlyph==

This filter generates a glyph at each point of the input data set. The glyphs can be oriented and scaled by point attributes of the input dataset.

The Glyph filter generates a glyph at each point in the input dataset.
The glyphs can be oriented and scaled by the input point-centered scalars
and vectors. The Glyph filter operates on any type of data set. Its
output is polygonal. This filter is available on the
Toolbar.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the input to the Glyph filter.
This is the dataset to which the glyphs will be
applied.
|

|
Accepts input of following types:
* vtkDataSet
The dataset must contain a field array (point)

with 1 component(s).

The dataset must contain a field array (point)

with 3 component(s).

|-
|'''Glyph Type''' (Source)
|
This property determines which type of glyph will be
placed at the points in the input dataset.
|

|
Accepts input of following types:
* vtkPolyData
|-
|'''Scalars''' (SelectInputScalars)
|
This property indicates the name of the scalar array on
which to operate. The indicated array may be used for scaling the
glyphs. (See the SetScaleMode property.)
|

|
An array of scalars is required.
|-
|'''Vectors''' (SelectInputVectors)
|
This property indicates the name of the vector array on
which to operate. The indicated array may be used for scaling and/or
orienting the glyphs. (See the SetScaleMode and SetOrient
properties.)
|

|
An array of vectors is required.
|-
|'''Orient''' (SetOrient)
|
If this property is set to 1, the glyphs will be
oriented based on the selected vector array.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''Scale Mode''' (SetScaleMode)
|
The value of this property specifies how/if the glyphs
should be scaled based on the point-centered scalars/vectors in the
input dataset.
|
1
|
The value(s) is an enumeration of the following:
* scalar (0)
* vector (1)
* vector_components (2)
* off (3)
|-
|'''SetScaleFactor''' (SetScaleFactor)
|
The value of this property will be used as a multiplier
for scaling the glyphs before adding them to the
output.
|
1.0
|
The value must lie within the range of the selected data array.
The value must be less than the largest dimension of the
dataset multiplied by a scale factor of
0.1.

|-
|'''Maximum Number of Points''' (MaximumNumberOfPoints)
|
The value of this property specifies the maximum number
of glyphs that should appear in the output dataset if the value of the
UseMaskPoints property is 1. (See the UseMaskPoints
property.)
|
5000
|

|-
|'''Mask Points''' (UseMaskPoints)
|
If the value of this property is set to 1, limit the
maximum number of glyphs to the value indicated by
MaximumNumberOfPoints. (See the MaximumNumberOfPoints
property.)
|
1
|
Accepts boolean values (0 or 1).
|-
|'''RandomMode''' (RandomMode)
|
If the value of this property is 1, then the points to
glyph are chosen randomly. Otherwise the point ids chosen are evenly
spaced.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''KeepRandomPoints''' (KeepRandomPoints)
|
If the value of this property is 1 and RandomMode is
1, then the randomly chosen points to glyph are saved and reused for
other timesteps. This is only useful if the coordinates are the same
and in the same order between timesteps.
|
0
|
Accepts boolean values (0 or 1).

|}
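The RandomMode and KeepRandomPoints interaction can be sketched in plain Python (an illustration of the behavior described above, not ParaView code; the class and its names are hypothetical): with KeepRandomPoints on, the ids chosen for the first timestep are cached and reused, which is only meaningful when the point order is stable across timesteps.

```python
import random

class GlyphPointSelector:
    def __init__(self, max_points, keep_random_points):
        self.max_points = max_points
        self.keep = keep_random_points
        self._cached_ids = None

    def select(self, num_points):
        # Reuse the previous timestep's selection when KeepRandomPoints is on.
        if self.keep and self._cached_ids is not None:
            return self._cached_ids
        # RandomMode: draw at most max_points distinct point ids at random.
        ids = sorted(random.sample(range(num_points),
                                   min(self.max_points, num_points)))
        if self.keep:
            self._cached_ids = ids
        return ids

sel = GlyphPointSelector(max_points=5, keep_random_points=True)
first = sel.select(1000)    # random subset chosen for timestep 0
second = sel.select(1000)   # same subset reused for timestep 1
print(first == second)      # True
```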


==Level Scalars(Non-Overlapping AMR)==

The Level Scalars filter uses colors to show levels of a
hierarchical dataset.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the input to the Level Scalars
filter.
|

|
Accepts input of following types:
* vtkNonOverlappingAMR

|}

==Level Scalars(Overlapping AMR)==

The Level Scalars filter uses colors to show levels of a
hierarchical dataset.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the input to the Level Scalars
filter.
|

|
Accepts input of following types:
* vtkOverlappingAMR

|}

==Linear Extrusion==

This filter creates a swept surface defined by translating the input along a vector.

The Linear Extrusion filter creates a swept surface by translating
the input dataset along a specified vector. This filter is
intended to operate on 2D polygonal data. It
operates on polygonal data and produces polygonal data
output.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the input to the Linear
Extrusion filter.
|

|
Accepts input of following types:
* vtkPolyData
|-
|'''ScaleFactor''' (ScaleFactor)
|
The value of this property determines the distance along
the vector the dataset will be translated. (A scale factor of 0.5 will
move the dataset half the length of the vector, and a scale factor of 2
will move it twice the vector's length.)
|
1.0
|
|-
|'''Vector''' (Vector)
|
The value of this property indicates the X, Y, and Z
components of the vector along which to sweep the input
dataset.
|
0 0 1
|
|-
|'''Capping''' (Capping)
|
The value of this property indicates whether to cap the
ends of the swept surface. Capping works by placing a copy of the input
dataset on either end of the swept surface, so it behaves properly if
the input is a 2D surface composed of filled polygons. If the input
dataset is a closed solid (e.g., a sphere), then if capping is on
(i.e., this property is set to 1), two copies of the data set will be
displayed on output (the second translated from the first one along the
specified vector). If instead capping is off (i.e., this property is
set to 0), then an input closed solid will produce no
output.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''PieceInvariant''' (PieceInvariant)
|
The value of this property determines whether the output
will be the same regardless of the number of processors used to compute
the result. The difference is whether there are internal polygonal
faces on the processor boundaries. A value of 1 will keep the results
the same; a value of 0 will allow internal faces on processor
boundaries.
|
0
|
Accepts boolean values (0 or 1).

|}

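The sweep itself is simple arithmetic: each input point p is translated to p + ScaleFactor * Vector, so the extruded copy lands the scaled vector's length away. A minimal plain-Python sketch (the helper name is hypothetical, not a VTK call):

```python
def extrude_point(p, vector, scale_factor):
    """Translate point p along `vector`, scaled by `scale_factor`."""
    return tuple(pi + scale_factor * vi for pi, vi in zip(p, vector))

# Default Vector (0, 0, 1) with ScaleFactor 1.0 lifts a point one unit in z.
print(extrude_point((2.0, 3.0, 0.0), (0.0, 0.0, 1.0), 1.0))  # (2.0, 3.0, 1.0)
# ScaleFactor 2 moves it twice the vector's length, as described above.
print(extrude_point((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), 2.0))  # (0.0, 0.0, 2.0)
```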
==Loop Subdivision==

This filter iteratively divides each triangle into four triangles. New points are placed so the output surface is smooth.

The Loop Subdivision filter increases the granularity of a
polygonal mesh. It works by dividing each triangle in the
input into four new triangles. It is named for Charles
Loop, the person who devised this subdivision scheme. This
filter only operates on triangles, so a data set that
contains other types of polygons should be passed through
the Triangulate filter before applying this filter to it.
This filter only operates on polygonal data (specifically
triangle meshes), and it produces polygonal
output.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the input to the Loop
Subdivision filter.
|

|
Accepts input of following types:
* vtkPolyData
|-
|'''Number of Subdivisions''' (NumberOfSubdivisions)
|
Set the number of subdivision iterations to perform.
Each subdivision divides single triangles into four new
triangles.
|
1
|

|}
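Since each pass replaces every triangle with four, n passes turn t triangles into t * 4^n. A quick sketch (plain Python, for estimating output size before raising Number of Subdivisions):

```python
def subdivided_triangle_count(triangles, num_subdivisions):
    """Triangles produced by `num_subdivisions` Loop subdivision passes."""
    return triangles * 4 ** num_subdivisions

print(subdivided_triangle_count(100, 1))  # 400
print(subdivided_triangle_count(100, 3))  # 6400
```

The 4x-per-pass growth is why even modest subdivision counts can produce very large meshes.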


 
==MPIMoveData==

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
Set the input to the MPI Move Data
filter.
|

|
|-
|'''MoveMode''' (MoveMode)
|
Specify how the data is to be
redistributed.
|
0
|
The value(s) is an enumeration of the following:
* PassThrough (0)
* Collect (1)
* Clone (2)
|-
|'''OutputDataType''' (OutputDataType)
|
Specify the type of the dataset.
|
none
|
The value(s) is an enumeration of the following:
* PolyData (0)
* Unstructured Grid (4)
* ImageData (6)

|}


==Mask Points==

Reduce the number of points. This filter is often used before glyphing. Generating vertices is an option.

The Mask Points
filter reduces the number of points in the dataset. It
operates on any type of dataset, but produces only points
/ vertices as output.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the input to the Mask Points
filter.
|

|
Accepts input of following types:
* vtkDataSet
|-
|'''OnRatio''' (OnRatio)
|
The value of this property specifies that every
OnRatio-th point will be retained in the output when not using Random
(the skip or stride size for point ids). (For example, if the on ratio
is 3, then the output will contain every 3rd point, up to the
maximum number of points.)
|
2
|
|-
|'''Maximum Number of Points''' (MaximumNumberOfPoints)
|
The value of this property indicates the maximum number
of points in the output dataset.
|
5000
|
|-
|'''Proportionally Distribute Maximum Number Of Points''' (ProportionalMaximumNumberOfPoints)
|
When this is off, the maximum number of points is taken
per processor when running in parallel (total number of points = number
of processors * maximum number of points). When this is on, the maximum
number of points is proportionally distributed across processors
depending on the number of points per processor
("total number of points" is the same as "maximum number of points";
maximum number of points per processor = number of points on a processor
* maximum number of points / total number of points across all processors).
|
0
|
Accepts boolean values (0 or 1).
|-
|'''Offset''' (Offset)
|
The value of this property indicates the starting point
id in the ordered list of input points from which to start
masking.
|
0
|
|-
|'''Random Sampling''' (RandomMode)
|
If the value of this property is set to true, then the
points in the output will be randomly selected from the input in
various ways set by Random Mode; otherwise this filter will subsample
point ids regularly.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''Random Sampling Mode''' (RandomModeType)
|
Randomized Id Strides picks points with random id
increments starting at Offset (the output probably isn't a
statistically random sample). Random Sampling generates a statistically
random sample of the input, ignoring Offset (fast - O(sample size)).
Spatially Stratified Random Sampling is a variant of random sampling
that splits the points into equal sized spatial strata before randomly
sampling (slow - O(N log N)).
|
0
|
The value(s) is an enumeration of the following:
* Randomized Id Strides (0)
* Random Sampling (1)
* Spatially Stratified Random Sampling (2)
|-
|'''GenerateVertices''' (GenerateVertices)
|
This property specifies whether to generate vertex cells
as the topology of the output. If set to 1, the geometry (vertices)
will be displayed in the rendering window; otherwise no geometry will
be displayed.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''SingleVertexPerCell''' (SingleVertexPerCell)
|
Tell filter to only generate one vertex per cell instead
of multiple vertices in one cell.
|
0
|
Accepts boolean values (0 or 1).

|}
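The non-random masking path described above amounts to a strided slice of the point ids: starting at Offset, keep every OnRatio-th id, stopping once Maximum Number of Points is reached. A plain-Python illustration (a hypothetical helper, not the VTK implementation):

```python
def mask_point_ids(num_points, on_ratio=2, offset=0, max_points=5000):
    """Point ids kept by regular (non-random) masking."""
    ids = range(offset, num_points, on_ratio)   # every on_ratio-th id
    return list(ids)[:max_points]               # capped at max_points

print(mask_point_ids(10, on_ratio=3))            # [0, 3, 6, 9]
print(mask_point_ids(10, on_ratio=3, offset=1))  # [1, 4, 7]
print(len(mask_point_ids(100000, on_ratio=2)))   # 5000 (capped)
```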


==Material Interface Filter==

The Material Interface filter finds volumes in the input data containing material above a certain material fraction.

The Material Interface filter finds voxels inside of which a material
fraction (or normalized amount of material) is higher than a given
threshold. As these voxels are identified, surfaces enclosing adjacent
voxels above the threshold are generated. The resulting volume and its
surface are what we call a fragment. The filter has the ability to
compute various volumetric attributes such as fragment volume, mass,
center of mass as well as volume and mass weighted averages for any of
the fields present. Any field selected for such computation will also
be copied into the fragment surface's point data for visualization. The
filter also has the ability to generate Oriented Bounding Boxes (OBB) for
each fragment. The data generated by the filter is organized in three
outputs. The "geometry" output, containing the fragment surfaces. The
"statistics" output, containing a point set of the centers of mass. The
"obb representation" output, containing OBB representations (poly data).
All computed attributes are copied into the statistics and geometry
output. The obb representation output is used for validation and
debugging purposes and is turned off by default. To measure the size of
craters, the filter can invert a volume fraction and clip the volume
fraction with a sphere and/or a plane.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
Input to the filter can be a hierarchical box data set
containing image data or a multi-block of rectilinear
grids.
|

|
Accepts input of following types:
* vtkNonOverlappingAMR
The dataset must contain a field array (cell)
|-
|'''Select Material Fraction Arrays''' (SelectMaterialArray)
|
Material fraction is defined as normalized amount of
material per voxel. It is expected that arrays containing material
fraction data have been down converted to an unsigned
char.
|

|
An array of scalars is required.
|-
|'''Material Fraction Threshold''' (MaterialFractionThreshold)
|
Material fraction is defined as normalized amount of
material per voxel. Any voxel in the input data set with a material
fraction greater than this value is included in the output data
set.
|
0.5
|
|-
|'''InvertVolumeFraction''' (InvertVolumeFraction)
|
Inverting the volume fraction generates the negative of
the material. It is useful for analyzing craters.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''Clip Type''' (ClipFunction)
|
This property sets the type of clip geometry, and
associated parameters.
|

|
The value can be one of the following:
* None (implicit_functions)
* Plane (implicit_functions)
* Sphere (implicit_functions)
|-
|'''Select Mass Arrays''' (SelectMassArray)
|
Mass arrays are paired with material fraction arrays.
This means that the first selected material fraction array is paired
with the first selected mass array, and so on sequentially. As the
filter identifies voxels meeting the minimum material fraction
threshold, these voxels' mass will be used in fragment center of mass
and mass calculation. A warning is generated if no mass array is
selected for an individual material fraction array. However, in that
case the filter will run without issue because the statistics output
can be generated using fragments' centers computed from axis aligned
bounding boxes.
|

|
An array of scalars is required.
|-
|'''Compute volume weighted average over:''' (SelectVolumeWtdAvgArray)
|
Specifies the arrays over which to compute a volume weighted
average. For the arrays selected, a volume weighted average is
computed. The values of these arrays are also copied into fragment
geometry cell data as the fragment surfaces are
generated.
|

|
|-
|'''Compute mass weighted average over:''' (SelectMassWtdAvgArray)
|
For the arrays selected, a mass weighted average is computed.
These arrays are also copied into fragment geometry cell data as the
fragment surfaces are generated.
|

|
|-
|'''ComputeOBB''' (ComputeOBB)
|
Compute Object Oriented Bounding boxes (OBB). When
active the result of this computation is copied into the statistics
output. In the case that the filter is built in its validation mode,
the OBB's are rendered.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''WriteGeometryOutput''' (WriteGeometryOutput)
|
If this property is set, then the geometry output is
written to a text file. The file name will be constructed using the
path in the "Output Base Name" widget.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''WriteStatisticsOutput''' (WriteStatisticsOutput)
|
If this property is set, then the statistics output is
written to a text file. The file name will be constructed using the
path in the "Output Base Name" widget.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''OutputBaseName''' (OutputBaseName)
|
This property specifies the base including path of where
to write the statistics and geometry output text files. It follows the
pattern "/path/to/folder/and/file" where file has no extension, as the
filter will generate a unique extension.
|

|

|}
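The thresholding step can be sketched as follows, assuming (as the table above states) that material fraction arrays arrive as unsigned char, so a voxel value v encodes the fraction v / 255; the helper and the exact comparison are illustrative assumptions, not the filter's source:

```python
def voxel_in_fragment(value_uchar, threshold=0.5, invert=False):
    """Keep a voxel if its (possibly inverted) material fraction exceeds
    Material Fraction Threshold.  value_uchar is assumed to encode the
    normalized fraction as an unsigned char (0..255)."""
    fraction = value_uchar / 255.0
    if invert:
        # InvertVolumeFraction tests the complement, the "negative" of
        # the material (useful for craters, as described above).
        fraction = 1.0 - fraction
    return fraction > threshold

print(voxel_in_fragment(200))               # True  (200/255 is about 0.78)
print(voxel_in_fragment(100))               # False (100/255 is about 0.39)
print(voxel_in_fragment(100, invert=True))  # True
```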


==Median==


==Principal Component Analysis==
Compute the median scalar values in a specified neighborhood for image/volume datasets.
 
The Median filter operates on uniform rectilinear (image
 
or volume) data and produces uniform rectilinear output.
Compute a statistical model of a dataset and/or assess the dataset with a statistical model.
It replaces the scalar value at each pixel / voxel with
 
the median scalar value in the specified surrounding
This filter either computes a statistical model of a dataset or takes such a model as its second input. Then, the model (however it is obtained) may optionally be used to assess the input dataset.
neighborhood. Since the median operation removes outliers,
<br>
this filter is useful for removing high-intensity,
This filter performs additional analysis above and beyond the multicorrelative filter. It computes the eigenvalues and eigenvectors of the covariance matrix from the multicorrelative filter. Data is then assessed by projecting the original tuples into a possibly lower-dimensional space.
low-probability noise (shot noise).
 
<br>
Since the PCA filter uses the multicorrelative filter's analysis, it shares the same raw covariance table specified in the multicorrelative documentation. The second table in the multiblock dataset comprising the model output is an expanded version of the multicorrelative version.
 
<br>
As with the multicorrelative filter, the second model table contains the mean values, the upper-triangular portion of the symmetric covariance matrix, and the non-zero lower-triangular portion of the Cholesky decomposition of the covariance matrix. Below these entries are the eigenvalues of the covariance matrix (in the column labeled "Mean") and the eigenvectors (as row vectors) in an additional NxN matrix.<br>
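The "Fixed-energy basis" scheme described for this filter keeps the smallest number of eigenvector components whose eigenvalues account for more than the requested fraction of the total variance. A minimal sketch of that selection rule (a hypothetical helper, not ParaView's implementation) using the eigenvalues from the example in this section:

```python
# Hypothetical sketch of the "Fixed-energy basis" selection rule:
# keep the smallest N for which the first N eigenvalues account for
# more than the requested fraction ("energy") of the total variance.
def basis_size_for_energy(eigenvalues, energy):
    total = sum(eigenvalues)
    running = 0.0
    for n, value in enumerate(eigenvalues, start=1):
        running += value
        if running / total > energy:
            return n
    return len(eigenvalues)

# eigenvalues from the example in the text: (5+2+1.5)/10 = 0.85 > 0.8
size = basis_size_for_energy([5, 2, 1.5, 1, 0.5], 0.8)
```

With a "Basis Energy" of 0.8 this yields a 3-dimensional projection space, matching the worked example in the documentation.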
 


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 5,588: Line 7,089:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Attribute Mode'''<br>''(AttributeMode)''
|'''Input''' (Input)
|
This property specifies the input to the Median
filter.
|
|
Specify which type of field data the arrays will be drawn from.


| 0
|
|
Valid array names will be chosen from point and cell data.
Accepts input of following types:
* vtkImageData
The dataset must contain a field array (point)


with 1 component(s).


|-
|-
| '''Basis Energy'''<br>''(BasisEnergy)''
|'''SelectInputScalars''' (SelectInputScalars)
|
The value of this property lists the name of the scalar
array to use in computing the median.
|
|
The minimum energy to use when determining the dimensionality of the new space into which the assessment will project tuples.


| 0.1
|
|
The value must be greater than or equal to 0 and less than or equal to 1.
An array of scalars is required.
 
 
|-
|-
| '''Basis Scheme'''<br>''(BasisScheme)''
|'''KernelSize''' (KernelSize)
|
The value of this property specifies the number of
pixels/voxels in each dimension to use in computing the median to
assign to each pixel/voxel. If the kernel size in a particular
dimension is 1, then the median will not be computed in that
direction.
|
1 1 1
|
|
When reporting assessments, should the full eigenvector decomposition be used to project the original vector into the new space (Full basis), or should a fixed subset of the decomposition be used (Fixed-size basis), or should the projection be clipped to preserve at least some fixed "energy" (Fixed-energy basis)?




As an example, suppose the variables of interest were {A,B,C,D,E} and that the eigenvalues of the covariance matrix for these were {5,2,1.5,1,.5}. If the "Full basis" scheme is used, then all 5 components of the eigenvectors will be used to project each {A,B,C,D,E}-tuple in the original data into a new 5-components space.
|}
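The neighborhood-median operation the Median filter performs can be sketched in a few lines of plain Python. This is an illustrative 2D version with a hypothetical `median_filter` helper, not the filter's actual (VTK-based) implementation:

```python
import statistics

# Hedged sketch of the Median filter's core idea on a 2D image:
# replace each pixel with the median of the values inside a
# kernel-sized neighborhood (clamped at the image boundary).
def median_filter(image, kernel=(3, 3)):
    rows, cols = len(image), len(image[0])
    half_r, half_c = kernel[0] // 2, kernel[1] // 2
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            neighborhood = [image[i][j]
                            for i in range(max(0, r - half_r), min(rows, r + half_r + 1))
                            for j in range(max(0, c - half_c), min(cols, c + half_c + 1))]
            out[r][c] = statistics.median(neighborhood)
    return out

# a single shot-noise spike is removed by the median
noisy = [[0, 0, 0], [0, 255, 0], [0, 0, 0]]
smooth = median_filter(noisy)
```

Because the median discards outliers rather than averaging them in, the isolated 255 spike disappears entirely, which is why the filter is effective against shot noise.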


==Merge Blocks==


Appends vtkCompositeDataSet leaves into a single vtkUnstructuredGrid
vtkCompositeDataToUnstructuredGridFilter appends all vtkDataSet leaves of
the input composite dataset to a single unstructured grid. The subtree to
be combined can be chosen using the SubTreeCompositeIndex. If the
SubTreeCompositeIndex is a leaf node, then no appending is
required.


If the "Fixed-size" scheme is used and the "Basis Size" property is set to 4, then only the first 4 eigenvector components will be used to project each {A,B,C,D,E}-tuple into the new space and that space will be of dimension 4, not 5.
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
Set the input composite dataset.
|


If the "Fixed-energy basis" scheme is used and the "Basis Energy" property is set to 0.8, then only the first 3 eigenvector components will be used to project each {A,B,C,D,E}-tuple into the new space, which will be of dimension 3. The number 3 is chosen because 3 is the lowest N for which the sum of the first N eigenvalues divided by the sum of all eigenvalues is larger than the specified "Basis Energy" (i.e., (5+2+1.5)/10 = 0.85 > 0.8).
| 0
|
|
The value must be one of the following: Full basis (0), Fixed-size basis (1), Fixed-energy basis (2).
Accepts input of following types:
 
* vtkCompositeDataSet
 
|-
|-
| '''Basis Size'''<br>''(BasisSize)''
|'''SubTreeCompositeIndex''' (SubTreeCompositeIndex)
|
Select the index of the subtree to be appended. For now,
this property is internal.
|
|
The maximum number of eigenvector components to use when projecting into the new space.
0
 
| 2
|
|
The value must be greater than or equal to 1.


|-
|-
| '''Input'''<br>''(Input)''
|'''Merge Points''' (MergePoints)
|
|
The input to the filter. Arrays from this dataset will be used for computing statistics and/or assessed by a statistical model.


|
|
1
|
|
The selected object must be the result of the following: sources (includes readers), filters.
Accepts boolean values (0 or 1).


|}


The dataset must contain a point or cell array.
==Mesh Quality==


This filter creates a new cell array containing a geometric measure of
each cell's fitness. Different quality measures can be
chosen for different cell shapes. Supported shapes include
triangles, quadrilaterals, tetrahedra, and hexahedra. For
other shapes, a value of 0 is assigned.
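One of the simplest measures listed below, the edge ratio, divides the longest edge length by the shortest. A hypothetical helper (for illustration only, not VTK's implementation) for a single triangle:

```python
import math

# Hypothetical helper illustrating the triangle "Edge Ratio" measure:
# length of the longest edge divided by length of the shortest edge.
def edge_ratio(a, b, c):
    edges = [math.dist(a, b), math.dist(b, c), math.dist(c, a)]
    return max(edges) / min(edges)

# an equilateral triangle is ideal: edge ratio 1
ratio = edge_ratio((0.0, 0.0), (1.0, 0.0), (0.5, math.sqrt(3) / 2))
```

Values near 1 indicate well-shaped triangles; elongated "sliver" triangles produce large ratios.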


The selected dataset must be one of the following types (or a subclass of one of them): vtkImageData, vtkStructuredGrid, vtkPolyData, vtkUnstructuredGrid, vtkTable, vtkGraph.
{| class="PropertiesTable" border="1" cellpadding="5"
 
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|-
| '''Model Input'''<br>''(ModelInput)''
|'''Input''' (Input)
|
This property specifies the input to the Mesh Quality
filter.
|
|
A previously-calculated model with which to assess a separate dataset. This input is optional.


|
|
Accepts input of following types:
* vtkDataSet
|-
|'''TriangleQualityMeasure''' (TriangleQualityMeasure)
|
This property indicates which quality measure will be
used to evaluate triangle quality. The radius ratio is the size of a
circle circumscribed by a triangle's 3 vertices divided by the size of
a circle tangent to a triangle's 3 edges. The edge ratio is the ratio
of the longest edge length to the shortest edge length.
|
|
The selected object must be the result of the following: sources (includes readers), filters.
2
 
|
 
The value(s) is an enumeration of the following:
The selected dataset must be one of the following types (or a subclass of one of them): vtkTable, vtkMultiBlockDataSet.
* Area (28)
 
* Aspect Ratio (1)
 
* Aspect Frobenius (3)
* Condition (9)
* Distortion (15)
* Edge Ratio (0)
* Maximum Angle (8)
* Minimum Angle (6)
* Scaled Jacobian (10)
* Radius Ratio (2)
* Relative Size Squared (12)
* Shape (13)
* Shape and Size (14)
|-
|-
| '''Normalization Scheme'''<br>''(NormalizationScheme)''
|'''QuadQualityMeasure''' (QuadQualityMeasure)
|
|
Before the eigenvector decomposition of the covariance matrix takes place, you may normalize each (i,j) entry by sqrt( cov(i,i) * cov(j,j) ). This implies that the variance of each variable of interest should be of equal importance.
This property indicates which quality measure will be
 
used to evaluate quadrilateral quality.
| 2
|
0
|
|
The value must be one of the following: No normalization (0), Normalize using covariances (3).
The value(s) is an enumeration of the following:
 
* Area (28)
 
* Aspect Ratio (1)
* Condition (9)
* Distortion (15)
* Edge Ratio (0)
* Jacobian (25)
* Maximum Aspect Frobenius (5)
* Maximum Edge Ratio (16)
* Mean Aspect Frobenius (4)
* Minimum Angle (6)
* Oddy (23)
* Radius Ratio (2)
* Relative Size Squared (12)
* Scaled Jacobian (10)
* Shape (13)
* Shape and Size (14)
* Shear (11)
* Shear and Size (24)
* Skew (17)
* Stretch (20)
* Taper (18)
* Warpage (26)
|-
|-
| '''Variables of Interest'''<br>''(SelectArrays)''
|'''TetQualityMeasure''' (TetQualityMeasure)
|
|
Choose arrays whose entries will be used to form observations for statistical analysis.
This property indicates which quality measure will be
 
used to evaluate tetrahedral quality. The radius ratio is the size of a
sphere circumscribed by a tetrahedron's 4 vertices divided by the size
of a sphere tangent to a tetrahedron's 4 faces. The edge ratio is the
ratio of the longest edge length to the shortest edge length. The
collapse ratio is the minimum ratio of height of a vertex above the
triangle opposite it divided by the longest edge of the opposing
triangle across all vertex/triangle pairs.
|
|
2
|
|
An array of scalars is required.
The value(s) is an enumeration of the following:
 
* Edge Ratio (0)
 
* Aspect Beta (29)
* Aspect Gamma (27)
* Aspect Frobenius (3)
* Aspect Ratio (1)
* Collapse Ratio (7)
* Condition (9)
* Distortion (15)
* Jacobian (25)
* Minimum Dihedral Angle (6)
* Radius Ratio (2)
* Relative Size Squared (12)
* Scaled Jacobian (10)
* Shape (13)
* Shape and Size (14)
* Volume (19)
|-
|-
| '''Task'''<br>''(Task)''
|'''HexQualityMeasure''' (HexQualityMeasure)
|
Specify the task to be performed: modeling and/or assessment.
#  "Detailed model of input data," creates a set of output tables containing a calculated statistical model of the '''entire''' input dataset;
#  "Model a subset of the data," creates an output table (or tables) summarizing a '''randomly-chosen subset''' of the input dataset;
#  "Assess the data with a model," adds attributes to the first input dataset using a model provided on the second input port; and
#  "Model and assess the same data," is really just operations 2 and 3 above applied to the same input dataset.  The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.
 
When the task includes creating a model (i.e., tasks 2 and 4), you may adjust the fraction of the input dataset used for training. You should avoid using a large fraction of the input data for training as you will then not be able to detect overfitting. The ''Training fraction'' setting will be ignored for tasks 1 and 3.
 
| 3
|
|
The value must be one of the following: Detailed model of input data (0), Model a subset of the data (1), Assess the data with a model (2), Model and assess the same data (3).
This property indicates which quality measure will be
 
used to evaluate hexahedral quality.
 
|-
| '''Training Fraction'''<br>''(TrainingFraction)''
|
|
Specify the fraction of values from the input dataset to be used for model fitting. The exact set of values is chosen at random from the dataset.
5
 
| 0.1
|
|
The value must be greater than or equal to 0 and less than or equal to 1.
The value(s) is an enumeration of the following:
 
* Diagonal (21)
* Dimension (22)
* Distortion (15)
* Edge Ratio (0)
* Jacobian (25)
* Maximum Edge Ratio (16)
* Maximum Aspect Frobenius (5)
* Mean Aspect Frobenius (4)
* Oddy (23)
* Relative Size Squared (12)
* Scaled Jacobian (10)
* Shape (13)
* Shape and Size (14)
* Shear (11)
* Shear and Size (24)
* Skew (17)
* Stretch (20)
* Taper (18)
* Volume (19)


|}
|}


==MinMax==


==Probe Location==
Sample data attributes at the points in a point cloud.


The Probe filter samples the data set attributes of the current data set at the points in a point cloud. The Probe filter uses interpolation to determine the values at the selected point, whether or not it lies at an input point. The Probe filter operates on any type of data and produces polygonal output (a point cloud).<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 5,731: Line 7,328:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
Set the input to the Min Max filter.
|
|
This property specifies the dataset from which to obtain probe values.


|
|
|
Accepts input of following types:
The selected object must be the result of the following: sources (includes readers), filters.
* vtkDataSet
 
 
The dataset must contain a point or cell array.
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet, vtkCompositeDataSet.
 
 
|-
|-
| '''Probe Type'''<br>''(Source)''
|'''Operation''' (Operation)
|
|
This property specifies the dataset whose geometry will be used in determining positions to probe.
Select whether to perform a min, max, or sum operation
 
on the data.
|
|
MIN
|
|
The selected object must be the result of the following: sources (includes readers).
The value(s) can be one of the following:
 
* MIN
 
* MAX
The value must be set to one of the following: FixedRadiusPointSource.
* SUM
 


|}
|}
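The MIN/MAX/SUM operation selected above is an element-wise reduction across the per-piece arrays. A minimal sketch of the idea (a hypothetical `reduce_arrays` helper, not the filter's actual parallel implementation):

```python
# Hedged sketch of the MinMax reduction: combine per-piece arrays with
# an element-wise MIN, MAX, or SUM operation.
def reduce_arrays(arrays, operation):
    ops = {"MIN": min, "MAX": max, "SUM": sum}
    combine = ops[operation]
    return [combine(values) for values in zip(*arrays)]

pieces = [[1, 7, 3], [4, 2, 9]]
minimum = reduce_arrays(pieces, "MIN")
maximum = reduce_arrays(pieces, "MAX")
```

In parallel runs the real filter performs this reduction across processes rather than across in-memory lists.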


==Multicorrelative Statistics==


==Process Id Scalars==
Compute a statistical model of a dataset and/or assess the dataset with a statistical model.
 
This filter either computes a statistical model of a dataset or takes
 
such a model as its second input. Then, the model (however it is
This filter uses colors to show how data is partitioned across processes.
obtained) may optionally be used to assess the input dataset.&lt;p&gt;
 
This filter computes the covariance matrix for all the arrays you select
The Process Id Scalars filter assigns a unique scalar value to each piece of the input according to which processor it resides on. This filter operates on any type of data when ParaView is run in parallel. It is useful for determining whether your data is load-balanced across the processors being used. The output data set type is the same as that of the input.<br>
plus the mean of each array. The model is thus a multivariate Gaussian
distribution with the mean vector and variances provided. Data is
assessed using this model by computing the Mahalanobis distance for each
input point. This distance will always be positive.

The learned model output format is rather dense and can be confusing, so it is
discussed here. The first filter output is a multiblock dataset
consisting of 2 tables:
# Raw covariance data.
# Covariance matrix and its Cholesky decomposition.
The raw covariance table has 3 meaningful columns: 2 titled "Column1" and
"Column2" whose entries generally refer to the N arrays you selected when
preparing the filter and 1 column titled "Entries" that contains numeric
values. The first row will always contain the number of observations in
the statistical analysis. The next N rows contain the mean for each of
the N arrays you selected. The remaining rows contain covariances of
pairs of arrays.

The second table (covariance matrix and
Cholesky decomposition) contains information derived from the raw
covariance data of the first table. The first N rows of the first column
contain the name of one array you selected for analysis. These rows are
followed by a single entry labeled "Cholesky" for a total of N+1 rows.
The second column, Mean, contains the mean of each variable in the first N
entries and the number of observations processed in the final (N+1)
row.

The remaining columns (there are N, one for each array)
contain 2 matrices in triangular format. The upper right triangle
contains the covariance matrix (which is symmetric, so its lower triangle
may be inferred). The lower left triangle contains the Cholesky
decomposition of the covariance matrix (which is triangular, so its upper
triangle is zero). Because the diagonal must be stored for both matrices,
an additional row is required, hence the N+1 rows and
the final entry of the column named "Column".
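The assessment step computes a Mahalanobis distance per input point. For two variables this reduces to a small closed form; the sketch below (a hypothetical `mahalanobis` helper, not ParaView's N-variable code) illustrates the computation from a mean vector and a 2x2 covariance matrix:

```python
import math

# Hypothetical 2-variable sketch of the assessment step: the Mahalanobis
# distance of a point from the model's mean, given the covariance matrix
# [[sxx, sxy], [sxy, syy]].
def mahalanobis(point, mean, cov):
    dx, dy = point[0] - mean[0], point[1] - mean[1]
    sxx, sxy, syy = cov[0][0], cov[0][1], cov[1][1]
    det = sxx * syy - sxy * sxy
    # quadratic form d^T * inverse(cov) * d, expanded for the 2x2 case
    q = (syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det
    return math.sqrt(q)

# with an identity covariance this reduces to the Euclidean distance
d = mahalanobis((3.0, 4.0), (0.0, 0.0), [[1.0, 0.0], [0.0, 1.0]])
```

Larger variances shrink the distance along the corresponding axis, which is how the model down-weights directions in which the data naturally spreads.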


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 5,776: Line 7,396:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
The input to the filter. Arrays from this dataset will
be used for computing statistics and/or assessed by a statistical
model.
|
|
This property specifies the input to the Process Id Scalars filter.


|
|
|
Accepts input of following types:
The selected object must be the result of the following: sources (includes readers), filters.
* vtkImageData
 
* vtkStructuredGrid
 
* vtkPolyData
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
* vtkUnstructuredGrid
 
* vtkTable
* vtkGraph
The dataset must contain a field array ()


|-
|-
| '''Random Mode'''<br>''(RandomMode)''
|'''ModelInput''' (ModelInput)
|
A previously-calculated model with which to assess a
separate dataset. This input is optional.
|
|
The value of this property determines whether to use random id values for the various pieces. If set to 1, the unique value per piece will be chosen at random; otherwise the unique value will match the id of the process.


| 0
|
|
Only the values 0 and 1 are accepted.
Accepts input of following types:
 
* vtkTable
 
* vtkMultiBlockDataSet
|}
 
 
==Programmable Filter==
 
 
Executes a user-supplied Python script on its input dataset to produce an output dataset.
 
This filter will execute a python script to produce an output dataset.<br>
The filter keeps a copy of the python script in Script, and creates <br>
Interpretor, a python interpreter to run the script upon the first <br>
execution.<br>
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Property'''
|'''AttributeMode''' (AttributeMode)
| '''Description'''
|
| '''Default Value(s)'''
Specify which type of field data the arrays will be
| '''Restrictions'''
drawn from.
|
0
|
The value must be field array name.
|-
|-
| '''Copy Arrays'''<br>''(CopyArrays)''
|'''Variables of Interest''' (SelectArrays)
|
Choose arrays whose entries will be used to form
observations for statistical analysis.
|
|
If this property is set to true, all the cell and point arrays from
the first input are copied to the output.


| 0
|
|
Only the values 0 and 1 are accepted.


|-
|-
| '''RequestInformation Script'''<br>''(InformationScript)''
|'''Task''' (Task)
|
|
This property is a python script that is executed during the RequestInformation pipeline pass. Use this to provide information such as WHOLE_EXTENT to the pipeline downstream.
Specify the task to be performed: modeling and/or
 
assessment. &lt;ol&gt; &lt;li&gt; "Detailed model of input data,"
creates a set of output tables containing a calculated statistical
model of the &lt;b&gt;entire&lt;/b&gt; input dataset;&lt;/li&gt;
&lt;li&gt; "Model a subset of the data," creates an output table (or
tables) summarizing a &lt;b&gt;randomly-chosen subset&lt;/b&gt; of the
input dataset;&lt;/li&gt; &lt;li&gt; "Assess the data with a model,"
adds attributes to the first input dataset using a model provided on
the second input port; and&lt;/li&gt; &lt;li&gt; "Model and assess the
same data," is really just operations 2 and 3 above applied to the same
input dataset. The model is first trained using a fraction of the input
data and then the entire dataset is assessed using that
model.&lt;/li&gt; &lt;/ol&gt; When the task includes creating a model
(i.e., tasks 2, and 4), you may adjust the fraction of the input
dataset used for training. You should avoid using a large fraction of
the input data for training as you will then not be able to detect
overfitting. The &lt;i&gt;Training fraction&lt;/i&gt; setting will be
ignored for tasks 1 and 3.
|
|
3
|
|
The value(s) is an enumeration of the following:
* Detailed model of input data (0)
* Model a subset of the data (1)
* Assess the data with a model (2)
* Model and assess the same data (3)
|-
|-
| '''Input'''<br>''(Input)''
|'''TrainingFraction''' (TrainingFraction)
|
|
This property specifies the input(s) to the programmable filter.
Specify the fraction of values from the input dataset to
 
be used for model fitting. The exact set of values is chosen at random
from the dataset.
|
|
0.1
|
|
The selected object must be the result of the following: sources (includes readers), filters.




The selected dataset must be one of the following types (or a subclass of one of them): vtkDataObject.
|}
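The "Training fraction" behavior described above amounts to fitting the model on a randomly chosen subset and assessing the full dataset against it. A minimal sketch of the subset selection (a hypothetical `split_training` helper; the real filter samples per observation internally):

```python
import random

# Hedged sketch of the "Training fraction" idea: draw a random subset of
# the input values for model fitting. Seeded here for reproducibility.
def split_training(values, fraction, seed=0):
    rng = random.Random(seed)
    count = max(1, int(len(values) * fraction))
    return rng.sample(values, count)

data = list(range(100))
training = split_training(data, 0.1)
```

Keeping the training fraction small, as the documentation advises, leaves most of the data unseen by the model so that overfitting shows up during assessment.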
 
==Normal Glyphs==


Filter computing surface normals.Filter
computing surface normals.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Output Data Set Type'''<br>''(OutputDataSetType)''
| '''Property'''
|
| '''Description'''
The value of this property determines the dataset type for the output of the programmable filter.
| '''Default Value(s)'''
| '''Restrictions'''
 
 
|}


| 8
==Octree Depth Limit==
|
The value must be one of the following: Same as Input (8), vtkPolyData (0), vtkStructuredGrid (2), vtkRectilinearGrid (3), vtkUnstructuredGrid (4), vtkImageData (6), vtkUniformGrid (10), vtkMultiblockDataSet (13), vtkHierarchicalBoxDataSet (15), vtkTable (19).


This filter takes in an octree and produces a new octree which is no deeper than the maximum specified depth level. The Octree
Depth Limit filter takes in an octree and produces a new
octree that is nowhere deeper than the maximum specified
depth level. The attribute data of pruned leaf cells is
integrated into their ancestors at the cut
level.
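The prune-and-integrate behavior can be sketched on a generic tree: leaves below the cut level are removed and their values are summed into the ancestor at the cut. This is an illustrative sketch with a hypothetical nested-dict tree, not the vtkHyperOctree data structure:

```python
# Hedged sketch of depth limiting: prune a tree to a maximum depth,
# integrating (summing) the values of pruned leaves into the ancestor
# node sitting at the cut level.
def total(node):
    if "children" not in node:
        return node["value"]
    return sum(total(c) for c in node["children"])

def prune(node, max_depth, depth=0):
    if "children" not in node:
        return dict(node)
    if depth == max_depth:
        # cut here: this ancestor absorbs the integrated leaf data
        return {"value": total(node)}
    return {"children": [prune(c, max_depth, depth + 1) for c in node["children"]]}

tree = {"children": [{"value": 1}, {"children": [{"value": 2}, {"value": 3}]}]}
pruned = prune(tree, 1)
```

After pruning to depth 1, the subtree holding values 2 and 3 collapses into a single node carrying their sum, so the integrated quantity over the whole tree is unchanged.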


{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Python Path'''<br>''(PythonPath)''
| '''Property'''
|
| '''Description'''
A semi-colon (;) separated list of directories to add to the python library
| '''Default Value(s)'''
search path.
| '''Restrictions'''


|-
|'''Input''' (Input)
|
|
This property specifies the input to the Octree Depth
Limit filter.
|
|
|-
| '''Script'''<br>''(Script)''
|
This property contains the text of a python program that the programmable filter runs.


|
|
|
Accepts input of following types:
* vtkHyperOctree
|-
|-
| '''RequestUpdateExtent Script'''<br>''(UpdateExtentScript)''
|'''MaximumLevel''' (MaximumLevel)
|
|
This property is a python script that is executed during the RequestUpdateExtent pipeline pass. Use this to modify the update extent that your filter asks upstream for.
The value of this property specifies the maximum depth
 
of the output octree.
|
|
4
|
|
|}




==Python Calculator==
|}


==Octree Depth Scalars==


This filter evaluates a Python expression
This filter adds a scalar to each leaf of the octree that represents the leaf's depth within the tree. The
 
vtkHyperOctreeDepth filter adds a scalar to each leaf of
This filter uses Python to calculate an expression.<br>
the octree that represents the leaf's depth within the
It depends heavily on the numpy and paraview.vtk modules.<br>
tree.
To use the parallel functions, mpi4py is also necessary. The expression<br>
is evaluated and the resulting scalar value or numpy array is added<br>
to the output as an array. See numpy and paraview.vtk documentation<br>
for the list of available functions.<br><br><br>
This filter tries to make it easy for the user to write expressions<br>
by defining certain variables. The filter tries to assign each array<br>
to a variable of the same name. If the name of the array is not a <br>
valid Python variable, it has to be accessed through a dictionary called<br>
arrays (i.e. arrays['array_name']). The points can be accessed using the<br>
points variable.       <br>
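The variable-binding behavior described above (each array becomes a name the expression can use) can be sketched in pure Python. This is an illustrative `evaluate` helper only; the actual filter evaluates expressions over whole numpy arrays at once rather than element by element:

```python
# Hedged sketch of calculator-style evaluation: bind each input array
# to a variable of the same name and evaluate the expression
# element-wise over the zipped values.
def evaluate(expression, arrays):
    names = list(arrays)
    return [eval(expression, dict(zip(names, vals)))
            for vals in zip(*arrays.values())]

result = evaluate("2*a + b", {"a": [1, 2, 3], "b": [10, 20, 30]})
```

As in the filter, an array whose name is not a valid Python identifier would instead need dictionary-style access (the filter exposes this as `arrays['array_name']`).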


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 5,908: Line 7,552:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Array Association'''<br>''(ArrayAssociation)''
|'''Input''' (Input)
|
This property specifies the input to the Octree Depth
Scalars filter.
|
|
This property controls the association of the output array as well as
which arrays are defined as variables.


| 0
|
|
The value must be one of the following: Point Data (0), Cell Data (1).
Accepts input of following types:
* vtkHyperOctree


|}


==OrderedCompositeDistributor==
{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Array Name'''<br>''(ArrayName)''
| '''Property'''
|
| '''Description'''
The name of the output array.
| '''Default Value(s)'''
| '''Restrictions'''


| result
|-
|'''Input''' (Input)
|
|
|-
Set the input to the Ordered Composite Distributor
| '''Copy Arrays'''<br>''(CopyArrays)''
filter.
|
|
If this property is set to true, all the cell and point arrays from
the first input are copied to the output.


| 1
|
|
Only the values 0 and 1 are accepted.


|-
|-
| '''Expression'''<br>''(Expression)''
|'''PassThrough''' (PassThrough)
|
|
The Python expression evaluated during execution.
Toggle whether to pass the data through without
 
compositing.
|
|
0
|
|
Accepts boolean values (0 or 1).
|-
|-
| '''Input'''<br>''(Input)''
|'''PKdTree''' (PKdTree)
|
Set the vtkPKdTree to distribute with.
|
 
|
|
Set the input of the filter.


|-
|'''OutputType''' (OutputType)
|
|
When not empty, the output will be converted to the
given type.
|
|
The selected object must be the result of the following: sources (includes readers), filters.


 
|
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.




|}
|}


==Outline==


==Quadric Clustering==
This filter generates a bounding box representation of the input. The Outline filter
 
generates an axis-aligned bounding box for the input
 
dataset. This filter operates on any type of dataset and
This filter is the same filter used to generate level of detail for ParaView.  It uses a structured grid of bins and merges all points contained in each bin.
produces polygonal output.
 
The Quadric Clustering filter produces a reduced-resolution polygonal approximation of the input polygonal dataset. This filter is the one used by ParaView for computing LODs. It uses spatial binning to reduce the number of points in the data set; points that lie within the same spatial bin are collapsed into one representative point.<br>
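The spatial-binning idea behind Quadric Clustering can be sketched in 2D: points falling in the same bin of a structured grid collapse to one representative point. The sketch below uses a simple average as the representative (the real filter chooses the point minimizing a quadric error) and a hypothetical `cluster` helper:

```python
# Hedged 2D sketch of Quadric Clustering's binning step: merge all
# points that land in the same bin of a structured grid, using the
# bin average as the representative point.
def cluster(points, bounds, divisions):
    (xmin, xmax), (ymin, ymax) = bounds
    bins = {}
    for x, y in points:
        i = min(int((x - xmin) / (xmax - xmin) * divisions), divisions - 1)
        j = min(int((y - ymin) / (ymax - ymin) * divisions), divisions - 1)
        bins.setdefault((i, j), []).append((x, y))
    return [(sum(p[0] for p in pts) / len(pts), sum(p[1] for p in pts) / len(pts))
            for pts in bins.values()]

points = [(0.1, 0.1), (0.2, 0.2), (0.9, 0.9)]
reduced = cluster(points, ((0.0, 1.0), (0.0, 1.0)), 2)
```

With a 2x2 grid the two nearby points merge into one representative while the far point survives on its own, reducing three points to two.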


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 5,973: Line 7,628:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Copy Cell Data'''<br>''(CopyCellData)''
|'''Input''' (Input)
|
|
If this property is set to 1, the cell data from the input will be copied to the output.
This property specifies the input to the Outline
 
filter.
| 1
|
|
Only the values 0 and 1 are accepted.


|-
| '''Input'''<br>''(Input)''
|
|
This property specifies the input to the Quadric Clustering filter.
Accepts input of following types:
* vtkDataSet


|
|}
|
The selected object must be the result of the following: sources (includes readers), filters.


==Outline Corners==


The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.
This filter generates a bounding box representation of the input. It only displays the corners of the bounding box. The
Outline Corners filter generates the corners of a bounding
box for the input dataset. This filter operates on any
type of dataset and produces polygonal
output.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|-
| '''Number of Dimensions'''<br>''(NumberOfDivisions)''
|'''Input''' (Input)
|
This property specifies the input to the Outline Corners
filter.
|
|
This property specifies the number of bins along the X, Y, and Z axes of the data set.


| 50 50 50
|
|
Accepts input of following types:
* vtkDataSet
|-
|-
| '''Use Feature Edges'''<br>''(UseFeatureEdges)''
|'''CornerFactor''' (CornerFactor)
|
The value of this property sets the size of the corners
as a percentage of the length of the corresponding bounding box
edge.
|
|
If this property is set to 1, feature edge quadrics will be used to maintain the boundary edges along processor divisions.
0.2
 
| 0
|
|
Only the values 0 and 1 are accepted.




|-
|}
| '''Use Feature Points'''<br>''(UseFeaturePoints)''
|
If this property is set to 1, feature point quadrics will be used to maintain the boundary points along processor divisions.


| 0
==Outline Curvilinear DataSet==
|
Only the values 0 and 1 are accepted.


This filter generates an outline representation of the input. The Outline filter
generates an outline of the outside edges of the input
dataset, rather than the dataset's bounding box. This
filter operates on structured grid datasets and produces
polygonal output.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Use Input Points'''<br>''(UseInputPoints)''
| '''Property'''
|
| '''Description'''
If the value of this property is set to 1, the representative point for each bin is selected from one of the input points that lies in that bin; the input point that produces the least error is chosen. If the value of this property is 0, the location of the representative point is calculated to produce the least error possible for that bin, but the point will most likely not be one of the input points.
| '''Default Value(s)'''
| '''Restrictions'''


| 1
|-
|'''Input''' (Input)
|
|
Only the values 0 and 1 are accepted.
This property specifies the input to the outline
 
(curvilinear) filter.
 
|-
| '''Use Internal Triangles'''<br>''(UseInternalTriangles)''
|
|
If this property is set to 1, triangles completely contained in a spatial bin will be included in the computation of the bin's quadrics. When this property is set to 0, the filter operates faster, but the resulting surface may not be as well-behaved.


| 0
|
|
Only the values 0 and 1 are accepted.
Accepts input of following types:
 
* vtkStructuredGrid


|}
|}


==Outline Generic DataSet==


==Random Vectors==
This filter generates a bounding box representation of the input. The Generic Outline
 
filter generates an axis-aligned bounding box for the
 
input data set. The Input menu specifies the data set for
This filter creates a new 3-component point data array and sets it as the default vector array. It uses a random number generator to create values.
which to create a bounding box. This filter operates on
 
generic data sets and produces polygonal
The Random Vectors filter generates a point-centered array of random vectors. It uses a random number generator to determine the components of the vectors. This filter operates on any type of data set, and the output data set will be of the same type as the input.<br>
output.


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 6,059: Line 7,723:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Random Vectors filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.


|-
|-
| '''Maximum Speed'''<br>''(MaximumSpeed)''
|'''Input''' (Input)
|
|
This property specifies the maximum length of the random point vectors generated.
Set the input to the Generic Outline
 
filter.
| 1
|
|
The value must be greater than or equal to 0.


|-
| '''Minimum Speed'''<br>''(MinimumSpeed)''
|
|
This property specifies the minimum length of the random point vectors generated.
Accepts input of following types:
 
* vtkGenericDataSet
| 0
|
The value must be greater than or equal to 0.
 


|}
|}
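The Minimum Speed and Maximum Speed properties above bound the length of each generated vector. A minimal pure-Python sketch of that behavior (the helper name and plain-list layout are illustrative, not the VTK implementation):

```python
import random
import math

def random_vectors(num_points, min_speed=0.0, max_speed=1.0, seed=0):
    """Generate one random 3-component vector per point, with each
    vector's length (speed) drawn between min_speed and max_speed."""
    rng = random.Random(seed)
    vectors = []
    for _ in range(num_points):
        # Pick a random direction by normalizing a random vector.
        v = [rng.uniform(-1.0, 1.0) for _ in range(3)]
        norm = math.sqrt(sum(c * c for c in v)) or 1.0
        speed = rng.uniform(min_speed, max_speed)
        vectors.append([c / norm * speed for c in v])
    return vectors

vecs = random_vectors(5, min_speed=0.5, max_speed=2.0)
```

Every vector in `vecs` has a length between the requested minimum and maximum speeds.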


==ParticlePath==


==Rectilinear Grid Connectivity==
Trace Particle Paths through time in a vector field.
 
The Particle Trace filter generates pathlines in a vector
 
field from a collection of seed points. The vector field
Parallel fragments extraction and attributes integration on rectilinear grids.
used is selected from the Vectors menu, so the input data
 
set is required to have point-centered vectors. The Seed
Extracts material fragments from multi-block vtkRectilinearGrid datasets<br>
portion of the interface allows you to select whether the
based on the selected volume fraction array(s) and a fraction isovalue and<br>
seed points for this integration lie in a point cloud or
integrates the associated attributes.<br>
along a line. Depending on which is selected, the
appropriate 3D widget (point or line widget) is displayed
along with traditional user interface controls for
positioning the point cloud or line within the data set.
Instructions for using the 3D widgets and the
corresponding manual controls can be found in section 7.4.
This filter operates on any type of data set, provided it
has point-centered vectors. The output is polygonal data
containing polylines. This filter is available on the
Toolbar.


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Double Volume Arrays'''<br>''(AddDoubleVolumeArrayName)''
|'''Input''' (Input)
|
Specify the input of the StreamTracer filter.
|
|
This property specifies the name(s) of the volume fraction array(s) for generating parts.


|
|
|
Accepts input of following types:
An array of scalars is required.
* vtkDataObject
The dataset must contain a field array (point)


with 3 component(s).


|-
|-
| '''Float Volume Arrays'''<br>''(AddFloatVolumeArrayName)''
|'''Seed Source''' (Source)
|
Specify the seed dataset. Typically, this is where the
vector field integration begins, usually a point/radius or a line
with a given resolution.
|
|
This property specifies the name(s) of the volume fraction array(s) for generating parts.


|
|
Accepts input of following types:
* vtkDataSet
|-
|'''TerminationTime''' (TerminationTime)
|
Setting TerminationTime to a positive value will cause
particles to terminate when the time is reached. The units of time
should be consistent with the primary time variable.
|
0.0
|
|
An array of scalars is required.


|-
|-
| '''Unsigned Character Volume Arrays'''<br>''(AddUnsignedCharVolumeArrayName)''
|'''TimestepValues''' (TimestepValues)
|
|
This property specifies the name(s) of the volume fraction array(s) for generating parts.


|
|
|
|
An array of scalars is required.


|-
|-
| '''Input'''<br>''(Input)''
|'''ForceReinjectionEveryNSteps''' (ForceReinjectionEveryNSteps)
| This property specifies the input of the filter.
|
|
When animating particles, it is nice to inject new ones
every Nth step to produce a continuous flow. Setting
ForceReinjectionEveryNSteps to a non-zero value will cause the particle
source to reinject particles every Nth step even if it is otherwise
unchanged. Note that if the particle source is also animated, this flag
is redundant, as the particles will be reinjected whenever the
source changes anyway.
|
0
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The dataset must contain a cell array with 1 component.
The selected dataset must be one of the following types (or a subclass of one of them): vtkRectilinearGrid, vtkCompositeDataSet.


|-
|-
| '''Volume Fraction Value'''<br>''(VolumeFractionSurfaceValue)''
|'''StaticSeeds''' (StaticSeeds)
|
|
The value of this property is the volume fraction value for the surface.
If the input seeds are not changing, then this
can be set to 1 to avoid having to do a repeated grid search
that would return the exact same result.


| 0.1
|
|
The value must be greater than or equal to 0 and less than or equal to 1.
0
 
|
 
Accepts boolean values (0 or 1).
|}
 
 
==Reflect==
 
 
This filter takes the union of the input and its reflection over an axis-aligned plane.
 
The Reflect filter reflects the input dataset across the specified plane. This filter operates on any type of data set and produces an unstructured grid output.<br>
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Property'''
|'''StaticMesh''' (StaticMesh)
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Center'''<br>''(Center)''
|
|
If the value of the Plane property is X, Y, or Z, then the value of this property specifies the center of the reflection plane.
If the input grid is not changing, then this
can be set to 1 to avoid having to create cell locators for
each update.


| 0
|
|
0
|
Accepts boolean values (0 or 1).
|-
|-
| '''Copy Input'''<br>''(CopyInput)''
|'''SelectInputVectors''' (SelectInputVectors)
|
Specify which vector array should be used for the
integration.
|
|
If this property is set to 1, the output will contain the union of the input dataset and its reflection. Otherwise the output will contain only the reflection of the input data.


| 1
|
|
Only the values 0 and 1 are accepted.
An array of vectors is required.
 
 
|-
|-
| '''Input'''<br>''(Input)''
|'''ComputeVorticity''' (ComputeVorticity)
|
|
This property specifies the input to the Reflect filter.
Compute vorticity and angular rotation of particles as
 
they progress
|
|
1
|
|
The selected object must be the result of the following: sources (includes readers), filters.
Accepts boolean values (0 or 1).
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
 
 
|-
|-
| '''Plane'''<br>''(Plane)''
|'''Terminal Speed''' (TerminalSpeed)
|
|
The value of this property determines which plane to reflect across. If the value is X, Y, or Z, the value of the Center property determines where the plane is placed along the specified axis. The other six options (X Min, X Max, etc.) place the reflection plane at the specified face of the bounding box of the input dataset.
This property specifies the terminal speed, below which
 
particle advection/integration is terminated.
| 0
|
0.000000000001
|
|
The value must be one of the following: X Min (0), Y Min (1), Z Min (2), X Max (3), Y Max (4), Z Max (5), X (6), Y (7), Z (8).




|}
|}
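The Reflect behavior described above (mirror the points across an axis-aligned plane at Center, optionally unioned with the input via Copy Input) can be sketched on bare point lists; the helper name is hypothetical and stands in for the VTK implementation:

```python
def reflect_points(points, axis=0, center=0.0, copy_input=True):
    """Reflect points across the axis-aligned plane {coordinate[axis] == center}.
    With copy_input=True the output is the union of the input and its
    reflection, matching the CopyInput property."""
    reflected = []
    for p in points:
        q = list(p)
        q[axis] = 2.0 * center - q[axis]
        reflected.append(q)
    return (list(points) + reflected) if copy_input else reflected

# The mirror of (1, 2, 3) across the plane x = 0 is (-1, 2, 3).
out = reflect_points([[1.0, 2.0, 3.0]], axis=0, center=0.0)
```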


==ParticleTracer==


==Resample With Dataset==
Trace Particles through time in a vector field.
 
The Particle Trace filter generates pathlines in a vector
 
field from a collection of seed points. The vector field
Sample data attributes at the points of a dataset.
used is selected from the Vectors menu, so the input data
 
set is required to have point-centered vectors. The Seed
Probe is a filter that computes point attributes at specified point positions. The filter has two inputs: the Input and Source. The Input geometric structure is passed through the filter. The point attributes are computed at the Input point positions by interpolating into the source data. For example, we can compute data values on a plane (plane specified as Input) from a volume (Source). The cell data of the source is copied to the output based on which source cell contains each input point. If an array of the same name exists in both the source's point and cell data, only the one from the point data is probed.<br>
portion of the interface allows you to select whether the
seed points for this integration lie in a point cloud or
along a line. Depending on which is selected, the
appropriate 3D widget (point or line widget) is displayed
along with traditional user interface controls for
positioning the point cloud or line within the data set.
Instructions for using the 3D widgets and the
corresponding manual controls can be found in section 7.4.
This filter operates on any type of data set, provided it
has point-centered vectors. The output is polygonal data
containing polylines. This filter is available on the
Toolbar.


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
Specify the input of the StreamTracer filter.
|
|
This property specifies the dataset from which to obtain probe values.


|
|
|
Accepts input of following types:
The selected object must be the result of the following: sources (includes readers), filters.
* vtkDataObject
 
The dataset must contain a field array (point)
 
The dataset must contain a point or cell array.
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet, vtkCompositeDataSet.


with 3 component(s).


|-
|-
| '''Source'''<br>''(Source)''
|'''Seed Source''' (Source)
|
Specify the seed dataset. Typically, this is where the
vector field integration begins, usually a point/radius or a line
with a given resolution.
|
|
This property specifies the dataset whose geometry will be used in determining positions to probe.


|
|
Accepts input of following types:
* vtkDataSet
|-
|'''StaticSeeds''' (StaticSeeds)
|
|
The selected object must be the result of the following: sources (includes readers), filters.
If the input seeds are not changing, then this
can be set to 1 to avoid having to do a repeated grid search
that would return the exact same result.


|
0
|
Accepts boolean values (0 or 1).
|-
|'''StaticMesh''' (StaticMesh)
|
If the input grid is not changing, then this
can be set to 1 to avoid having to create cell locators for
each update.


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''TimestepValues''' (TimestepValues)
|


|


|}
|


|-
|'''ForceReinjectionEveryNSteps''' (ForceReinjectionEveryNSteps)
|
When animating particles, it is nice to inject new ones
every Nth step to produce a continuous flow. Setting
ForceReinjectionEveryNSteps to a non-zero value will cause the particle
source to reinject particles every Nth step even if it is otherwise
unchanged. Note that if the particle source is also animated, this flag
is redundant, as the particles will be reinjected whenever the
source changes anyway.
|
0
|


==Ribbon==
|-
|'''SelectInputVectors''' (SelectInputVectors)
|
Specify which vector array should be used for the
integration.
|


|
An array of vectors is required.
|-
|'''ComputeVorticity''' (ComputeVorticity)
|
Compute vorticity and angular rotation of particles as
they progress
|
1
|
Accepts boolean values (0 or 1).


This filter generates a ribbon surface from lines. It is useful for displaying streamlines.
|}


The Ribbon filter creates ribbons from the lines in the input data set. This filter is useful for visualizing streamlines. Both the input and output of this filter are polygonal data. The input data set must also have at least one point-centered vector array.<br>
==Pass Arrays==
 
Pass specified point and cell data arrays.
The Pass Arrays filter makes a shallow copy of the input
data object as the output, but passes through only the
arrays specified.
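The selection performed by Pass Arrays can be sketched in plain Python; the dict layout and helper name below are illustrative stand-ins for a VTK data object, not the real API:

```python
def pass_arrays(dataset, point_arrays=(), cell_arrays=()):
    """Keep only the named point/cell arrays, sharing (not copying) the
    underlying array objects -- a shallow copy, as Pass Arrays does."""
    return {
        "points": {name: arr for name, arr in dataset["points"].items()
                   if name in point_arrays},
        "cells": {name: arr for name, arr in dataset["cells"].items()
                  if name in cell_arrays},
    }

data = {"points": {"Temp": [1, 2], "Pressure": [3, 4]}, "cells": {"Id": [0]}}
out = pass_arrays(data, point_arrays=("Temp",))
# out keeps only the "Temp" point array; no cell arrays were requested.
```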


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
| '''Angle'''<br>''(Angle)''
|
The value of this property specifies the offset angle (in degrees) of the ribbon from the line normal.
| 0
|
The value must be greater than or equal to 0 and less than or equal to 360.


|-
|-
| '''Default Normal'''<br>''(DefaultNormal)''
|'''Input''' (Input)
|
|
The value of this property specifies the normal to use when the UseDefaultNormal property is set to 1 or the input contains no vector array (SelectInputVectors property).


| 0 0 1
|
|
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Ribbon filter.


|
|
|
Accepts input of following types:
The selected object must be the result of the following: sources (includes readers), filters.
* vtkDataObject
 
The dataset must contain a field array (cell)


The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.
The dataset must contain a field array (point)


The dataset must contain a field array (field)


|-
|-
| '''Scalars'''<br>''(SelectInputScalars)''
|'''PointDataArrays''' (PointDataArrays)
|
Add a point array by name to be passed.
|
|
The value of this property indicates the name of the input scalar array used by this filter. The width of the ribbons will be varied based on the values in the specified array if the value of the Width property is 1.


|
|
|
An array of scalars is required.


|-
|-
| '''Vectors'''<br>''(SelectInputVectors)''
|'''CellDataArrays''' (CellDataArrays)
|
Add a cell array by name to be passed.
|
|
The value of this property indicates the name of the input vector array used by this filter. If the UseDefaultNormal property is set to 0, the normal vectors for the ribbons come from the specified vector array.


| 1
|
|
An array of vectors is required.


|-
|-
| '''Use Default Normal'''<br>''(UseDefaultNormal)''
|'''FieldDataArrays''' (FieldDataArrays)
|
Add a field array by name to be passed.
|
|
If this property is set to 0, and the input contains no vector array, then default ribbon normals will be generated (DefaultNormal property); if a vector array has been set (SelectInputVectors property), the ribbon normals will be set from the specified array. If this property is set to 1, the default normal (DefaultNormal property) will be used, regardless of whether the SelectInputVectors property has been set.


| 0
|
|
Only the values 0 and 1 are accepted.


|-
|-
| '''Vary Width'''<br>''(VaryWidth)''
|'''UseFieldTypes''' (UseFieldTypes)
|
This hidden property must always be set to 1 for this
proxy to work.
|
|
If this property is set to 1, the ribbon width will be scaled according to the scalar array specified in the SelectInputScalars property.
1
Toggle the variation of ribbon width with scalar value.
 
| 0
|
|
Only the values 0 and 1 are accepted.
Accepts boolean values (0 or 1).
 
 
|-
|-
| '''Width'''<br>''(Width)''
|'''SelectedFieldTypes''' (SelectedFieldTypes)
|
|
If the VaryWidth property is set to 1, the value of this property is the minimum ribbon width. If the VaryWidth property is set to 0, the value of this property is half the width of the ribbon.
This hidden property must always be set to 0 for this
 
proxy to work.
| 1
|
0 1 2
|
|
The value must be less than the largest dimension of the dataset multiplied by a scale factor of 0.01.
Accepts boolean values (0 or 1).
 


|}
|}


 
==Pass Through==
==Rotational Extrusion==




This filter generates a swept surface while translating the input along a circular path.
A simple pass-through filter that doesn't transform data in any way.


The Rotational Extrusion filter forms a surface by rotating the input about the Z axis. This filter is intended to operate on 2D polygonal data. It produces polygonal output.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
| '''Angle'''<br>''(Angle)''
|
This property specifies the angle of rotation in degrees. The surface is swept from 0 to the value of this property.


| 360
|
|-
|-
| '''Capping'''<br>''(Capping)''
|'''Input''' (Input)
|
|
If this property is set to 1, the open ends of the swept surface will be capped with a copy of the input dataset. This works properly if the input is a 2D surface composed of filled polygons. If the input dataset is a closed solid (e.g., a sphere), then either two copies of the dataset will be drawn or no surface will be drawn. No surface is drawn if either this property is set to 0 or if the two surfaces would occupy exactly the same 3D space (i.e., the Angle property's value is a multiple of 360, and the values of the Translation and DeltaRadius properties are 0).
This property specifies the input to the filter.
 
| 1
|
|
Only the values 0 and 1 are accepted.


|-
| '''Delta Radius'''<br>''(DeltaRadius)''
|
|
The value of this property specifies the change in radius during the sweep process.
Accepts input of following types:
* vtkDataSet


| 0
|}
|
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Rotational Extrusion filter.


|
==Plot Data==
|
The selected object must be the result of the following: sources (includes readers), filters.


Plot data arrays from the input. This filter prepares arbitrary data to be plotted in any of the plots. By default the
data is shown in an XY line plot.


The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.
{| class="PropertiesTable" border="1" cellpadding="5"
 
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|-
| '''Resolution'''<br>''(Resolution)''
|'''Input''' (Input)
|
|
The value of this property controls the number of intermediate node points used in performing the sweep (rotating from 0 degrees to the value specified by the Angle property).
The input.
 
| 12
|
|
The value must be greater than or equal to 1.


|-
| '''Translation'''<br>''(Translation)''
|
|
The value of this property specifies the total amount of translation along the Z axis during the sweep process. Specifying a non-zero value for this property allows you to create a corkscrew (value of DeltaRadius > 0) or spring effect.
Accepts input of following types:
* vtkDataObject


| 0
|
|}
|}


==Plot Global Variables Over Time==


==Scatter Plot==
Extracts and plots data in field data over time.
 
This filter extracts the variables that reside in a
 
dataset's field data and are defined over time. The output
Creates a scatter plot from a dataset.
is a 1D rectilinear grid where the x coordinates
 
correspond to time (the same array is also copied to a
This filter creates a scatter plot from a dataset.<br>
point array named Time or TimeData (if Time exists in the
input)).


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
The input from which the selection is
extracted.
|
|
This property specifies the input to the filter.


|
|
|
Accepts input of following types:
The selected object must be the result of the following: sources (includes readers), filters.
* vtkDataSet
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
 


|}
|}


==Plot On Intersection Curves==


==Shrink==
Extracts the edges in a 2D plane and plots them
 
Extracts the surface, intersect it with a 2D plane. Plot
 
the resulting polylines.
This filter shrinks each input cell so that the cells pull away from their neighbors.
 
The Shrink filter causes the individual cells of a dataset to break apart from each other by moving each cell's points toward the centroid of the cell. (The centroid of a cell is the average position of its points.) This filter operates on any type of dataset and produces unstructured grid output.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Shrink filter.


|
|
The selected object must be the result of the following: sources (includes readers), filters.


|}


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
==Plot On Sorted Lines==


The Plot on Sorted Lines filter sorts and orders
polylines for graph visualization. See http://www.paraview.org/ParaView3/index.php/Plotting_Over_Curves for more information.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Shrink Factor'''<br>''(ShrinkFactor)''
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input to the Plot Edges
filter.
|
|
The value of this property determines how far the points will move. A value of 0 positions the points at the centroid of the cell; a value of 1 leaves them at their original positions.


| 0.5
|
|
The value must be greater than or equal to 0 and less than or equal to 1.
Accepts input of following types:
* vtkPolyData
 
|}
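The Shrink Factor semantics above (0 collapses each cell to its centroid, 1 leaves it untouched) amount to a linear interpolation between each point and the cell centroid. A minimal sketch, with a hypothetical helper operating on one cell's point list:

```python
def shrink_cell(points, factor=0.5):
    """Move each point of a cell toward the cell's centroid.
    factor=0 places every point at the centroid; factor=1 leaves the
    cell unchanged, matching the ShrinkFactor property."""
    n = len(points)
    centroid = [sum(p[i] for p in points) / n for i in range(3)]
    return [[centroid[i] + factor * (p[i] - centroid[i]) for i in range(3)]
            for p in points]

# Shrink a triangle halfway toward its centroid (2/3, 2/3, 0).
shrunk = shrink_cell([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0], [0.0, 2.0, 0.0]],
                     factor=0.5)
```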


==Plot Over Line==


|}
Sample data attributes at the points along a line. Probed lines will be displayed in a graph of the attributes.
The Plot Over Line filter samples the data set attributes
of the current data set at the points along a line. The
values of the point-centered variables along that line
will be displayed in an XY Plot. This filter uses
interpolation to determine the values at the selected
point, whether or not it lies at an input point. The Probe
filter operates on any type of data and produces polygonal
output (a line).
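The sampling described above can be sketched as evenly spaced interpolation along the probe line; the helper below is illustrative (a callable field stands in for the dataset interpolation the Probe machinery performs):

```python
def sample_along_line(point1, point2, resolution, field):
    """Sample a scalar field at resolution+1 evenly spaced points on the
    segment point1 -> point2, returning (position, value) pairs."""
    samples = []
    for i in range(resolution + 1):
        t = i / resolution
        p = [point1[k] + t * (point2[k] - point1[k]) for k in range(3)]
        samples.append((p, field(*p)))
    return samples

# Sample f(x, y, z) = x + y over the segment (0,0,0) -> (1,1,0).
result = sample_along_line([0.0, 0.0, 0.0], [1.0, 1.0, 0.0], 4,
                           lambda x, y, z: x + y)
```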


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


==Slice==


|}


This filter slices a data set with a plane. Slicing is similar to a contour. It creates surfaces from volumes and lines from surfaces.
==Plot Selection Over Time==


This filter extracts the portion of the input dataset that lies along the specified plane. The Slice filter takes any type of dataset as input. The output of this filter is polygonal data.<br>
Extracts selection over time and then plots it.
This filter extracts the selection over time, i.e., cell
and/or point variables at the selected cells/points are
extracted over time. The output multi-block consists of 1D
rectilinear grids where the x coordinate corresponds to
time (the same array is also copied to a point array named
Time or TimeData (if Time exists in the input)). If
selection input is a Location based selection then the
point values are interpolated from the nearby cells, i.e.,
those of the cell in which the location lies.


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Slice Offset Values'''<br>''(ContourValues)''
|'''Input''' (Input)
|
The input from which the selection is
extracted.
|
|
The values in this property specify a list of current offset values. This can be used to create multiple slices with different centers. Each entry represents a new slice with its center shifted by the offset value.


|
|
Accepts input of following types:
* vtkDataSet
* vtkTable
* vtkCompositeDataSet
|-
|'''Selection''' (Selection)
|
|
Determine the length of the dataset's diagonal. The value must lie within -diagonal length to +diagonal length.
The input that provides the selection
 
object.
 
|-
| '''Slice Type'''<br>''(CutFunction)''
|
|
This property sets the parameters of the slice function.


|
|
|
Accepts input of following types:
The value must be set to one of the following: Plane, Box, Sphere.
* vtkSelection
 
 
|-
|-
| '''Input'''<br>''(Input)''
|'''Only Report Selection Statistics''' (Only Report Selection Statistics)
|
|
This property specifies the input to the Slice filter.
If this property is set to 1, the min, max,
 
inter-quartile ranges, and (for numeric arrays) mean and standard
deviation of all the selected points or cells within each time step
are reported -- instead of breaking each selected point's or cell's
attributes out into separate time history tables.
|
|
1
|
|
The selected object must be the result of the following: sources (includes readers), filters.
Accepts boolean values (0 or 1).
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
 


|}
|}


==Point Data to Cell Data==


==Slice Generic Dataset==
Create cell attributes by averaging point attributes. The Point
 
Data to Cell Data filter averages the values of the point
 
attributes of the points of a cell to compute cell
This filter cuts a data set with a plane or sphere.  Cutting is similar to a contour.  It creates surfaces from volumes and lines from surfaces.
attributes. This filter operates on any type of dataset,
 
and the output dataset is the same type as the
The Generic Cut filter extracts the portion of the input data set that lies along the specified plane or sphere. From the Cut Function menu, you can select whether cutting will be performed with a plane or a sphere. The appropriate 3D widget (plane widget or sphere widget) will be displayed. The parameters of the cut function can be specified interactively using the 3D widget or manually using the traditional user interface controls. Instructions for using these 3D widgets and their corresponding user interfaces are found in section 7.4.<br>
input.
By default, the cut lies on the specified plane or sphere. Using the Cut Offset Values portion of the interface, it is also possible to cut the data set at some offset from the original cut function. The Cut Offset Values are in the spatial units of the data set. To add a single offset, select the value from the New Value slider in the Add value portion of the interface and click the Add button, or press Enter. To instead add several evenly spaced offsets, use the controls in the Generate range of values section. Select the number of offsets to generate using the Number of Values slider. The Range slider controls the interval in which to generate the offsets. Once the number of values and range have been selected, click the Generate button. The new offsets will be added to the Offset Values list. To delete a value from the Cut Offset Values list, select the value and click the Delete button. (If no value is selected, the last value in the list will be removed.) Clicking the Delete All button removes all the values in the list.<br>
The Generic Cut filter takes a generic dataset as input. Use the Input menu to choose a data set to cut. The output of this filter is polygonal data.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Slice Offset Values'''<br>''(ContourValues)''
|'''Input''' (Input)
|
This property specifies the input to the Point Data to
Cell Data filter.
|
|
The values in this property specify a list of current offset values. This can be used to create multiple slices with different centers. Each entry represents a new slice with its center shifted by the offset value.


|
|
|
Accepts input of following types:
Determine the length of the dataset's diagonal. The value must lie within -diagonal length to +diagonal length.
* vtkDataSet

Once set, the input dataset cannot be changed.
 
The dataset must contain a field array (point)


|-
|-
| '''Cut Type'''<br>''(CutFunction)''
|'''PassPointData''' (PassPointData)
|
|
Set the parameters to the implicit function used for cutting.
The value of this property controls whether the input
 
point data will be passed to the output. If set to 1, then the input
point data is passed through to the output; otherwise, only generated
cell data is placed into the output.
|
|
0
|
|
The value must be set to one of the following: Plane, Box, Sphere.
Accepts boolean values (0 or 1).


|}
==PolyLine To Rectilinear Grid==


|-
| '''Input'''<br>''(Input)''
|
Set the input to the Generic Cut filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected dataset must be one of the following types (or a subclass of one of them): vtkGenericDataSet.
|}
==Smooth==
This filter smooths a polygonal surface by iteratively moving points toward their neighbors.


The Smooth filter operates on a polygonal data set by iteratively adjusting the position of the points using Laplacian smoothing. (Because this filter only adjusts point positions, the output data set is also polygonal.) This results in better-shaped cells and more evenly distributed points.<br><br><br>
The Convergence slider limits the maximum motion of any point. It is expressed as a fraction of the length of the diagonal of the bounding box of the data set. If the maximum point motion during a smoothing iteration is less than the Convergence value, the smoothing operation terminates.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
| '''Convergence'''<br>''(Convergence)''
|
The value of this property limits the maximum motion of any point. It is expressed as a fraction of the length of the diagonal of the bounding box of the input dataset. If the maximum point motion during a smoothing iteration is less than the value of this property, the smoothing operation terminates.
| 0
|
The value must be greater than or equal to 0 and less than or equal to 1.


|-
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
This property specifies the input to the Smooth filter.
 
|
|
Set the input to the Polyline to Rectilinear Grid
filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.
|-
| '''Number of Iterations'''<br>''(NumberOfIterations)''
|
This property sets the maximum number of smoothing iterations to perform. More iterations produce better smoothing.
| 20
|
|
The value must be greater than or equal to 0.
Accepts input of following types:
 
* vtkPolyData


|}
|}
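The Laplacian smoothing and Convergence test described above can be sketched as follows. This is a minimal illustration: the helper name, edge list, and relaxation step are assumptions, and `convergence` is treated here as an absolute distance rather than a fraction of the bounding-box diagonal as in the filter.

```python
import math

def laplacian_smooth(points, edges, iterations=20, convergence=0.0,
                     relaxation=0.1):
    """Iteratively move each point a small step toward the average of its
    edge-connected neighbors; stop early when the largest point motion in
    an iteration falls below `convergence`."""
    pts = [list(p) for p in points]
    neighbors = {i: [] for i in range(len(pts))}
    for a, b in edges:
        neighbors[a].append(b)
        neighbors[b].append(a)
    for _ in range(iterations):
        max_motion = 0.0
        new_pts = []
        for i, p in enumerate(pts):
            if not neighbors[i]:
                new_pts.append(p)
                continue
            avg = [sum(pts[j][k] for j in neighbors[i]) / len(neighbors[i])
                   for k in range(3)]
            q = [p[k] + relaxation * (avg[k] - p[k]) for k in range(3)]
            max_motion = max(max_motion, math.dist(p, q))
            new_pts.append(q)
        pts = new_pts
        if max_motion < convergence:
            break
    return pts

# Smooth a 3-point polyline whose middle point is displaced upward.
smoothed = laplacian_smooth(
    [[0.0, 0.0, 0.0], [1.0, 1.0, 0.0], [2.0, 0.0, 0.0]],
    edges=[(0, 1), (1, 2)])
```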


==Principal Component Analysis==


==Stream Tracer==
Compute a statistical model of a dataset and/or assess the dataset with a statistical model.
 
This filter either computes a statistical model of a dataset or takes
 
such a model as its second input. Then, the model (however it is
Integrate streamlines in a vector field.
obtained) may optionally be used to assess the input dataset.
 
This filter performs additional analysis above and beyond the
The Stream Tracer filter generates streamlines in a vector field from a collection of seed points. Production of streamlines terminates if a streamline crosses the exterior boundary of the input dataset. Other reasons for termination are listed for the MaximumNumberOfSteps, TerminalSpeed, and MaximumPropagation properties. This filter operates on any type of dataset, provided it has point-centered vectors. The output is polygonal data containing polylines.<br>
multicorrelative filter. It computes the eigenvalues and eigenvectors of
the covariance matrix from the multicorrelative filter. Data is then
assessed by projecting the original tuples into a possibly
lower-dimensional space. &lt;p&gt; Since the PCA filter uses the
multicorrelative filter's analysis, it shares the same raw covariance
table specified in the multicorrelative documentation. The second table
in the multiblock dataset comprising the model output is an expanded
version of the multicorrelative one. As with the
multicorrelative filter, the second model table contains the mean values,
the upper-triangular portion of the symmetric covariance matrix, and the
non-zero lower-triangular portion of the Cholesky decomposition of the
covariance matrix. Below these entries are the eigenvalues of the
covariance matrix (in the column labeled "Mean") and the eigenvectors (as
row vectors) in an additional NxN matrix.
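The fixed-energy basis rule used by this filter's Basis Scheme property (described below) can be illustrated with a standalone sketch in plain Python; the function name is illustrative, not part of ParaView's API:

```python
def fixed_energy_basis_size(eigenvalues, energy):
    """Smallest N such that the first N eigenvalues hold more than
    `energy` of the total variance (eigenvalues sorted descending)."""
    total = sum(eigenvalues)
    running = 0.0
    for n, value in enumerate(eigenvalues, start=1):
        running += value
        if running / total > energy:
            return n
    return len(eigenvalues)

# The example from the Basis Scheme description: eigenvalues {5, 2, 1.5, 1, .5}
# with Basis Energy 0.8 keep 3 components, since (5 + 2 + 1.5) / 10 = 0.85 > 0.8.
print(fixed_energy_basis_size([5, 2, 1.5, 1, 0.5], 0.8))  # 3
```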


{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Restrictions'''
|-
|-
| '''Compute Vorticity'''<br>''(ComputeVorticity)''
|'''Input''' (Input)
|
The input to the filter. Arrays from this dataset will
be used for computing statistics and/or assessed by a statistical
model.
|
|
Specify whether or not to compute vorticity.


| 1
|
|
Only the values 0 and 1 are accepted.
Accepts input of following types:
 
* vtkImageData
* vtkStructuredGrid
* vtkPolyData
* vtkUnstructuredGrid
* vtkTable
* vtkGraph
The dataset must contain a field array ()


|-
|-
| '''Initial Step Length'''<br>''(InitialIntegrationStep)''
|'''ModelInput''' (ModelInput)
|
A previously-calculated model with which to assess a
separate dataset. This input is optional.
|
|
This property specifies the initial integration step size. For non-adaptive integrators (Runge-Kutta 2 and Runge-Kutta 4), it is fixed (always equal to this initial value) throughout the integration. For an adaptive integrator (Runge-Kutta 4-5), the actual step size varies such that the numerical error is less than a specified threshold.


| 0.2
|
|
Accepts input of following types:
* vtkTable
* vtkMultiBlockDataSet
|-
|-
| '''Input'''<br>''(Input)''
|'''AttributeMode''' (AttributeMode)
|
|
This property specifies the input to the Stream Tracer filter.
Specify which type of field data the arrays will be
 
drawn from.
|
|
0
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The value must be a field array name.
 
 
The dataset must contain a point array with 3 components.
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
 
 
|-
|-
| '''Integration Direction'''<br>''(IntegrationDirection)''
|'''Variables of Interest''' (SelectArrays)
|
Choose arrays whose entries will be used to form
observations for statistical analysis.
|
|
This property determines in which direction(s) a streamline is generated.


| 2
|
|
The value must be one of the following: FORWARD (0), BACKWARD (1), BOTH (2).


|-
|-
| '''Integration Step Unit'''<br>''(IntegrationStepUnit)''
|'''Task''' (Task)
|
|
This property specifies the unit for Minimum/Initial/Maximum integration step size. The Length unit refers to the arc length that a particle travels/advects within a single step. The Cell Length unit represents the step size as a number of cells.
Specify the task to be performed: modeling and/or
 
assessment. <ol> <li> "Detailed model of input data,"
| 2
creates a set of output tables containing a calculated statistical model of the <b>entire</b> input dataset;</li> <li> "Model a subset of the data," creates an output table (or tables) summarizing a <b>randomly-chosen subset</b> of the input dataset;</li> <li> "Assess the data with a model," adds attributes to the first input dataset using a model provided on the second input port; and</li> <li> "Model and assess the same data," is really just operations 2 and 3 above applied to the same input dataset. The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.</li> </ol> When the task includes creating a model (i.e., tasks 2 and 4), you may adjust the fraction of the input dataset used for training. You should avoid using a large fraction of the input data for training as you will then not be able to detect overfitting. The <i>Training fraction</i> setting will be ignored for tasks 1 and 3.
|
|
The value must be one of the following: Length (1), Cell Length (2).
3
 
|
 
The value(s) is an enumeration of the following:
* Detailed model of input data (0)
* Model a subset of the data (1)
* Assess the data with a model (2)
* Model and assess the same data (3)
|-
|-
| '''Integrator Type'''<br>''(IntegratorType)''
|'''TrainingFraction''' (TrainingFraction)
|
|
This property determines which integrator (with increasing accuracy) to use for creating streamlines.
Specify the fraction of values from the input dataset to
 
be used for model fitting. The exact set of values is chosen at random
| 2
from the dataset.
|
0.1
|
|
The value must be one of the following: Runge-Kutta 2 (0), Runge-Kutta 4 (1), Runge-Kutta 4-5 (2).


|-
|-
| '''Interpolator Type'''<br>''(InterpolatorType)''
|'''Normalization Scheme''' (NormalizationScheme)
|
|
This property determines which interpolator to use for evaluating the velocity vector field. The first is faster though the second is more robust in locating cells during streamline integration.
Before the eigenvector decomposition of the covariance
 
matrix takes place, you may normalize each (i,j) entry by sqrt(
| 0
cov(i,i) * cov(j,j) ). This implies that the variance of each variable
of interest should be of equal importance.
|
|
The value must be one of the following: Interpolator with Point Locator (0), Interpolator with Cell Locator (1).
2
 
 
|-
| '''Maximum Error'''<br>''(MaximumError)''
|
This property specifies the maximum error (for Runge-Kutta 4-5) tolerated throughout streamline integration. The Runge-Kutta 4-5 integrator tries to adjust the step size such that the estimated error is less than this threshold.
 
| 1e-06
|
|
The value(s) is an enumeration of the following:
* No normalization (0)
* Normalize using covariances (3)
|-
|-
| '''Maximum Step Length'''<br>''(MaximumIntegrationStep)''
|'''Basis Scheme''' (BasisScheme)
|
When using the Runge-Kutta 4-5 integrator, this property specifies the maximum integration step size.
 
| 0.5
|
|
|-
When reporting assessments, should the full eigenvector
| '''Maximum Steps'''<br>''(MaximumNumberOfSteps)''
decomposition be used to project the original vector into the new space (Full basis), or should a fixed subset of the decomposition be used (Fixed-size basis), or should the projection be clipped to preserve at least some fixed "energy" (Fixed-energy basis)?<p> As an example, suppose the variables of interest were {A,B,C,D,E} and that the eigenvalues of the covariance matrix for these were {5,2,1.5,1,.5}. If the "Full basis" scheme is used, then all 5 components of the eigenvectors will be used to project each {A,B,C,D,E}-tuple in the original data into a new 5-component space.<p> If the "Fixed-size" scheme is used and the "Basis Size" property is set to 4, then only the first 4 eigenvector components will be used to project each {A,B,C,D,E}-tuple into the new space and that space will be of dimension 4, not 5.<p> If the "Fixed-energy basis" scheme is used and the "Basis Energy" property is set to 0.8, then only the first 3 eigenvector components will be used to project each {A,B,C,D,E}-tuple into the new space, which will be of dimension 3. The number 3 is chosen because 3 is the lowest N for which the sum of the first N eigenvalues divided by the sum of all eigenvalues is larger than the specified "Basis Energy" (i.e., (5+2+1.5)/10 = 0.85 > 0.8).
|
|
This property specifies the maximum number of steps, beyond which streamline integration is terminated.
0
 
| 2000
|
|
The value(s) is an enumeration of the following:
* Full basis (0)
* Fixed-size basis (1)
* Fixed-energy basis (2)
|-
|-
| '''Maximum Streamline Length'''<br>''(MaximumPropagation)''
|'''Basis Size''' (BasisSize)
|
|
This property specifies the maximum streamline length (i.e., physical arc length), beyond which line integration is terminated.
The maximum number of eigenvector components to use when
 
projecting into the new space.
| 1
|
|
The value must be less than the largest dimension of the dataset multiplied by a scale factor of 1.
2
 
 
|-
| '''Minimum Step Length'''<br>''(MinimumIntegrationStep)''
|
|
When using the Runge-Kutta 4-5 integrator, this property specifies the minimum integration step size.


| 0.01
|
|-
|-
| '''Vectors'''<br>''(SelectInputVectors)''
|'''Basis Energy''' (BasisEnergy)
|
|
This property contains the name of the vector array from which to generate streamlines.
The minimum energy to use when determining the
 
dimensionality of the new space into which the assessment will project
tuples.
|
|
0.1
|
|
An array of vectors is required.


|-
|-
| '''Seed Type'''<br>''(Source)''
|'''RobustPCA''' (RobustPCA)
|
|
The value of this property determines how the seeds for the streamlines will be generated.
Compute robust PCA with medians instead of means.
 
|
|
0
|
|
The selected object must be the result of the following: sources (includes readers).
Accepts boolean values (0 or 1).


The value must be set to one of the following: PointSource, HighResLineSource.
|-
| '''Terminal Speed'''<br>''(TerminalSpeed)''
|
This property specifies the terminal speed, below which particle advection/integration is terminated.
| 1e-12
|
|}


==Probe Location==


Sample data attributes at the points in a point cloud.

The Probe filter samples the data set attributes of the current data set at the points in a point cloud. The Probe filter uses interpolation to determine the values at the selected point, whether or not it lies at an input point. The Probe filter operates on any type of data and produces polygonal output (a point cloud).


==Stream Tracer For Generic Datasets==


Integrate streamlines in a vector field.

The Generic Stream Tracer filter generates streamlines in a vector field from a collection of seed points. The vector field used is selected from the Vectors menu, so the input data set is required to have point-centered vectors. The Seed portion of the interface allows you to select whether the seed points for this integration lie in a point cloud or along a line. Depending on which is selected, the appropriate 3D widget (point or line widget) is displayed along with traditional user interface controls for positioning the point cloud or line within the data set. Instructions for using the 3D widgets and the corresponding manual controls can be found in section 7.4.<br>
The Max. Propagation entry box allows you to specify the maximum length of the streamlines. From the Max. Propagation menu, you can select the units to be either Time (the time a particle would travel with steady flow) or Length (in the data set's spatial coordinates).<br>
The Init. Step Len. menu and entry specify the initial step size for integration. (For non-adaptive integrators, Runge-Kutta 2 and 4, the initial step size is used throughout the integration.) The menu allows you to specify the units. Time and Length have the same meaning as for Max. Propagation. Cell Length specifies the step length as a number of cells.<br>
The Integration Direction menu determines in which direction(s) the stream trace will be generated: FORWARD, BACKWARD, or BOTH.<br>
The Integrator Type section of the interface determines which calculation to use for integration: Runge-Kutta 2, Runge-Kutta 4, or Runge-Kutta 4-5. If Runge-Kutta 4-5 is selected, controls are displayed for specifying the minimum and maximum step length and the maximum error. The controls for specifying Min. Step Len. and Max. Step Len. are the same as those for Init. Step Len. The Runge-Kutta 4-5 integrator tries to choose the step size so that the estimated error is less than the value of the Maximum Error entry.<br>
If the integration takes more than Max. Steps to complete, if the speed goes below Term. Speed, if Max. Propagation is reached, or if a boundary of the input data set is crossed, integration terminates.<br>
This filter operates on any type of data set, provided it has point-centered vectors. The output is polygonal data containing polylines.<br>
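As a rough illustration of what a single integration step does, here is a minimal fixed-step Runge-Kutta 4 advection step in plain Python (a sketch only; the real filter works on VTK datasets and also offers RK2 and adaptive RK4-5):

```python
def rk4_step(velocity, point, h):
    """Advance `point` through the vector field `velocity` (a callable
    returning a 3-tuple) by one fixed Runge-Kutta 4 step of size h."""
    def offset(p, k, s):
        return (p[0] + s * k[0], p[1] + s * k[1], p[2] + s * k[2])
    k1 = velocity(point)
    k2 = velocity(offset(point, k1, h / 2))
    k3 = velocity(offset(point, k2, h / 2))
    k4 = velocity(offset(point, k3, h))
    return tuple(point[i] + h / 6.0 * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i])
                 for i in range(3))

# In a uniform field (1, 0, 0) a seed at the origin advects exactly h along x.
print(rk4_step(lambda p: (1.0, 0.0, 0.0), (0.0, 0.0, 0.0), 0.5))  # (0.5, 0.0, 0.0)
```

A streamline is produced by repeating such steps until one of the termination criteria above (step count, terminal speed, propagation length, domain boundary) is met.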


{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Restrictions'''
|-
|-
| '''Initial Integration Step'''<br>''(InitialIntegrationStep)''
|'''Input''' (Input)
|
This property specifies the dataset from which to obtain
probe values.
|
|
Specify the initial integration step.


| 0.5
|
|
Accepts input of following types:
* vtkDataSet
* vtkCompositeDataSet
The dataset must contain a field array ()
|-
|-
| '''Input'''<br>''(Input)''
|'''Probe Type''' (Source)
|
This property specifies the dataset whose geometry will
be used in determining positions to probe.
|
|
Set the input to the Generic Stream Tracer filter.


|
|
The value can be one of the following:
* FixedRadiusPointSource (extended_sources)
|-
|'''PassFieldArrays''' (PassFieldArrays)
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The dataset must contain a point array with 3 components.
The selected dataset must be one of the following types (or a subclass of one of them): vtkGenericDataSet.


Set whether to pass the field-data arrays from the Input, i.e. the input
providing the geometry, to the output. On by default.


|
1
|
Accepts boolean values (0 or 1).
|-
|-
| '''Integration Direction'''<br>''(IntegrationDirection)''
|'''ComputeTolerance''' (ComputeTolerance)
|
|
This property determines in which direction(s) a streamline is generated.


| 2
Set whether to compute the tolerance or to use a user-provided
value. On by default.
 
|
1
|
|
The value must be one of the following: FORWARD (0), BACKWARD (1), BOTH (2).
Accepts boolean values (0 or 1).
 
 
|-
|-
| '''Integration Step Unit'''<br>''(IntegrationStepUnit)''
|'''Tolerance''' (Tolerance)
|
|
Choose the unit to use for the integration step.
Set the tolerance to use for
 
vtkDataSet::FindCell
| 2
|
2.2204460492503131e-16
|
|
The value must be one of the following: Time (0), Length (1), Cell Length (2).




|-
|}
| '''Integrator Type'''<br>''(IntegratorType)''
|
This property determines which integrator (with increasing accuracy) to use for creating streamlines.


| 2
==Process Id Scalars==
|
The value must be one of the following: Runge-Kutta 2 (0), Runge-Kutta 4 (1), Runge-Kutta 4-5 (2).


This filter uses colors to show how data is partitioned across processes. The
Process Id Scalars filter assigns a unique scalar value to
each piece of the input according to which processor it
resides on. This filter operates on any type of data when
ParaView is run in parallel. It is useful for determining
whether your data is load-balanced across the processors
being used. The output data set type is the same as that
of the input.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Maximum Error'''<br>''(MaximumError)''
| '''Property'''
|
| '''Description'''
Set the maximum error allowed in the integration. The meaning of
| '''Default Value(s)'''
this value depends on the integrator chosen.
| '''Restrictions'''


| 1e-06
|-
|'''Input''' (Input)
|
|
|-
This property specifies the input to the Process Id
| '''Maximum Integration Step'''<br>''(MaximumIntegrationStep)''
Scalars filter.
|
|
Specify the maximum integration step.


| 0.01
|
|
Accepts input of following types:
* vtkDataSet
|-
|-
| '''Maximum Number Of Steps'''<br>''(MaximumNumberOfSteps)''
|'''RandomMode''' (RandomMode)
|
|
Specify the maximum number of steps used in the integration.
The value of this property determines whether to use
 
random id values for the various pieces. If set to 1, the unique value
| 2000
per piece will be chosen at random; otherwise the unique value will
match the id of the process.
|
|
|-
0
| '''Maximum Propagation'''<br>''(MaximumPropagation)''
|
|
Specify the maximum streamline length.
Accepts boolean values (0 or 1).
 
|}
 
==Programmable Filter==


| 1
Executes a user supplied python script on its input dataset to produce an output dataset.
|
This filter will execute a python script to produce an
The value must be less than the largest dimension of the dataset multiplied by a scale factor of 1.
output dataset. The filter keeps a copy of the python
script in Script, and creates Interpretor, a python
interpreter to run the script upon the first
execution.
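The keep-a-script-and-run-it behavior described above can be sketched in plain Python (`exec` stands in for ParaView's embedded interpreter, and dictionaries stand in for datasets; in the real filter the Script manipulates `inputs`/`output` dataset proxies):

```python
class ProgrammableFilterSketch:
    """Illustrative stand-in for the Programmable Filter's execution model:
    the script text is stored, and a namespace is created to run it."""

    def __init__(self, script):
        self.script = script      # kept, like the Script property
        self.namespace = {}       # created for execution

    def execute(self, inputs):
        self.namespace["inputs"] = inputs
        self.namespace["output"] = {}
        exec(self.script, self.namespace)
        return self.namespace["output"]

f = ProgrammableFilterSketch("output['doubled'] = [2 * v for v in inputs[0]]")
print(f.execute([[1, 2, 3]]))  # {'doubled': [2, 4, 6]}
```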


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|-
| '''Minimum Integration Step'''<br>''(MinimumIntegrationStep)''
|'''Input''' (Input)
|
This property specifies the input(s) to the programmable
filter.
|
|
Specify the minimum integration step.


| 0.01
|
|
Accepts input of following types:
* vtkDataObject
|-
|-
| '''Vectors'''<br>''(SelectInputVectors)''
|'''OutputDataSetType''' (OutputDataSetType)
|
|
This property contains the name of the vector array from which to generate streamlines.
The value of this property determines the dataset type
 
for the output of the programmable filter.
|
8
|
The value(s) is an enumeration of the following:
* Same as Input (8)
* vtkPolyData (0)
* vtkStructuredGrid (2)
* vtkRectilinearGrid (3)
* vtkUnstructuredGrid (4)
* vtkImageData (6)
* vtkUniformGrid (10)
* vtkMultiblockDataSet (13)
* vtkHierarchicalBoxDataSet (15)
* vtkTable (19)
|-
|'''Script''' (Script)
|
|
This property contains the text of a python program that
the programmable filter runs.
|
|
An array of vectors is required.


|


|-
|-
| '''Seed Type'''<br>''(Source)''
|'''RequestInformation Script''' (InformationScript)
|
This property is a python script that is executed during
the RequestInformation pipeline pass. Use this to provide information
such as WHOLE_EXTENT to the pipeline downstream.
|
|
The value of this property determines how the seeds for the streamlines will be generated.


|
|
|
The selected object must be the result of the following: sources (includes readers).
The value must be set to one of the following: PointSource, HighResLineSource.


|-
|-
| '''Terminal Speed'''<br>''(TerminalSpeed)''
|'''RequestUpdateExtent Script''' (UpdateExtentScript)
|
This property is a python script that is executed during
the RequestUpdateExtent pipeline pass. Use this to modify the update
extent that your filter asks upstream for.
|
|
If at any point the speed is below this value, the integration is terminated.


| 1e-12
|
|
|}


 
|-
==Stream Tracer With Custom Source==
|'''CopyArrays''' (CopyArrays)
 
|
 
If this property is set to true, all the cell and point
arrays from the first input are copied to the output.

Integrate streamlines in a vector field.

The Stream Tracer filter generates streamlines in a vector field from a collection of seed points. Production of streamlines terminates if a streamline crosses the exterior boundary of the input dataset. Other reasons for termination are listed for the MaximumNumberOfSteps, TerminalSpeed, and MaximumPropagation properties. This filter operates on any type of dataset, provided it has point-centered vectors. The output is polygonal data containing polylines. This filter takes a Source input that provides the seed points.<br>
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Compute Vorticity'''<br>''(ComputeVorticity)''
|
|
Specify whether or not to compute vorticity.
0
 
| 1
|
|
Only the values 0 and 1 are accepted.
Accepts boolean values (0 or 1).
 
 
|-
|-
| '''Initial Step Length'''<br>''(InitialIntegrationStep)''
|'''Parameters''' (Parameters)
|
|
This property specifies the initial integration step size. For non-adaptive integrators (Runge-Kutta 2 and Runge-Kutta 4), it is fixed (always equal to this initial value) throughout the integration. For an adaptive integrator (Runge-Kutta 4-5), the actual step size varies such that the numerical error is less than a specified threshold.


| 0.2
|
|
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Stream Tracer filter.


|
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The dataset must contain a point array with 3 components.
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.


|-
|-
| '''Integration Direction'''<br>''(IntegrationDirection)''
|'''PythonPath''' (PythonPath)
|
A semi-colon (;) separated list of directories to add to
the python library search path.
|
|
This property determines in which direction(s) a streamline is generated.


| 2
|
|
The value must be one of the following: FORWARD (0), BACKWARD (1), BOTH (2).


|-
|-
| '''Integration Step Unit'''<br>''(IntegrationStepUnit)''
|'''TimestepValues''' (TimestepValues)
|
Available timestep values.
|
|
This property specifies the unit for Minimum/Initial/Maximum integration step size. The Length unit refers to the arc length that a particle travels/advects within a single step. The Cell Length unit represents the step size as a number of cells.


| 2
|
|
The value must be one of the following: Length (1), Cell Length (2).




|-
|}
| '''Integrator Type'''<br>''(IntegratorType)''
|
This property determines which integrator (with increasing accuracy) to use for creating streamlines.


| 2
==Python Annotation==
|
The value must be one of the following: Runge-Kutta 2 (0), Runge-Kutta 4 (1), Runge-Kutta 4-5 (2).


This filter evaluates a Python expression for a text annotation.
This filter uses Python to calculate an expression. It
depends heavily on the numpy and paraview.vtk modules. To
use the parallel functions, mpi4py is also necessary. The
expression is evaluated and the resulting scalar value or
numpy array is added to the output as an array. See numpy
and paraview.vtk documentation for the list of available
functions. This filter tries to make it easy for the user
to write expressions by defining certain variables. The
filter tries to assign each array to a variable of the
same name. If the name of the array is not a valid Python
variable, it has to be accessed through a dictionary
called arrays (i.e. arrays['array_name']).


{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Maximum Error'''<br>''(MaximumError)''
| '''Property'''
|
| '''Description'''
This property specifies the maximum error (for Runge-Kutta 4-5) tolerated throughout streamline integration. The Runge-Kutta 4-5 integrator tries to adjust the step size such that the estimated error is less than this threshold.
| '''Default Value(s)'''
| '''Restrictions'''


| 1e-06
|-
|'''Input''' (Input)
|
|
|-
Set the input of the filter.
| '''Maximum Step Length'''<br>''(MaximumIntegrationStep)''
|
|
When using the Runge-Kutta 4-5 integrator, this property specifies the maximum integration step size.


| 0.5
|
|
Accepts input of following types:
* vtkDataObject
|-
|-
| '''Maximum Steps'''<br>''(MaximumNumberOfSteps)''
|'''ArrayAssociation''' (ArrayAssociation)
|
Select the attribute to use to populate array names from.
|
|
This property specifies the maximum number of steps, beyond which streamline integration is terminated.
2
 
| 2000
|
|
The value(s) is an enumeration of the following:
* Point Data (0)
* Cell Data (1)
* Field Data (2)
* Row Data (6)
|-
|-
| '''Maximum Streamline Length'''<br>''(MaximumPropagation)''
|'''Expression''' (Expression)
|
The Python expression evaluated during execution.
FieldData arrays are directly available through their name. Set of
provided variables [input, t_value, t_steps, t_range, t_index,
FieldData, PointData, CellData] (i.e.: "Momentum: (%f, %f, %f)" %
(XMOM[t_index,0], YMOM[t_index,0], ZMOM[t_index,0]) )
|
|
This property specifies the maximum streamline length (i.e., physical arc length), beyond which line integration is terminated.


| 1
|
|
The value must be less than the largest dimension of the dataset multiplied by a scale factor of 1.


|-
|-
| '''Minimum Step Length'''<br>''(MinimumIntegrationStep)''
|'''AnnotationValue''' (AnnotationValue)
|
When using the Runge-Kutta 4-5 integrator, this property specifies the minimum integration step size.
 
| 0.01
|
|
|-
Text that is used as annotation
| '''Vectors'''<br>''(SelectInputVectors)''
|
|
This property contains the name of the vector array from which to generate streamlines.


|
|
|
An array of vectors is required.




|-
| '''Source'''<br>''(Source)''
|
This property specifies the input used to obtain the seed points.
|
|
The selected object must be the result of the following: sources (includes readers).
|-
| '''Terminal Speed'''<br>''(TerminalSpeed)''
|
This property specifies the terminal speed, below which particle advection/integration is terminated.
| 1e-12
|
|}


==Subdivide==


This filter iteratively divides triangles into four smaller triangles. New points are placed linearly so the output surface matches the input surface.

The Subdivide filter iteratively divides each triangle in the input dataset into 4 new triangles. Three new points are added per triangle -- one at the midpoint of each edge. This filter operates only on polygonal data containing triangles, so run your polygonal data through the Triangulate filter first if it is not composed of triangles. The output of this filter is also polygonal.<br>


==Python Calculator==


This filter evaluates a Python expression. This filter uses Python to calculate an expression. It depends heavily on the numpy and paraview.vtk modules. To use the parallel functions, mpi4py is also necessary. The expression is evaluated and the resulting scalar value or numpy array is added to the output as an array. See numpy and paraview.vtk documentation for the list of available functions. This filter tries to make it easy for the user to write expressions by defining certain variables. The filter tries to assign each array to a variable of the same name. If the name of the array is not a valid Python variable, it has to be accessed through a dictionary called arrays (i.e. arrays['array_name']). The points can be accessed using the points variable.
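The variable-binding behavior described above can be sketched in plain Python (lists stand in for numpy arrays, and `evaluate_expression` is an illustrative name, not the filter's API):

```python
def evaluate_expression(expression, arrays):
    """Evaluate `expression` with each well-named array bound to a
    variable of the same name, mimicking the Python Calculator's setup.
    Arrays whose names are not valid identifiers stay reachable only
    through the `arrays` dictionary."""
    namespace = {"__builtins__": {}, "arrays": arrays}
    for name, values in arrays.items():
        if name.isidentifier():
            namespace[name] = values
    return eval(expression, namespace)

data = {"Temperature": [300.0, 310.0], "cell-id": [0, 1]}
print(evaluate_expression("[t + 1.0 for t in Temperature]", data))  # [301.0, 311.0]
print(evaluate_expression("arrays['cell-id']", data))               # [0, 1]
```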


{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
Set the input of the filter.
|
|
This property specifies the input to the Subdivide filter.


|
|
Accepts input of following types:
* vtkDataSet
|-
|'''Expression''' (Expression)
|
The Python expression evaluated during
execution.
|
|
The selected object must be the result of the following: sources (includes readers), filters.


|


The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.
|-
 
|'''ArrayAssociation''' (ArrayAssociation)
|
This property controls the association of the output
array as well as which arrays are defined as variables.
|
0
|
The value(s) is an enumeration of the following:
* Point Data (0)
* Cell Data (1)
|-
|'''ArrayName''' (ArrayName)
|
The name of the output array.
|
result
|


|-
|-
| '''Number of Subdivisions'''<br>''(NumberOfSubdivisions)''
|'''CopyArrays''' (CopyArrays)
|
|
The value of this property specifies the number of subdivision iterations to perform.
If this property is set to true, all the cell and point
 
arrays from the first input are copied to the output.
| 1
|
1
|
|
The value must be greater than or equal to 1 and less than or equal to 4.
Accepts boolean values (0 or 1).
 


|}
|}
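The per-triangle midpoint subdivision performed by the Subdivide filter can be sketched in plain Python (illustrative data structures, not VTK's; shared edges reuse their midpoint so neighboring triangles stay connected):

```python
def subdivide(points, triangles):
    """One subdivision pass: split each triangle into 4 by inserting a
    new point at the midpoint of each edge."""
    points = list(points)
    midpoint_index = {}

    def midpoint(i, j):
        key = (min(i, j), max(i, j))
        if key not in midpoint_index:
            (x0, y0, z0), (x1, y1, z1) = points[i], points[j]
            points.append(((x0 + x1) / 2, (y0 + y1) / 2, (z0 + z1) / 2))
            midpoint_index[key] = len(points) - 1
        return midpoint_index[key]

    out = []
    for a, b, c in triangles:
        ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
        out += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
    return points, out

pts, tris = subdivide([(0, 0, 0), (1, 0, 0), (0, 1, 0)], [(0, 1, 2)])
print(len(pts), len(tris))  # 6 4
```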


==Surface Flow==


This filter integrates flow through a surface.

The flow integration filter integrates the dot product of a point flow vector field and surface normal. It computes the net flow across the 2D surface. It operates on any type of dataset and produces an unstructured grid output.<br>


==Quadric Clustering==


This filter is the same filter used to generate level of detail for ParaView. It uses a structured grid of bins and merges all points contained in each bin. The Quadric Clustering filter produces a reduced-resolution polygonal approximation of the input polygonal dataset. This filter is the one used by ParaView for computing LODs. It uses spatial binning to reduce the number of points in the data set; points that lie within the same spatial bin are collapsed into one representative point.
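The spatial-binning idea can be sketched in plain Python (a simplification: each bin's points collapse to their average position rather than to the quadric-error-minimizing point the real filter computes):

```python
def cluster(points, divisions, bounds):
    """Collapse all points that fall in the same spatial bin to the bin's
    average position (`divisions` bins along each axis of `bounds`)."""
    origin, corner = bounds
    size = [(corner[i] - origin[i]) / divisions for i in range(3)]
    bins = {}
    for p in points:
        key = tuple(min(divisions - 1, int((p[i] - origin[i]) / size[i]))
                    for i in range(3))
        bins.setdefault(key, []).append(p)
    return [tuple(sum(c) / len(pts) for c in zip(*pts)) for pts in bins.values()]

reps = cluster([(0.1, 0.1, 0.1), (0.2, 0.1, 0.1), (0.9, 0.9, 0.9)],
               2, ((0, 0, 0), (1, 1, 1)))
print(len(reps))  # two occupied bins -> 2
```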


{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
This property specifies the input to the Quadric
Clustering filter.
|
|
This property specifies the input to the Surface Flow filter.


|
|
Accepts input of following types:
* vtkPolyData
|-
|'''Number of Divisions''' (NumberOfDivisions)
|
This property specifies the number of bins along the X,
Y, and Z axes of the data set.
|
50 50 50
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The dataset must contain a point array with 3 components.
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.


|-
|-
| '''Select Input Vectors'''<br>''(SelectInputVectors)''
|'''UseInputPoints''' (UseInputPoints)
|
|
The value of this property specifies the name of the input vector array containing the flow vector field.
If the value of this property is set to 1, the
 
representative point for each bin is selected from one of the input
points that lies in that bin; the input point that produces the least
error is chosen. If the value of this property is 0, the location of
the representative point is calculated to produce the least error
possible for that bin, but the point will most likely not be one of the
input points.
|
|
1
|
|
An array of vectors is required.
Accepts boolean values (0 or 1).
 
 
|}
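The net-flow computation described under Surface Flow reduces to accumulating v&middot;n over the surface; here is a minimal sketch with precomputed per-cell normals and areas (illustrative inputs, not VTK's data model, which computes normals and areas from the mesh itself):

```python
def surface_flow(vectors, normals, areas):
    """Approximate the net flow through a surface: the sum over cells of
    (v . n) * area, with one flow vector and unit normal per cell."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    return sum(dot(v, n) * a for v, n, a in zip(vectors, normals, areas))

# Uniform flow (0, 0, 2) through two unit-area cells facing +z:
print(surface_flow([(0, 0, 2), (0, 0, 2)], [(0, 0, 1), (0, 0, 1)], [1.0, 1.0]))  # 4.0
```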
 
 
==Surface Vectors==
 
 
This filter constrains vectors to lie on a surface.
 
The Surface Vectors filter is used for 2D data sets. It constrains vectors to lie in a surface by removing components of the vectors normal to the local surface.<br>
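The projection applied at each point can be written directly: v_t = v - (v . n) n for a unit surface normal n. A standalone sketch (`constrain_to_surface` is an illustrative name):

```python
def constrain_to_surface(vector, normal):
    """Remove the component of `vector` along the unit surface `normal`:
    v_t = v - (v . n) n."""
    dot = sum(v * n for v, n in zip(vector, normal))
    return tuple(v - dot * n for v, n in zip(vector, normal))

# With normal (0, 0, 1), the z component is removed.
print(constrain_to_surface((1.0, 2.0, 3.0), (0.0, 0.0, 1.0)))  # (1.0, 2.0, 0.0)
```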
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Property'''
|'''UseFeatureEdges''' (UseFeatureEdges)
| '''Description'''
|
| '''Default Value(s)'''
If this property is set to 1, feature edge quadrics will
| '''Restrictions'''
be used to maintain the boundary edges along processor
divisions.
|
0
|
Accepts boolean values (0 or 1).
|-
|-
| '''Constraint Mode'''<br>''(ConstraintMode)''
|'''UseFeaturePoints''' (UseFeaturePoints)
|
|
This property specifies whether the vectors will be parallel or perpendicular to the surface. If the value is set to PerpendicularScale (2), then the output will contain a scalar array with the dot product of the surface normal and the vector at each point.
If this property is set to 1, feature point quadrics
 
will be used to maintain the boundary points along processor
| 0
divisions.
|
0
|
|
The value must be one of the following: Parallel (0), Perpendicular (1), PerpendicularScale (2).
Accepts boolean values (0 or 1).
 
 
|-
|-
| '''Input'''<br>''(Input)''
|'''CopyCellData''' (CopyCellData)
|
|
This property specifies the input to the Surface Vectors filter.
If this property is set to 1, the cell data from the
 
input will be copied to the output.
|
|
1
|
|
The selected object must be the result of the following: sources (includes readers), filters.
Accepts boolean values (0 or 1).
 
 
The dataset must contain a point array with 3 components.
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
 
 
|-
|-
| '''Select Input Vectors'''<br>''(SelectInputVectors)''
|'''UseInternalTriangles''' (UseInternalTriangles)
|
|
This property specifies the name of the input vector array to process.
If this property is set to 1, triangles completely
 
contained in a spatial bin will be included in the computation of the
bin's quadrics. When this property is set to 0, the filters operates
faster, but the resulting surface may not be as
well-behaved.
|
|
0
|
|
An array of vectors is required.
Accepts boolean values (0 or 1).
 


|}
|}
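The constraint described above (removing the component of each vector that is normal to the local surface) is simple vector algebra. The following is a plain-Python illustration of that projection, not ParaView code; the function name is hypothetical.

```python
def constrain_to_surface(v, n):
    """Sketch of the Parallel constraint mode described above: remove the
    component of vector v along the unit surface normal n."""
    dot = sum(vi * ni for vi, ni in zip(v, n))       # v . n
    return tuple(vi - dot * ni for vi, ni in zip(v, n))

# A vector with an out-of-plane component; surface normal along +Z:
print(constrain_to_surface((1.0, 2.0, 3.0), (0.0, 0.0, 1.0)))  # (1.0, 2.0, 0.0)
```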


==Random Attributes==

This filter creates a new random attribute array and sets it as the default array.

The Random Attributes filter creates random attributes
including scalars and vectors. These attributes can be
generated as point data or cell data. The generation of each
component is normalized between a user-specified minimum and
maximum value.

This filter provides the capability to specify the data type
of the attributes and the range for each of the components.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the input to the Random Scalars
filter.
|
|
Accepts input of following types:
* vtkDataSet
|-
|'''DataType''' (DataType)
|
Specify the type of array to create (all components of this
array are of this type). This holds true for all arrays that
are created.
|
3
|
The value(s) is an enumeration of the following:
* Bit (1)
* Char (2)
* UnsignedChar (3)
* Short (4)
* UnsignedShort (5)
* Int (6)
* UnsignedInt (7)
* Long (8)
* UnsignedLong (9)
* Float (10)
* Double (11)
|-
|'''ComponentRange''' (ComponentRange)
|
Set the range values (minimum and maximum) for
each component. This applies to all data that is
generated.
|
0 255
|
|-
|'''AttributesConstantPerBlock''' (AttributesConstantPerBlock)
|
Indicate that the generated attributes are
constant within a block. This can be used to highlight
blocks in a composite dataset.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''GeneratePointScalars''' (GeneratePointScalars)
|
Indicate that point scalars are to be
generated.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''GeneratePointVectors''' (GeneratePointVectors)
|
Indicate that point vectors are to be
generated.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''GenerateCellScalars''' (GenerateCellScalars)
|
Indicate that cell scalars are to be
generated.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''GenerateCellVectors''' (GenerateCellVectors)
|
Indicate that cell vectors are to be
generated.
|
1
|
Accepts boolean values (0 or 1).
|}

==Table To Points==

Converts a table to a set of points.

The TableToPolyData filter converts a vtkTable to a set of points in a<br>
vtkPolyData. One must specify the columns in the input table to use as<br>
the X, Y and Z coordinates for the points in the output.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input.
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The dataset must contain an array with 1 component(s).
The selected dataset must be one of the following types (or a subclass of one of them): vtkTable.
|-
| '''X Column'''<br>''(XColumn)''
|
This property specifies which data array is going to be used as the
X coordinate in the generated polydata dataset.
|
|
An array of scalars is required.
|-
| '''Y Column'''<br>''(YColumn)''
|
This property specifies which data array is going to be used as the
Y coordinate in the generated polydata dataset.
|
|
An array of scalars is required.
|-
| '''Z Column'''<br>''(ZColumn)''
|
This property specifies which data array is going to be used as the
Z coordinate in the generated polydata dataset.
|
|
An array of scalars is required.
|}
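The X/Y/Z Column mapping above can be pictured with a small plain-Python sketch (an illustration of the idea only, not ParaView's API; the table layout and function name are hypothetical):

```python
def table_to_points(table, x_col, y_col, z_col):
    """Build (x, y, z) point tuples from three named columns of a table,
    mimicking the X/Y/Z Column properties described above."""
    return list(zip(table[x_col], table[y_col], table[z_col]))

# A table with three numeric columns; each row becomes one point:
table = {"a": [0.0, 1.0], "b": [2.0, 3.0], "c": [4.0, 5.0]}
print(table_to_points(table, "a", "b", "c"))  # [(0.0, 2.0, 4.0), (1.0, 3.0, 5.0)]
```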


==Random Vectors==

This filter creates a new 3-component point data array and sets it as the default vector array. It uses a random number generator to create values.

The Random Vectors filter generates a point-centered array of random
vectors. It uses a random number generator to determine
the components of the vectors. This filter operates on any
type of data set, and the output data set will be of the
same type as the input.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the input to the Random Vectors
filter.
|
|
Accepts input of following types:
* vtkDataSet
|-
|'''MinimumSpeed''' (MinimumSpeed)
|
This property specifies the minimum length of the random
point vectors generated.
|
0
|
|-
|'''MaximumSpeed''' (MaximumSpeed)
|
This property specifies the maximum length of the random
point vectors generated.
|
1
|
|}

==Table To Structured Grid==

Converts a table to a structured grid.

The TableToStructuredGrid filter converts a vtkTable to a<br>
vtkStructuredGrid. One must specify the columns in the input table to<br>
use as the X, Y and Z coordinates for the points in the output, and the<br>
whole extent.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input.
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The dataset must contain an array with 1 component(s).
The selected dataset must be one of the following types (or a subclass of one of them): vtkTable.
|-
| '''Whole Extent'''<br>''(WholeExtent)''
|
| 0 0 0 0 0 0
|
|-
| '''X Column'''<br>''(XColumn)''
|
This property specifies which data array is going to be used as the
X coordinate in the generated polydata dataset.
|
|
An array of scalars is required.
|-
| '''Y Column'''<br>''(YColumn)''
|
This property specifies which data array is going to be used as the
Y coordinate in the generated polydata dataset.
|
|
An array of scalars is required.
|-
| '''Z Column'''<br>''(ZColumn)''
|
This property specifies which data array is going to be used as the
Z coordinate in the generated polydata dataset.
|
|
An array of scalars is required.
|}
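A whole extent `(x0, x1, y0, y1, z0, z1)` implies a fixed number of structured-grid points, and the input table must supply one row per point. A plain-Python sketch of that count (illustration only; not ParaView code):

```python
def extent_point_count(extent):
    """Number of points implied by a whole extent (x0, x1, y0, y1, z0, z1).
    The input table must provide exactly this many rows."""
    x0, x1, y0, y1, z0, z1 = extent
    return (x1 - x0 + 1) * (y1 - y0 + 1) * (z1 - z0 + 1)

print(extent_point_count((0, 1, 0, 2, 0, 0)))  # 6  (a 2 x 3 x 1 grid)
```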


==Rectilinear Data to Point Set==

Converts a rectilinear grid to an equivalent structured grid.

The Rectilinear Grid to Point Set
filter takes a rectilinear grid object and outputs an
equivalent Structured Grid (which is a type of point set). This
brings the data to a broader category of data storage but only
adds a small amount of overhead. This filter can be helpful in
applying filters that expect or manipulate point
coordinates.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
|
|
Accepts input of following types:
* vtkRectilinearGrid
|}

==Temporal Cache==

Saves a copy of the data set for a fixed number of time steps.

The Temporal Cache can be used to save multiple copies of a data set at different time steps to prevent thrashing in the pipeline caused by downstream filters that adjust the requested time step.  For example, assume that there is a downstream Temporal Interpolator filter. This filter will (usually) request two time steps from the upstream filters, which in turn (usually) causes the upstream filters to run twice, once for each time step.  The next time the interpolator requests the same two time steps, they might force the upstream filters to re-evaluate the same two time steps.  The Temporal Cache can keep copies of both of these time steps and provide the requested data without having to run upstream filters.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Cache Size'''<br>''(CacheSize)''
|
The cache size determines the number of time steps that can be cached at one time.  The maximum number is 10.  The minimum is 2 (since it makes little sense to cache less than that).
| 2
|
The value must be greater than or equal to 2 and less than or equal to 10.
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input of the Temporal Cache filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataObject.
|}
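The caching idea described above can be sketched in plain Python (an illustration of the behavior, not the vtkTemporalCache implementation; all names are hypothetical):

```python
from collections import OrderedDict

class TemporalCacheSketch:
    """Keep up to cache_size time steps so repeated requests (e.g. from a
    downstream interpolator) do not re-run the expensive upstream pipeline."""
    def __init__(self, compute, cache_size=2):
        assert cache_size >= 2          # mirrors the Cache Size minimum above
        self.compute = compute          # stands in for upstream evaluation
        self.cache = OrderedDict()
        self.cache_size = cache_size

    def request(self, t):
        if t not in self.cache:
            if len(self.cache) >= self.cache_size:
                self.cache.popitem(last=False)   # evict the oldest time step
            self.cache[t] = self.compute(t)
        return self.cache[t]

calls = []
cache = TemporalCacheSketch(lambda t: calls.append(t) or t * t, cache_size=2)
cache.request(1); cache.request(2); cache.request(1); cache.request(2)
print(calls)  # [1, 2] -- the repeated requests were served from the cache
```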


==Rectilinear Grid Connectivity==

Parallel fragments extraction and attributes integration on rectilinear grids.

Extracts material fragments from multi-block vtkRectilinearGrid datasets
based on the selected volume fraction array(s) and a fraction isovalue
and integrates the associated attributes.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the input of the
filter.
|
|
Accepts input of following types:
* vtkRectilinearGrid
* vtkCompositeDataSet
The dataset must contain a field array (cell)
with 1 component(s).
|-
|'''Double Volume Arrays''' (AddDoubleVolumeArrayName)
|
This property specifies the name(s) of the volume
fraction array(s) for generating parts.
|
|
An array of scalars is required.
|-
|'''Float Volume Arrays''' (AddFloatVolumeArrayName)
|
This property specifies the name(s) of the volume
fraction array(s) for generating parts.
|
|
An array of scalars is required.
|-
|'''Unsigned Character Volume Arrays''' (AddUnsignedCharVolumeArrayName)
|
This property specifies the name(s) of the volume
fraction array(s) for generating parts.
|
|
An array of scalars is required.
|-
|'''Volume Fraction Value''' (VolumeFractionSurfaceValue)
|
The value of this property is the volume fraction value
for the surface.
|
0.1
|
|}

==Temporal Interpolator==

Interpolate between time steps.

The Temporal Interpolator converts data that is defined at discrete time steps to one that is defined over a continuum of time by linearly interpolating the data's field data between two adjacent time steps.  The interpolated values are a simple approximation and should not be interpreted as anything more.  The Temporal Interpolator assumes that the topology between adjacent time steps does not change.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Discrete Time Step Interval'''<br>''(DiscreteTimeStepInterval)''
|
If Discrete Time Step Interval is set to 0, then the Temporal Interpolator will provide a continuous region of time on its output.  If set to anything else, then the output will define a finite set of time points on its output, each spaced by the Discrete Time Step Interval.  The output will have (time range)/(discrete time step interval) time steps.  (Note that the time range is defined by the time range of the data of the input filter, which may be different from other pipeline objects or the range defined in the animation inspector.)  This is a useful option to use if you have a dataset with one missing time step and wish to fill in the missing data with an interpolated value from the steps on either side.
| 0
|
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input of the Temporal Interpolator.
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataObject.
|}
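The linear interpolation between two adjacent time steps described above amounts to a per-value lerp. A plain-Python sketch (illustration only, not ParaView code; assumes the topology, and hence array length, is the same at both steps):

```python
def interpolate_field(t, t0, f0, t1, f1):
    """Linearly interpolate field values between two adjacent time steps
    t0 and t1, as the Temporal Interpolator does for each array entry."""
    w = (t - t0) / (t1 - t0)                      # interpolation weight in [0, 1]
    return [a + w * (b - a) for a, b in zip(f0, f1)]

# Field values at t=0 and t=1, sampled at t=0.25:
print(interpolate_field(0.25, 0.0, [0.0, 10.0], 1.0, [4.0, 20.0]))  # [1.0, 12.5]
```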


==RectilinearGridGeometryFilter==

Extracts geometry for a rectilinear grid. Output is a polydata dataset.

RectilinearGridGeometryFilter is a filter that extracts
geometry from a rectilinear grid. By specifying
appropriate i-j-k indices, it is possible to extract a
point, a curve, a surface, or a "volume". The volume is
actually a (n x m x o) region of points. The extent
specification is zero-offset. That is, the first k-plane
in a 50x50x50 rectilinear grid is given by (0,49, 0,49,
0,0).

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
Set the input to the Rectilinear Grid Geometry
filter.
|
|
Accepts input of following types:
* vtkDataSet
|}

==ReductionFilter==

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
Set the input to the Reduction filter.
|
|
|-
|'''PreGatherHelperName''' (PreGatherHelperName)
|
Set the algorithm that runs on each node in
parallel.
|
|
|-
|'''PostGatherHelperName''' (PostGatherHelperName)
|
Set the algorithm that takes multiple inputs and
produces a single reduced output.
|
|
|-
|'''PostGatherHelper''' (PostGatherHelper)
|
|
|
|-
|'''PreGatherHelper''' (PreGatherHelper)
|
|
|
|-
|'''PassThrough''' (PassThrough)
|
If set to a non-negative value, then produce results
using only the node Id specified.
|
-1
|
|-
|'''GenerateProcessIds''' (GenerateProcessIds)
|
If true, the filter will generate vtkOriginalProcessIds
arrays indicating the process id on which the cell/point was
generated.
|
0
|
Accepts boolean values (0 or 1).
|}

==Temporal Shift Scale==

Shift and scale time values.

The Temporal Shift Scale filter linearly transforms the time values of a pipeline object by applying a shift and then scale. Given data at time t on the input, it will be transformed to time t*Scale + Shift on the output.  Inversely, if this filter has a request for time t, it will request time (t-Shift)/Scale on its input.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
The input to the Temporal Shift Scale filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataObject.
|-
| '''Maximum Number Of Periods'''<br>''(MaximumNumberOfPeriods)''
|
If Periodic time is enabled, this controls how many time periods time is reported
for. A filter cannot output an infinite number of time steps and therefore a finite
number of periods is generated when reporting time.
| 1
|
The value must be greater than or equal to 0 and less than or equal to 100.
|-
| '''Periodic'''<br>''(Periodic)''
|
If Periodic is true, requests for time will be wrapped around so that
the source appears to be a periodic time source. If data exists for times
{0,N-1}, setting periodic to true will cause time 0 to be produced when time
N, 2N, 3N, etc. is requested. This effectively gives the source the ability to
generate time data indefinitely in a loop.
When combined with Shift/Scale, the time becomes periodic in the
shifted and scaled time frame of reference.
Note: Since the input time may not start at zero, the wrapping of time
from the end of one period to the start of the next will subtract the
initial time - a source with T{5..6} repeated periodically will have output
time {5..6..7..8} etc.
| 0
|
Only the values 0 and 1 are accepted.
|-
| '''Periodic End Correction'''<br>''(PeriodicEndCorrection)''
|
If Periodic time is enabled, this flag determines if the last time step is the same
as the first. If PeriodicEndCorrection is true, then it is assumed that the input
data goes from 0-1 (or whatever scaled/shifted actual time) and time 1 is the
same as time 0 so that steps will be 0,1,2,3...N,1,2,3...N,1,2,3 where step N
is the same as 0 and step 0 is not repeated. When this flag is false
the data is assumed to be literal and output is of the form 0,1,2,3...N,0,1,2,3...
By default this flag is ON.
| 1
|
Only the values 0 and 1 are accepted.
|-
| '''Post Shift'''<br>''(PostShift)''
|
The amount of time the input is shifted.
| 0
|
|-
| '''Pre Shift'''<br>''(PreShift)''
|
Apply a translation to the data before scaling.
To convert T{5,100} to T{0,1} use Preshift=-5, Scale=1/95, PostShift=0.
To convert T{5,105} to T{5,10} use Preshift=-5, Scale=5/100, PostShift=5.
| 0
|
|-
| '''Scale'''<br>''(Scale)''
|
The factor by which the input time is scaled.
| 1
|
|}
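The Pre Shift description above gives two worked conversions; they follow from applying the translation, then the scale, then the final translation. A plain-Python sketch of that arithmetic (illustration only, not ParaView code):

```python
def shift_scale(t, pre_shift=0.0, scale=1.0, post_shift=0.0):
    """Sketch of the time transform described above:
    translate by pre_shift, scale, then translate by post_shift."""
    return (t + pre_shift) * scale + post_shift

# The two worked conversions from the Pre Shift description:
print(shift_scale(5, pre_shift=-5, scale=1 / 95))                   # 0.0
print(shift_scale(105, pre_shift=-5, scale=5 / 100, post_shift=5))  # 10.0
```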


==Reflect==

This filter takes the union of the input and its reflection over an axis-aligned plane.

The Reflect filter reflects the input dataset across the
specified plane. This filter operates on any type of data
set and produces an unstructured grid
output.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the input to the Reflect
filter.
|
|
Accepts input of following types:
* vtkDataSet
|-
|'''Plane''' (Plane)
|
The value of this property determines which plane to
reflect across. If the value is X, Y, or Z, the value of the Center
property determines where the plane is placed along the specified axis.
The other six options (X Min, X Max, etc.) place the reflection plane
at the specified face of the bounding box of the input
dataset.
|
0
|
The value(s) is an enumeration of the following:
* X Min (0)
* Y Min (1)
* Z Min (2)
* X Max (3)
* Y Max (4)
* Z Max (5)
* X (6)
* Y (7)
* Z (8)
|-
|'''Center''' (Center)
|
If the value of the Plane property is X, Y, or Z, then
the value of this property specifies the center of the reflection
plane.
|
0.0
|
|-
|'''CopyInput''' (CopyInput)
|
If this property is set to 1, the output will contain
the union of the input dataset and its reflection. Otherwise the output
will contain only the reflection of the input data.
|
1
|
Accepts boolean values (0 or 1).
|}

==Temporal Snap-to-Time-Step==

Modifies the time range/steps of temporal data.

This filter modifies the time range or time steps of<br>
the data without changing the data itself. The data is not resampled<br>
by this filter, only the information accompanying the data is modified.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
| This property specifies the input of the filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataObject.
|-
| '''Snap Mode'''<br>''(SnapMode)''
|
Determine which time step to snap to.
| 0
|
The value must be one of the following: Nearest (0), NextBelowOrEqual (1), NextAboveOrEqual (2).
|}
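The three Snap Mode options above can be sketched in plain Python (illustration only, not ParaView code; values outside the time range are simply clamped to the nearest step in this sketch):

```python
import bisect

def snap_to_time_step(t, steps, mode="Nearest"):
    """Snap a requested time t onto sorted discrete time steps, following
    the Snap Mode options: Nearest, NextBelowOrEqual, NextAboveOrEqual."""
    below = steps[max(bisect.bisect_right(steps, t) - 1, 0)]
    above = steps[min(bisect.bisect_left(steps, t), len(steps) - 1)]
    if mode == "NextBelowOrEqual":
        return below
    if mode == "NextAboveOrEqual":
        return above
    return below if abs(t - below) <= abs(t - above) else above  # Nearest

steps = [0.0, 1.0, 2.0]
print(snap_to_time_step(1.2, steps))                      # 1.0
print(snap_to_time_step(1.2, steps, "NextAboveOrEqual"))  # 2.0
```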


==Resample AMR==

Converts AMR data to a uniform grid.

This filter allows the user to specify a Region of Interest (ROI)
within the AMR data-set and extract it as a uniform
grid.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the input for this
filter.
|
|
Accepts input of following types:
* vtkOverlappingAMR
|-
|'''Demand-Driven Mode''' (Demand-Driven Mode)
|
This property specifies whether the resampling filter
will operate in demand-driven mode or not.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''TransferToNodes''' (TransferToNodes)
|
This property specifies whether the solution will be
transferred to the nodes of the extracted region or the
cells.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''NumberOfPartitions''' (NumberOfPartitions)
|
Set the number of subdivisions for recursive coordinate
bisection.
|
1
|
|-
|'''Number of Samples''' (Number of Samples)
|
Sets the number of samples in each
dimension.
|
10 10 10
|
|-
|'''Min''' (Min)
|
This property sets the minimum 3-D coordinate location
by which the particles will be filtered out.
|
0.0 0.0 0.0
|
|-
|'''Max''' (Max)
|
This property sets the maximum 3-D coordinate location
by which the particles will be filtered out.
|
0.0 0.0 0.0
|
|}

==Temporal Statistics==

Loads in all time steps of a data set and computes some statistics about how each point and cell variable changes over time.

Given an input that changes over time, vtkTemporalStatistics looks<br>
at the data for each time step and computes some statistical<br>
information of how a point or cell variable changes over time.  For<br>
example, vtkTemporalStatistics can compute the average value of<br>
"pressure" over time of each point.<br><br><br>
Note that this filter will require the upstream filter to be run on<br>
every time step that it reports that it can compute.  This may be a<br>
time consuming operation.<br><br><br>
vtkTemporalStatistics ignores the temporal spacing.  Each timestep<br>
will be weighted the same regardless of how long of an interval it<br>
is to the next timestep.  Thus, the average statistic may be quite<br>
different from an integration of the variable if the time spacing<br>
varies.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Compute Average'''<br>''(ComputeAverage)''
|
Compute the average of each point and cell variable over time.
| 1
|
Only the values 0 and 1 are accepted.
|-
| '''Compute Maximum'''<br>''(ComputeMaximum)''
|
Compute the maximum of each point and cell variable over time.
| 1
|
Only the values 0 and 1 are accepted.
|-
| '''Compute Minimum'''<br>''(ComputeMinimum)''
|
Compute the minimum of each point and cell variable over time.
| 1
|
Only the values 0 and 1 are accepted.
|-
| '''Compute Standard Deviation'''<br>''(ComputeStandardDeviation)''
|
Compute the standard deviation of each point and cell variable over time.
| 1
|
Only the values 0 and 1 are accepted.
|-
| '''Input'''<br>''(Input)''
|
Set the input to the Temporal Statistics filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
|}
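The note above that every time step is weighted equally, regardless of temporal spacing, can be made concrete with a plain-Python sketch of the average statistic (illustration only, not the vtkTemporalStatistics implementation):

```python
def temporal_average(values_per_step):
    """Average a per-point variable over time steps. As noted above, each
    time step contributes equally, regardless of the spacing between steps."""
    n = len(values_per_step)
    return [sum(vals) / n for vals in zip(*values_per_step)]

# "pressure" at three time steps for two points:
print(temporal_average([[1.0, 10.0], [2.0, 20.0], [3.0, 30.0]]))  # [2.0, 20.0]
```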


==Resample With Dataset==

Sample data attributes at the points of a dataset.

Probe is a filter that computes point attributes at
specified point positions. The filter has two inputs: the
Input and Source. The 'Source' geometric structure is passed
through the filter. The point attributes are computed at
the 'Source' point positions by interpolating into the
'Input' data. For example, we can compute data values on a plane
(plane specified as Source) from a volume (Input). The
cell data of the Input data is copied to the output based
on in which Input cell each Source point is. If an array
of the same name exists both in Input's point and cell
data, only the one from the point data is
probed.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the dataset from which to obtain
probe values. The data attributes come from this dataset.
|
|
Accepts input of following types:
* vtkDataSet
* vtkCompositeDataSet
The dataset must contain a field array ()
|-
|'''Source''' (Source)
|
This property specifies the dataset whose geometry will
be used in determining positions to probe. The mesh comes from this
dataset.
|
|
Accepts input of following types:
* vtkDataSet
|-
|'''PassCellArrays''' (PassCellArrays)
|
When set the input's cell data arrays are shallow copied to the output.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''PassPointArrays''' (PassPointArrays)
|
When set the input's point data arrays are shallow copied to the output.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''PassFieldArrays''' (PassFieldArrays)
|
Set whether to pass the field-data arrays from the Input i.e. the input
providing the geometry to the output. On by default.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''ComputeTolerance''' (ComputeTolerance)
|
Set whether to compute the tolerance or to use a user provided
value. On by default.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''Tolerance''' (Tolerance)
|
Set the tolerance to use for
vtkDataSet::FindCell.
|
2.2204460492503131e-16
|
|}

==Tessellate==

Tessellate nonlinear curves, surfaces, and volumes with lines, triangles, and tetrahedra.

The Tessellate filter tessellates cells with nonlinear geometry and/or scalar fields into a simplicial complex with linearly interpolated field values that more closely approximate the original field. This is useful for datasets containing quadratic cells.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Chord Error'''<br>''(ChordError)''
|
This property controls the maximum chord error allowed at any edge midpoint in the output tessellation. The chord error is measured as the distance between the midpoint of any output edge and the original nonlinear geometry.
| 0.001
|
|-
| '''Field Error'''<br>''(FieldError2)''
|
This property controls the maximum field error allowed at any edge midpoint in the output tessellation. The field error is measured as the difference between a field value at the midpoint of an output edge and the value of the corresponding field in the original nonlinear geometry.
|
|
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Tessellate filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData, vtkDataSet, vtkUnstructuredGrid.
|-
| '''Maximum Number of Subdivisions'''<br>''(MaximumNumberOfSubdivisions)''
|
This property specifies the maximum number of times an edge may be subdivided. Increasing this number allows further refinement but can drastically increase the computational and storage requirements, especially when the value of the OutputDimension property is 3.
| 3
|
The value must be greater than or equal to 0 and less than or equal to 8.
|-
| '''Merge Points'''<br>''(MergePoints)''
|
If the value of this property is set to 1, coincident vertices will be merged after tessellation has occurred. Only geometry is considered during the merge and the first vertex encountered is the one whose point attributes will be used. Any discontinuities in point fields will be lost. On the other hand, many operations, such as streamline generation, require coincident vertices to be merged. Toggle whether to merge coincident vertices.
| 1
|
Only the values 0 and 1 are accepted.
|-
| '''Output Dimension'''<br>''(OutputDimension)''
|
The value of this property sets the maximum dimensionality of the output tessellation. When the value of this property is 3, 3D cells produce tetrahedra, 2D cells produce triangles, and 1D cells produce line segments. When the value is 2, 3D cells will have their boundaries tessellated with triangles. When the value is 1, all cells except points produce line segments.
| 3
|
The value must be greater than or equal to 1 and less than or equal to 3.
|-
| '''Reset Field Criteria'''<br>''(ResetFieldCriteria)''
|
|
|
|}

==Tetrahedralize==

This filter converts 3-d cells to tetrahedrons and polygons to triangles. The output is always of type unstructured grid.

The Tetrahedralize filter converts the 3D cells of any type of dataset to tetrahedrons and the 2D ones to triangles. This filter always produces unstructured grid output.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Tetrahedralize filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
|}

==Ribbon==

This filter generates a ribbon surface from lines. It is useful for displaying streamlines.

The Ribbon filter creates ribbons from the lines in the input data
set. This filter is useful for visualizing streamlines.
Both the input and output of this filter are polygonal
data. The input data set must also have at least one
point-centered vector array.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the input to the Ribbon
filter.
|
|
Accepts input of following types:
* vtkPolyData
The dataset must contain a field array (point)
with 3 component(s).
The dataset must contain a field array (point)
with 1 component(s).
|-
|'''Scalars''' (SelectInputScalars)
|
The value of this property indicates the name of the
input scalar array used by this filter. The width of the ribbons will
be varied based on the values in the specified array if the value of
the Width property is 1.
|
|
An array of scalars is required.
|-
|'''Vectors''' (SelectInputVectors)
|
The value of this property indicates the name of the
input vector array used by this filter. If the UseDefaultNormal
property is set to 0, the normal vectors for the ribbons come from the
specified vector array.
|
1
|
An array of vectors is required.
|-
|'''Width''' (Width)
|
If the VaryWidth property is set to 1, the value of this
property is the minimum ribbon width. If the VaryWidth property is set
to 0, the value of this property is half the width of the
ribbon.
|
1
|
The value must be less than the largest dimension of the
dataset multiplied by a scale factor of
0.01.
|-
|'''Angle''' (Angle)
|
The value of this property specifies the offset angle
(in degrees) of the ribbon from the line normal.
|
0
|
|-
|'''UseDefaultNormal''' (UseDefaultNormal)
|
If this property is set to 0, and the input contains no
vector array, then default ribbon normals will be generated
(DefaultNormal property); if a vector array has been set
(SelectInputVectors property), the ribbon normals will be set from the
specified array. If this property is set to 1, the default normal
(DefaultNormal property) will be used, regardless of whether the
SelectInputVectors property has been set.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''DefaultNormal''' (DefaultNormal)
|
The value of this property specifies the normal to use
when the UseDefaultNormal property is set to 1 or the input contains no
vector array (SelectInputVectors property).
|
0 0 1
|
|-
|'''VaryWidth''' (VaryWidth)
|
If this property is set to 1, the ribbon width will be
scaled according to the scalar array specified in the
SelectInputScalars property. Toggle the variation of ribbon width with
scalar value.
|
0
|
Accepts boolean values (0 or 1).
|}

==Texture Map to Cylinder==

Generate texture coordinates by mapping points to cylinder.

This is a filter that generates 2D texture coordinates by mapping input<br>
dataset points onto a cylinder. The cylinder is generated<br>
automatically by computing the axis of the cylinder. Note that the<br>
generated texture coordinates for the s-coordinate<br>
ranges from (0-1) (corresponding to angle of 0->360 around axis), while the<br>
mapping of the t-coordinate is controlled by the projection of points along<br>
the axis.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
Set the input to the Texture Map to Cylinder filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
|-
| '''Prevent Seam'''<br>''(PreventSeam)''
|
Control how the texture coordinates are generated. If Prevent Seam
is set, the s-coordinate ranges from 0->1 and 1->0 corresponding
to the theta angle variation between 0->180 and 180->0
degrees. Otherwise, the s-coordinate ranges from 0->1 between
0->360 degrees.
| 1
|
Only the values 0 and 1 are accepted.
|}
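The Prevent Seam behavior described above maps the angle around the cylinder axis to the s texture coordinate in one of two ways. A plain-Python sketch of that mapping (illustration only, not ParaView code; the function name is hypothetical):

```python
def cylinder_s_coordinate(theta_degrees, prevent_seam=True):
    """s texture coordinate for a point at angle theta around the cylinder
    axis. With Prevent Seam on, s runs 0->1 over 0->180 degrees and back
    1->0 over 180->360; otherwise s runs 0->1 over the full 0->360."""
    theta = theta_degrees % 360.0
    if prevent_seam:
        return theta / 180.0 if theta <= 180.0 else (360.0 - theta) / 180.0
    return theta / 360.0

print(cylinder_s_coordinate(90.0))                       # 0.5
print(cylinder_s_coordinate(270.0))                      # 0.5
print(cylinder_s_coordinate(270.0, prevent_seam=False))  # 0.75
```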


==Rotational Extrusion==


==Texture Map to Plane==
This filter generates a swept surface while translating the input along a circular path.
 
The Rotational Extrusion filter forms a surface by
 
rotating the input about the Z axis. This filter is
Generate texture coordinates by mapping points to plane.
intended to operate on 2D polygonal data. It produces
 
polygonal output.
TextureMapToPlane is a filter that generates 2D texture coordinates by<br>
mapping input dataset points onto a plane. The plane is generated<br>
automatically using a least squares method.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
This property specifies the input to the Rotational
Extrusion filter.
|
|
Set the input to the Texture Map to Plane filter.


|
|
Accepts input of following types:
* vtkPolyData
|-
|'''Resolution''' (Resolution)
|
The value of this property controls the number of
intermediate node points used in performing the sweep (rotating from 0
degrees to the value specified by the Angle property).
|
12
|
|
The selected object must be the result of the following: sources (includes readers), filters.


|-
|'''Capping''' (Capping)
|
If this property is set to 1, the open ends of the swept
surface will be capped with a copy of the input dataset. This works
properly if the input is a 2D surface composed of filled polygons. If
the input dataset is a closed solid (e.g., a sphere), then either two
copies of the dataset will be drawn or no surface will be drawn. No
surface is drawn if either this property is set to 0 or if the two
surfaces would occupy exactly the same 3D space (i.e., the Angle
property's value is a multiple of 360, and the values of the
Translation and DeltaRadius properties are 0).
|
1
|
Accepts boolean values (0 or 1).
|-
|'''Angle''' (Angle)
|
This property specifies the angle of rotation in
degrees. The surface is swept from 0 to the value of this
property.
|
360
|
|-
|'''Translation''' (Translation)
|
The value of this property specifies the total amount of
translation along the Z axis during the sweep process. Specifying a
non-zero value for this property allows you to create a corkscrew
(value of DeltaRadius &gt; 0) or spring effect.
|
0
|


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
|-
|'''DeltaRadius''' (DeltaRadius)
|
The value of this property specifies the change in
radius during the sweep process.
|
0
|




|}
|}
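The sweep performed by the Rotational Extrusion filter can be sketched for a single input point. This is a simplified illustration of how the Resolution, Angle, Translation, and DeltaRadius properties interact, under the assumption of linear interpolation across the sweep; the function name is hypothetical.

```python
import math

def rotational_extrusion_point(x, y, z, angle=360.0, resolution=12,
                               translation=0.0, delta_radius=0.0):
    """Sweep one input point about the z-axis, returning the intermediate
    points generated during the rotation (a per-point sketch of the
    Rotational Extrusion filter's defaults)."""
    r0 = math.hypot(x, y)
    theta0 = math.atan2(y, x)
    points = []
    for i in range(resolution + 1):
        t = i / resolution                      # fraction of the sweep
        theta = theta0 + math.radians(angle) * t
        r = r0 + delta_radius * t               # corkscrew radius change
        points.append((r * math.cos(theta),
                       r * math.sin(theta),
                       z + translation * t))    # spring/corkscrew lift
    return points
```

A nonzero `translation` lifts each revolution along z (a spring), and a nonzero `delta_radius` widens it (a corkscrew), matching the property descriptions above.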


==Scatter Plot==


==Texture Map to Sphere==
Creates a scatter plot from a dataset. This
 
filter creates a scatter plot from a
 
dataset.
Generate texture coordinates by mapping points to sphere.
 
This is a filter that generates 2D texture coordinates by mapping input<br>
dataset points onto a sphere. The sphere is generated automatically by<br>
computing its center, i.e. the averaged point coordinates. Note that the<br>
generated texture coordinates<br>
range between (0,1). The s-coordinate lies in the angular direction around<br>
the z-axis, measured counter-clockwise from the x-axis. The t-coordinate<br>
lies in the angular direction measured down from the north pole towards<br>
the south pole.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
Set the input to the Texture Map to Sphere filter.
 
|
|
This property specifies the input to the
filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
|-
| '''Prevent Seam'''<br>''(PreventSeam)''
|
|
Control how the texture coordinates are generated. If Prevent Seam
Accepts input of following types:
is set, the s-coordinate ranges from 0->1 and 1->0 corresponding
* vtkDataSet
to the theta angle variation between 0->180 and 180->0
degrees. Otherwise, the s-coordinate ranges from 0->1 between
0->360 degrees.
 
| 1
|
Only the values 0 and 1 are accepted.
 


|}
|}
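The spherical (s, t) convention described above can be sketched in plain Python. This is an illustrative reimplementation, not ParaView's code; the function name is hypothetical.

```python
import math

def sphere_texture_coords(point, center):
    """Map a 3D point to (s, t) on a sphere centered at `center`:
    s wraps counter-clockwise around the z-axis starting at the x-axis,
    t runs from the north pole (0) down to the south pole (1)."""
    dx, dy, dz = (point[i] - center[i] for i in range(3))
    r = math.sqrt(dx * dx + dy * dy + dz * dz)
    s = (math.atan2(dy, dx) % (2.0 * math.pi)) / (2.0 * math.pi)
    t = math.acos(dz / r) / math.pi   # polar angle, normalized to [0, 1]
    return s, t
```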


==Shrink==


==Threshold==
This filter shrinks each input cell so they pull away from their neighbors.The Shrink filter
 
causes the individual cells of a dataset to break apart
 
from each other by moving each cell's points toward the
This filter extracts cells that have point or cell scalars in the specified range.
centroid of the cell. (The centroid of a cell is the
 
average position of its points.) This filter operates on
The Threshold filter extracts the portions of the input dataset whose scalars lie within the specified range. This filter operates on either point-centered or cell-centered data. This filter operates on any type of dataset and produces unstructured grid output.<br><br><br>
any type of dataset and produces unstructured grid
To select between these two options, select either Point Data or Cell Data from the Attribute Mode menu. Once the Attribute Mode has been selected, choose the scalar array from which to threshold the data from the Scalars menu. The Lower Threshold and Upper Threshold sliders determine the range of the scalars to retain in the output. The All Scalars check box only takes effect when the Attribute Mode is set to Point Data. If the All Scalars option is checked, then a cell will only be passed to the output if the scalar values of all of its points lie within the range indicated by the Lower Threshold and Upper Threshold sliders. If unchecked, then a cell will be added to the output if the specified scalar value for any of its points is within the chosen range.<br>
output.


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''All Scalars'''<br>''(AllScalars)''
|'''Input''' (Input)
|
This property specifies the input to the Shrink
filter.
|
|
If the value of this property is 1, then a cell is only included in the output if the value of the selected array for all its points is within the threshold. This is only relevant when thresholding by a point-centered array.


| 1
|
|
Only the values 0 and 1 are accepted.
Accepts input of following types:
 
* vtkDataSet
 
|-
|-
| '''Input'''<br>''(Input)''
|'''ShrinkFactor''' (ShrinkFactor)
|
|
This property specifies the input to the Threshold filter.
The value of this property determines how far the points
 
will move. A value of 0 positions the points at the centroid of the
cell; a value of 1 leaves them at their original
positions.
|
|
0.5
|
|
The selected object must be the result of the following: sources (includes readers), filters.




The dataset must contain a point or cell array with 1 component.
|}


==Slice==


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
This filter slices a data set with a plane. Slicing is similar to a contour. It creates surfaces from volumes and lines from surfaces. This filter
 
extracts the portion of the input dataset that lies along
the specified plane. The Slice filter takes any type of
dataset as input. The output of this filter is polygonal
data.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Scalars'''<br>''(SelectInputScalars)''
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input to the Slice
filter.
|
|
The value of this property contains the name of the scalar array from which to perform thresholding.


|
|
Accepts input of following types:
* vtkDataSet
|-
|'''Slice Type''' (CutFunction)
|
This property sets the parameters of the slice
function.
|
|
|
An array of scalars is required.
The value can be one of the following:
* Plane (implicit_functions)


* Box (implicit_functions)


Valid array names will be chosen from point and cell data.
* Sphere (implicit_functions)


* Cylinder (implicit_functions)


|-
|-
| '''Threshold Range'''<br>''(ThresholdBetween)''
|'''InputBounds''' (InputBounds)
|
 
|
|
The values of this property specify the upper and lower bounds of the thresholding operation.


| 0 0
|
|
The value must lie within the range of the selected data array.


|-
|'''Crinkle slice''' (PreserveInputCells)
|
This parameter controls whether to extract the entire
cells that are sliced by the region or just extract a triangulated
surface of that region.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''Triangulate the slice''' (Triangulate the slice)
|
This parameter controls whether to produce triangles in the output.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''Slice Offset Values''' (ContourValues)
|
The values in this property specify a list of current
offset values. This can be used to create multiple slices with
different centers. Each entry represents a new slice with its center
shifted by the offset value.
|


|}
|


Determine the length of the dataset's diagonal.
The value must lie within -diagonal length to +diagonal length.
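The cell-acceptance rule for the Threshold filter, including the All Scalars option for point-centered data, can be sketched as follows. This is an illustrative reimplementation of the logic described above, not ParaView's code; the function name is hypothetical.

```python
def cell_passes_threshold(point_values, lower, upper, all_scalars=True):
    """Decide whether a cell passes a point-data threshold: with
    all_scalars on, every point of the cell must lie in [lower, upper];
    with it off, a single matching point suffices."""
    in_range = [lower <= v <= upper for v in point_values]
    return all(in_range) if all_scalars else any(in_range)
```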


==Transform==


|}


This filter applies a transformation to the input dataset.
==Slice (demand-driven-composite)==


The Transform filter allows you to specify the position, size, and orientation of polygonal, unstructured grid, and curvilinear data sets.<br>
This filter slices a data set with a plane. Slicing is similar to a contour. It creates surfaces from volumes and lines from surfaces. This filter
extracts the portion of the input dataset that lies along
the specified plane. The Slice filter takes any type of
dataset as input. The output of this filter is polygonal
data.


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
This property specifies the input to the Slice
filter.
|
|
This property specifies the input to the Transform filter.


|
|
Accepts input of following types:
* vtkDataObject
|-
|'''Slice Type''' (CutFunction)
|
This property sets the parameters of the slice
function.
|
|
The selected object must be the result of the following: sources (includes readers), filters.


|
The value can be one of the following:
* Plane (implicit_functions)


The selected dataset must be one of the following types (or a subclass of one of them): vtkPointSet.
* Box (implicit_functions)


* Sphere (implicit_functions)


|-
|-
| '''Transform'''<br>''(Transform)''
|'''InputBounds''' (InputBounds)
|
|
The values in this property allow you to specify the transform (translation, rotation, and scaling) to apply to the input dataset.


|
|
|
|
The selected object must be the result of the following: transforms.


|-
|'''Slice Offset Values''' (ContourValues)
|
The values in this property specify a list of current
offset values. This can be used to create multiple slices with
different centers. Each entry represents a new slice with its center
shifted by the offset value.
|


The value must be set to one of the following: Transform3.
|
 
Determine the length of the dataset's diagonal.
The value must lie within -diagonal length to +diagonal length.




|}
|}
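Applying a translation, rotation, and scaling to a point can be sketched as below. The order shown (scale, then rotate about z, then translate) is an illustrative choice, and the function name is hypothetical; the Transform filter composes the full rotate/scale/translate set internally.

```python
import math

def transform_point(p, translate=(0.0, 0.0, 0.0), rotate_z_deg=0.0,
                    scale=(1.0, 1.0, 1.0)):
    """Apply scale, then a rotation about the z-axis, then translation
    to a single point (a sketch of one possible composition order)."""
    x, y, z = (p[i] * scale[i] for i in range(3))
    a = math.radians(rotate_z_deg)
    # Standard 2D rotation in the xy-plane.
    x, y = x * math.cos(a) - y * math.sin(a), x * math.sin(a) + y * math.cos(a)
    return (x + translate[0], y + translate[1], z + translate[2])
```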


==Slice AMR data==


==Triangle Strips==
Slices AMR Data. This filter slices AMR
 
data.
 
This filter uses a greedy algorithm to convert triangles into triangle strips.
 
The Triangle Strips filter converts triangles into triangle strips and lines into polylines. This filter operates on polygonal data sets and produces polygonal output.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
This property specifies the input for this
filter.
|
|
This property specifies the input to the Triangle Strips filter.


|
|
Accepts input of following types:
* vtkOverlappingAMR
|-
|'''ForwardUpstream''' (ForwardUpstream)
|
This property specifies whether or not requests will be
propagated upstream.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''EnablePrefetching''' (EnablePrefetching)
|
This property specifies whether or not requests
pre-fetching of blocks of the next level will be
enabled.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''Level''' (Level)
|
Set maximum slice resolution.
|
0
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.


|-
|'''OffSet''' (OffSet)
|
Sets the offset from the origin of the
dataset.
|
1.0
|


|-
|-
| '''Maximum Length'''<br>''(MaximumLength)''
|'''Normal''' (Normal)
|
This property sets the normal of the
slice.
|
|
This property specifies the maximum number of triangles/lines to include in a triangle strip or polyline.
0
 
| 1000
|
|
The value must be greater than or equal to 4 and less than or equal to 100000.
The value(s) is an enumeration of the following:
 
* X-Normal (1)
* Y-Normal (2)
* Z-Normal (3)


|}
|}
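A triangle strip encodes each additional triangle with a single extra point id. Expanding a strip back into individual triangles (the inverse of what the Triangle Strips filter builds, and what the Triangulate filter does to strips) can be sketched as follows; the function name is hypothetical.

```python
def strip_to_triangles(strip):
    """Expand a triangle strip (a list of point ids) into individual
    triangles, swapping two ids on every other triangle so all output
    triangles keep a consistent winding."""
    triangles = []
    for i in range(len(strip) - 2):
        a, b, c = strip[i], strip[i + 1], strip[i + 2]
        # Even-numbered triangles keep their order; odd ones are flipped.
        triangles.append((a, b, c) if i % 2 == 0 else (b, a, c))
    return triangles
```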


==Slice Along PolyLine==


==Triangulate==
Slice along the surface defined by sweeping a polyline parallel to the z-axis.
The Slice Along PolyLine filter is similar to the Slice Filter except that it slices along a surface that
is defined by sweeping the input polyline parallel to the z-axis. Explained another way: take a laser
cutter and move it so that it hits every point on the input polyline while keeping it parallel
to the z-axis. The surface cut from the input dataset is the result.


This filter converts polygons and triangle strips to basic triangles.
The Triangulate filter decomposes polygonal data into only triangles, points, and lines. It separates triangle strips and polylines into individual triangles and lines, respectively. The output is polygonal data. Some filters that take polygonal data as input require that the data be composed of triangles rather than other polygons, so passing your data through this filter first is useful in such situations. You should use this filter in these cases rather than the Tetrahedralize filter because they produce different output dataset types. The filters referenced require polygonal input, and the Tetrahedralize filter produces unstructured grid output.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
|'''Dataset''' (Dataset)
|
 
Set the vtkDataObject to slice.
 
|
|
This property specifies the input to the Triangulate filter.


|
|
Accepts input of following types:
* vtkDataSet
|-
|'''PolyLine''' (PolyLine)
|
|
The selected object must be the result of the following: sources (includes readers), filters.


Set the polyline to slice along.


The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.
|


|
Accepts input of following types:
* vtkPolyData
|-
|'''Tolerance''' (Tolerance)
|


|}
The threshold used internally to determine correspondence between the polyline
and the output slice. If the output has sections missing, increasing this
value may help.


|
10
|


==Tube==


|}


Convert lines into tubes. Normals are used to avoid cracks between tube segments.
==Slice Generic Dataset==


The Tube filter creates tubes around the lines in the input polygonal dataset. The output is also polygonal.<br>
This filter cuts a data set with a plane or sphere. Cutting is similar to a contour. It creates surfaces from volumes and lines from surfaces. The
Generic Cut filter extracts the portion of the input data
set that lies along the specified plane or sphere. From
the Cut Function menu, you can select whether cutting will
be performed with a plane or a sphere. The appropriate 3D
widget (plane widget or sphere widget) will be displayed.
The parameters of the cut function can be specified
interactively using the 3D widget or manually using the
traditional user interface controls. Instructions for
using these 3D widgets and their corresponding user
interfaces are found in section 7.4. By default, the cut
lies on the specified plane or sphere. Using the Cut
Offset Values portion of the interface, it is also
possible to cut the data set at some offset from the
original cut function. The Cut Offset Values are in the
spatial units of the data set. To add a single offset,
select the value from the New Value slider in the Add
value portion of the interface and click the Add button,
or press Enter. To instead add several evenly spaced
offsets, use the controls in the Generate range of values
section. Select the number of offsets to generate using
the Number of Values slider. The Range slider controls the
interval in which to generate the offsets. Once the number
of values and range have been selected, click the Generate
button. The new offsets will be added to the Offset Values
list. To delete a value from the Cut Offset Values list,
select the value and click the Delete button. (If no value
is selected, the last value in the list will be removed.)
Clicking the Delete All button removes all the values in
the list. The Generic Cut filter takes a generic dataset
as input. Use the Input menu to choose a data set to cut.
The output of this filter is polygonal
data.


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Description'''
| '''Description'''
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
Set the input to the Generic Cut filter.
|
 
|
Accepts input of following types:
* vtkGenericDataSet
|-
|'''Cut Type''' (CutFunction)
|
Set the parameters to the implicit function used for
cutting.
|
 
|
The value can be one of the following:
* Plane (implicit_functions)
 
* Box (implicit_functions)
 
* Sphere (implicit_functions)
 
|-
|'''InputBounds''' (InputBounds)
|
 
|
 
|
 
|-
|'''Slice Offset Values''' (ContourValues)
|
The values in this property specify a list of current
offset values. This can be used to create multiple slices with
different centers. Each entry represents a new slice with its center
shifted by the offset value.
|
 
|
 
Determine the length of the dataset's diagonal.
The value must lie within -diagonal length to +diagonal length.
 
 
|}
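The "Generate range of values" controls described above produce evenly spaced offsets across a chosen range. That computation can be sketched as follows; the function name is hypothetical.

```python
def generate_offset_values(number_of_values, range_min, range_max):
    """Generate `number_of_values` evenly spaced cut offset values
    spanning [range_min, range_max], endpoints included."""
    if number_of_values == 1:
        return [0.5 * (range_min + range_max)]  # single value: midpoint
    step = (range_max - range_min) / (number_of_values - 1)
    return [range_min + i * step for i in range(number_of_values)]
```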
 
==Smooth==
 
This filter smooths a polygonal surface by iteratively moving points toward their neighbors.
The Smooth filter operates on a polygonal data set by
iteratively adjusting the position of the points using
Laplacian smoothing. (Because this filter only adjusts
point positions, the output data set is also polygonal.)
This results in better-shaped cells and more evenly
distributed points. The Convergence slider limits the
maximum motion of any point. It is expressed as a fraction
of the length of the diagonal of the bounding box of the
data set. If the maximum point motion during a smoothing
iteration is less than the Convergence value, the
smoothing operation terminates.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input to the Smooth
filter.
|
 
|
Accepts input of following types:
* vtkPolyData
|-
|'''Number of Iterations''' (NumberOfIterations)
|
This property sets the maximum number of smoothing
iterations to perform. More iterations produce better
smoothing.
|
20
|
 
|-
|'''Convergence''' (Convergence)
|
The value of this property limits the maximum motion of
any point. It is expressed as a fraction of the length of the diagonal
of the bounding box of the input dataset. If the maximum point motion
during a smoothing iteration is less than the value of this property,
the smoothing operation terminates.
|
0.0
|
 
 
|}
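Laplacian smoothing with a convergence check can be sketched as below. This is an illustrative reimplementation, not ParaView's code, and it uses an absolute motion threshold, whereas the filter's Convergence property is expressed as a fraction of the bounding-box diagonal; the function name and `relaxation` parameter are assumptions.

```python
def laplacian_smooth(points, neighbors, iterations=20, convergence=0.0,
                     relaxation=0.1):
    """Iteratively move each point a fraction of the way toward the
    average of its neighbors; stop early once the largest point motion
    in an iteration drops below `convergence`."""
    pts = [list(p) for p in points]
    for _ in range(iterations):
        max_motion = 0.0
        new_pts = []
        for i, p in enumerate(pts):
            nbrs = neighbors[i]
            if not nbrs:
                new_pts.append(p[:])          # boundary/isolated: fixed
                continue
            dim = len(p)
            avg = [sum(pts[j][k] for j in nbrs) / len(nbrs) for k in range(dim)]
            moved = [p[k] + relaxation * (avg[k] - p[k]) for k in range(dim)]
            max_motion = max(max_motion,
                             max(abs(moved[k] - p[k]) for k in range(dim)))
            new_pts.append(moved)
        pts = new_pts
        if max_motion < convergence:
            break
    return pts
```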
 
==StreakLine==
 
Trace Streak lines through time in a vector field.
The Particle Trace filter generates pathlines in a vector
field from a collection of seed points. The vector field
used is selected from the Vectors menu, so the input data
set is required to have point-centered vectors. The Seed
portion of the interface allows you to select whether the
seed points for this integration lie in a point cloud or
along a line. Depending on which is selected, the
appropriate 3D widget (point or line widget) is displayed
along with traditional user interface controls for
positioning the point cloud or line within the data set.
Instructions for using the 3D widgets and the
corresponding manual controls can be found in section 7.4.
This filter operates on any type of data set, provided it
has point-centered vectors. The output is polygonal data
containing polylines. This filter is available on the
Toolbar.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
Specify which is the Input of the StreamTracer
filter.
|
 
|
Accepts input of following types:
* vtkDataObject
The dataset must contain a field array (point)
 
with 3 component(s).
 
|-
|'''StaticSeeds''' (StaticSeeds)
|
If the input seeds are not changing, then this
can be set to 1 to avoid having to do a repeated grid search
that would return the exact same result.
 
|
0
|
Accepts boolean values (0 or 1).
|-
|'''StaticMesh''' (StaticMesh)
|
If the input grid is not changing, then this
can be set to 1 to avoid having to create cell locators for
each update.
 
|
0
|
Accepts boolean values (0 or 1).
|-
|'''Seed Source''' (Source)
|
Specify the seed dataset. Typically from where the
vector field integration should begin. Usually a point/radius or a line
with a given resolution.
|
 
|
Accepts input of following types:
* vtkDataSet
|-
|'''TerminationTime''' (TerminationTime)
|
Setting TerminationTime to a positive value will cause
particles to terminate when the time is reached. The units of time
should be consistent with the primary time variable.
|
0.0
|
 
|-
|'''TimestepValues''' (TimestepValues)
|
 
|
 
|
 
|-
|'''ForceReinjectionEveryNSteps''' (ForceReinjectionEveryNSteps)
|
When animating particles, it is useful to inject new ones
every Nth step to produce a continuous flow. Setting
ForceReinjectionEveryNSteps to a nonzero value causes the particle
source to reinject particles every Nth step even if it is otherwise
unchanged. Note that if the particle source is also animated, this flag
is redundant, as the particles will be reinjected whenever the
source changes anyway.
|
1
|
 
|-
|'''SelectInputVectors''' (SelectInputVectors)
|
Specify which vector array should be used for the
integration through that filter.
|
 
|
An array of vectors is required.
|-
|'''ComputeVorticity''' (ComputeVorticity)
|
Compute vorticity and angular rotation of particles as
they progress
|
1
|
Accepts boolean values (0 or 1).
|-
|'''DisableResetCache''' (DisableResetCache)
|
Prevents the cache from being reset so that new computations
always start from previous results.
|
0
|
Accepts boolean values (0 or 1).
 
|}
 
==Stream Tracer==
 
Integrate streamlines in a vector field. The
Stream Tracer filter generates streamlines in a vector
field from a collection of seed points. Production of
streamlines terminates if a streamline crosses the
exterior boundary of the input dataset. Other reasons for
termination are listed for the MaximumNumberOfSteps,
TerminalSpeed, and MaximumPropagation properties. This
filter operates on any type of dataset, provided it has
point-centered vectors. The output is polygonal data
containing polylines.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input to the Stream Tracer
filter.
|
 
|
Accepts input of following types:
* vtkDataSet
The dataset must contain a field array (any)
 
with 3 component(s).
 
|-
|'''Vectors''' (SelectInputVectors)
|
This property contains the name of the vector array from
which to generate streamlines.
|
 
|
An array of vectors is required.
|-
|'''InterpolatorType''' (InterpolatorType)
|
This property determines which interpolator to use for
evaluating the velocity vector field. The first is faster though the
second is more robust in locating cells during streamline
integration.
|
0
|
The value(s) is an enumeration of the following:
* Interpolator with Point Locator (0)
* Interpolator with Cell Locator (1)
|-
|'''Surface Streamlines''' (Surface Streamlines)
|
Specify whether or not to compute surface
streamlines.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''IntegrationDirection''' (IntegrationDirection)
|
This property determines in which direction(s) a
streamline is generated.
|
2
|
The value(s) is an enumeration of the following:
* FORWARD (0)
* BACKWARD (1)
* BOTH (2)
|-
|'''IntegratorType''' (IntegratorType)
|
This property determines which integrator (with
increasing accuracy) to use for creating streamlines.
|
2
|
The value(s) is an enumeration of the following:
* Runge-Kutta 2 (0)
* Runge-Kutta 4 (1)
* Runge-Kutta 4-5 (2)
|-
|'''Integration Step Unit''' (IntegrationStepUnit)
|
This property specifies the unit for
Minimum/Initial/Maximum integration step size. The Length unit refers
to the arc length that a particle travels/advects within a single step.
The Cell Length unit represents the step size as a number of
cells.
|
2
|
The value(s) is an enumeration of the following:
* Length (1)
* Cell Length (2)
|-
|'''Initial Step Length''' (InitialIntegrationStep)
|
This property specifies the initial integration step
size. For non-adaptive integrators (Runge-Kutta 2 and Runge-Kutta 4),
it is fixed (always equal to this initial value) throughout the
integration. For an adaptive integrator (Runge-Kutta 4-5), the actual
step size varies such that the numerical error is less than a specified
threshold.
|
0.2
|
 
|-
|'''Minimum Step Length''' (MinimumIntegrationStep)
|
When using the Runge-Kutta 4-5 integrator, this property
specifies the minimum integration step size.
|
0.01
|
 
|-
|'''Maximum Step Length''' (MaximumIntegrationStep)
|
When using the Runge-Kutta 4-5 integrator, this property
specifies the maximum integration step size.
|
0.5
|
 
|-
|'''Maximum Steps''' (MaximumNumberOfSteps)
|
This property specifies the maximum number of steps,
beyond which streamline integration is terminated.
|
2000
|
 
|-
|'''Maximum Streamline Length''' (MaximumPropagation)
|
This property specifies the maximum streamline length
(i.e., physical arc length), beyond which line integration is
terminated.
|
1.0
|
 
The value must be less than the largest dimension of the
dataset multiplied by a scale factor of
1.0.
 
|-
|'''Terminal Speed''' (TerminalSpeed)
|
This property specifies the terminal speed, below which
particle advection/integration is terminated.
|
0.000000000001
|
 
|-
|'''MaximumError''' (MaximumError)
|
This property specifies the maximum error (for
Runge-Kutta 4-5) tolerated throughout streamline integration. The
Runge-Kutta 4-5 integrator tries to adjust the step size such that the
estimated error is less than this threshold.
|
0.000001
|
 
|-
|'''ComputeVorticity''' (ComputeVorticity)
|
Specify whether or not to compute
vorticity.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''Seed Type''' (Source)
|
The value of this property determines how the seeds for
the streamlines will be generated.
|
 
|
The value can be one of the following:
* PointSource (extended_sources)
 
* HighResLineSource (extended_sources)
 
 
|}
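The core of streamline integration, including the termination criteria documented above (maximum steps, maximum streamline length, terminal speed), can be sketched with a second-order Runge-Kutta (midpoint) integrator. This is a simplified single-streamline illustration, not ParaView's implementation; the velocity is normalized so the step size is an arc length (the "Length" step unit), and the function name is hypothetical.

```python
import math

def trace_streamline(velocity, seed, step=0.25, max_steps=2000,
                     max_length=1.0, terminal_speed=1e-12):
    """Trace one streamline through the vector field `velocity` (a
    callable mapping a 3D point to a 3D vector) using midpoint RK2,
    stopping on step count, arc length, or terminal speed."""
    points = [tuple(seed)]
    length = 0.0
    p = list(seed)
    for _ in range(max_steps):
        v = velocity(p)
        speed = math.sqrt(sum(c * c for c in v))
        if speed < terminal_speed:
            break
        # Midpoint rule: sample the velocity half a step ahead.
        mid = [p[i] + 0.5 * step * v[i] / speed for i in range(3)]
        vm = velocity(mid)
        sm = math.sqrt(sum(c * c for c in vm))
        if sm < terminal_speed:
            break
        p = [p[i] + step * vm[i] / sm for i in range(3)]
        length += step
        points.append(tuple(p))
        if length >= max_length:
            break
    return points
```

In a uniform field the trace is a straight line of `max_length / step` segments; in a zero field it terminates immediately on the terminal-speed criterion.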
 
==Stream Tracer For Generic Datasets==
 
Integrate streamlines in a vector field. The
Generic Stream Tracer filter generates streamlines in a
vector field from a collection of seed points. The vector
field used is selected from the Vectors menu, so the input
data set is required to have point-centered vectors. The
Seed portion of the interface allows you to select whether
the seed points for this integration lie in a point cloud
or along a line. Depending on which is selected, the
appropriate 3D widget (point or line widget) is displayed
along with traditional user interface controls for
positioning the point cloud or line within the data set.
Instructions for using the 3D widgets and the
corresponding manual controls can be found in section 7.4.
The Max. Propagation entry box allows you to specify the
maximum length of the streamlines. From the Max.
Propagation menu, you can select the units to be either
Time (the time a particle would travel with steady flow)
or Length (in the data set's spatial coordinates). The
Init. Step Len. menu and entry specify the initial step
size for integration. (For non-adaptive integrators,
Runge-Kutta 2 and 4, the initial step size is used
throughout the integration.) The menu allows you to
specify the units. Time and Length have the same meaning
as for Max. Propagation. Cell Length specifies the step
length as a number of cells. The Integration Direction
menu determines in which direction(s) the stream trace
will be generated: FORWARD, BACKWARD, or BOTH. The
Integrator Type section of the interface determines which
calculation to use for integration: Runge-Kutta 2,
Runge-Kutta 4, or Runge-Kutta 4-5. If Runge-Kutta 4-5 is
selected, controls are displayed for specifying the
minimum and maximum step length and the maximum error. The
controls for specifying Min. Step Len. and Max. Step Len.
are the same as those for Init. Step Len. The Runge-Kutta
4-5 integrator tries to choose the step size so that the
estimated error is less than the value of the Maximum
Error entry. If the integration takes more than Max. Steps
to complete, if the speed goes below Term. Speed, if Max.
Propagation is reached, or if a boundary of the input data
set is crossed, integration terminates. This filter
operates on any type of data set, provided it has
point-centered vectors. The output is polygonal data
containing polylines.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
Set the input to the Generic Stream Tracer
filter.
|
 
|
Accepts input of following types:
* vtkGenericDataSet
The dataset must contain a field array (point)
 
with 3 component(s).
 
|-
|'''Seed Type''' (Source)
|
The value of this property determines how the seeds for
the streamlines will be generated.
|
 
|
The value can be one of the following:
* PointSource (extended_sources)
 
* HighResLineSource (extended_sources)
 
|-
|'''Vectors''' (SelectInputVectors)
|
This property contains the name of the vector array from
which to generate streamlines.
|
 
|
An array of vectors is required.
|-
|'''MaximumPropagation''' (MaximumPropagation)
|
Specify the maximum streamline length.
|
1.0
|
 
The value must be less than the largest dimension of the
dataset multiplied by a scale factor of
1.0.
 
|-
|'''InitialIntegrationStep''' (InitialIntegrationStep)
|
Specify the initial integration step.
|
0.5
|
 
|-
|'''IntegrationDirection''' (IntegrationDirection)
|
This property determines in which direction(s) a
streamline is generated.
|
2
|
The value(s) is an enumeration of the following:
* FORWARD (0)
* BACKWARD (1)
* BOTH (2)
|-
|'''IntegratorType''' (IntegratorType)
|
This property determines which integrator (with
increasing accuracy) to use for creating streamlines.
|
2
|
The value(s) is an enumeration of the following:
* Runge-Kutta 2 (0)
* Runge-Kutta 4 (1)
* Runge-Kutta 4-5 (2)
|-
|'''MaximumError''' (MaximumError)
|
Set the maximum error allowed in the integration. The
meaning of this value depends on the integrator chosen.
|
0.000001
|
 
|-
|'''MinimumIntegrationStep''' (MinimumIntegrationStep)
|
Specify the minimum integration step.
|
0.01
|
 
|-
|'''IntegrationStepUnit''' (IntegrationStepUnit)
|
Choose the unit to use for the integration
step.
|
2
|
The value(s) is an enumeration of the following:
* Time (0)
* Length (1)
* Cell Length (2)
|-
|'''MaximumIntegrationStep''' (MaximumIntegrationStep)
|
Specify the maximum integration step.
|
0.01
|
 
|-
|'''MaximumNumberOfSteps''' (MaximumNumberOfSteps)
|
Specify the maximum number of steps used in the
integration.
|
2000
|
 
|-
|'''TerminalSpeed''' (TerminalSpeed)
|
If at any point the speed is below this value, the
integration is terminated.
|
0.000000000001
|
 
 
|}
 
==Stream Tracer With Custom Source==
 
Integrate streamlines in a vector field. The
Stream Tracer filter generates streamlines in a vector
field from a collection of seed points. Production of
streamlines terminates if a streamline crosses the
exterior boundary of the input dataset. Other reasons for
termination are listed for the MaximumNumberOfSteps,
TerminalSpeed, and MaximumPropagation properties. This
filter operates on any type of dataset, provided it has
point-centered vectors. The output is polygonal data
containing polylines. This filter takes a Source input
that provides the seed points.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input to the Stream Tracer
filter.
|
 
|
Accepts input of following types:
* vtkDataSet
The dataset must contain a field array (point)
 
with 3 component(s).
 
|-
|'''Vectors''' (SelectInputVectors)
|
This property contains the name of the vector array from
which to generate streamlines.
|
 
|
An array of vectors is required.
|-
|'''Surface Streamlines''' (SurfaceStreamlines)
|
Specify whether or not to compute surface
streamlines.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''IntegrationDirection''' (IntegrationDirection)
|
This property determines in which direction(s) a
streamline is generated.
|
2
|
The value(s) is an enumeration of the following:
* FORWARD (0)
* BACKWARD (1)
* BOTH (2)
|-
|'''IntegratorType''' (IntegratorType)
|
This property determines which integrator (with
increasing accuracy) to use for creating streamlines.
|
2
|
The value(s) is an enumeration of the following:
* Runge-Kutta 2 (0)
* Runge-Kutta 4 (1)
* Runge-Kutta 4-5 (2)
|-
|'''Integration Step Unit''' (IntegrationStepUnit)
|
This property specifies the unit for
Minimum/Initial/Maximum integration step size. The Length unit refers
to the arc length that a particle travels/advects within a single step.
The Cell Length unit represents the step size as a number of
cells.
|
2
|
The value(s) is an enumeration of the following:
* Length (1)
* Cell Length (2)
|-
|'''Initial Step Length''' (InitialIntegrationStep)
|
This property specifies the initial integration step
size. For non-adaptive integrators (Runge-Kutta 2 and Runge-Kutta 4),
it is fixed (always equal to this initial value) throughout the
integration. For an adaptive integrator (Runge-Kutta 4-5), the actual
step size varies such that the numerical error is less than a specified
threshold.
|
0.2
|
 
|-
|'''Minimum Step Length''' (MinimumIntegrationStep)
|
When using the Runge-Kutta 4-5 integrator, this property
specifies the minimum integration step size.
|
0.01
|
 
|-
|'''Maximum Step Length''' (MaximumIntegrationStep)
|
When using the Runge-Kutta 4-5 integrator, this property
specifies the maximum integration step size.
|
0.5
|
 
|-
|'''Maximum Steps''' (MaximumNumberOfSteps)
|
This property specifies the maximum number of steps,
beyond which streamline integration is terminated.
|
2000
|
 
|-
|'''Maximum Streamline Length''' (MaximumPropagation)
|
This property specifies the maximum streamline length
(i.e., physical arc length), beyond which line integration is
terminated.
|
1.0
|
 
The value must be less than the largest dimension of the
dataset multiplied by a scale factor of
1.0.
 
|-
|'''Terminal Speed''' (TerminalSpeed)
|
This property specifies the terminal speed, below which
particle advection/integration is terminated.
|
0.000000000001
|
 
|-
|'''MaximumError''' (MaximumError)
|
This property specifies the maximum error (for
Runge-Kutta 4-5) tolerated throughout streamline integration. The
Runge-Kutta 4-5 integrator tries to adjust the step size such that the
estimated error is less than this threshold.
|
0.000001
|
 
|-
|'''ComputeVorticity''' (ComputeVorticity)
|
Specify whether or not to compute
vorticity.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''Seed Source''' (Source)
|
This property specifies the input used to obtain the
seed points.
|
 
|
 
 
|}
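
The termination criteria above (Maximum Steps, Maximum Streamline Length, Terminal Speed) can be illustrated with a small, self-contained Python sketch. This is not ParaView's implementation (vtkStreamTracer operates on VTK datasets in 3D); the 2D `velocity` callback and the fixed-step Runge-Kutta 4 scheme are simplifying assumptions for the example.

```python
import math

def trace_streamline(velocity, seed, step=0.25, max_steps=2000,
                     max_length=1.0, terminal_speed=1e-12):
    """Fixed-step Runge-Kutta 4 streamline integration in 2D.

    Integration stops when the step budget (MaximumNumberOfSteps), the
    arc-length budget (MaximumPropagation), or the terminal-speed
    criterion (TerminalSpeed) is reached."""
    x, y = seed
    pts = [seed]
    length = 0.0
    for _ in range(max_steps):
        k1 = velocity((x, y))
        if math.hypot(k1[0], k1[1]) < terminal_speed:
            break
        # Classical RK4 stage evaluations of the vector field
        k2 = velocity((x + 0.5 * step * k1[0], y + 0.5 * step * k1[1]))
        k3 = velocity((x + 0.5 * step * k2[0], y + 0.5 * step * k2[1]))
        k4 = velocity((x + step * k3[0], y + step * k3[1]))
        dx = step * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]) / 6.0
        dy = step * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]) / 6.0
        length += math.hypot(dx, dy)
        if length > max_length:
            break
        x, y = x + dx, y + dy
        pts.append((x, y))
    return pts

# Uniform flow along +x: the polyline stops once its length reaches 1.0.
line = trace_streamline(lambda p: (1.0, 0.0), (0.0, 0.0))
```

With seed points supplied by the Source input, the filter repeats this per seed, and in both directions when IntegrationDirection is BOTH.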
 
==Subdivide==
 
This filter iteratively divides triangles into four smaller triangles. New points are placed linearly so the output surface matches the input surface.
The Subdivide filter iteratively divides each triangle in
the input dataset into 4 new triangles. Three new points
are added per triangle -- one at the midpoint of each
edge. This filter operates only on polygonal data
containing triangles, so run your polygonal data through
the Triangulate filter first if it is not composed of
triangles. The output of this filter is also
polygonal.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This parameter specifies the input to the Subdivide
filter.
|
 
|
Accepts input of following types:
* vtkPolyData
|-
|'''Number of Subdivisions''' (NumberOfSubdivisions)
|
The value of this property specifies the number of
subdivision iterations to perform.
|
1
|
 
 
|}
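
The midpoint rule described above is easy to sketch in plain Python. This is an illustrative 2D version, not the VTK implementation; the triangle and point representations are assumptions for the example.

```python
def subdivide(triangles, iterations=1):
    """One pass of the Subdivide filter's scheme: each triangle (three
    (x, y) points) is replaced by four, with new points at the midpoint
    of each edge, so the refined surface lies on the input surface."""
    def mid(a, b):
        return ((a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0)

    for _ in range(iterations):
        out = []
        for a, b, c in triangles:
            ab, bc, ca = mid(a, b), mid(b, c), mid(c, a)
            # three corner triangles plus the central one
            out += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
        triangles = out
    return triangles

# Two iterations turn one triangle into 4, then 16.
tris = subdivide([((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))], iterations=2)
```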
 
==Surface Flow==
 
This filter integrates flow through a surface.
The flow integration filter integrates the dot product of
a point flow vector field and surface normal. It computes
the net flow across the 2D surface. It operates on any
type of dataset and produces an unstructured grid
output.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input to the Surface Flow
filter.
|
 
|
Accepts input of following types:
* vtkDataSet
The dataset must contain a field array (point)
 
with 3 component(s).
 
|-
|'''SelectInputVectors''' (SelectInputVectors)
|
The value of this property specifies the name of the
input vector array containing the flow vector field.
|
 
|
An array of vectors is required.
 
|}
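
The integral the filter computes, the sum of (v &middot; n) dA over the surface, can be sketched as follows. This is an illustrative version assuming a triangulated surface and a `velocity` callback evaluated at triangle centroids, not the filter's point-based quadrature.

```python
def surface_flow(triangles, velocity):
    """Net flow across a triangulated surface: sum of (v . n) dA.

    Uses the un-normalized cross product of two triangle edges, whose
    magnitude is twice the triangle area, so each contribution is
    0.5 * (v . n2)."""
    def sub(p, q):
        return (p[0] - q[0], p[1] - q[1], p[2] - q[2])

    def cross(u, v):
        return (u[1] * v[2] - u[2] * v[1],
                u[2] * v[0] - u[0] * v[2],
                u[0] * v[1] - u[1] * v[0])

    flow = 0.0
    for a, b, c in triangles:
        n2 = cross(sub(b, a), sub(c, a))   # normal with length 2 * area
        cx = tuple((a[i] + b[i] + c[i]) / 3.0 for i in range(3))
        v = velocity(cx)
        flow += 0.5 * (v[0] * n2[0] + v[1] * n2[1] + v[2] * n2[2])
    return flow

# Unit square in the xy-plane, uniform flow of speed 2 along +z:
quad = [((0, 0, 0), (1, 0, 0), (1, 1, 0)),
        ((0, 0, 0), (1, 1, 0), (0, 1, 0))]
net = surface_flow(quad, lambda p: (0.0, 0.0, 2.0))
```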
 
==Surface Vectors==
 
This filter constrains vectors to lie on a surface.
The Surface Vectors filter is used for 2D data sets. It
constrains vectors to lie in a surface by removing
components of the vectors normal to the local
surface.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input to the Surface Vectors
filter.
|
 
|
Accepts input of following types:
* vtkDataSet
The dataset must contain a field array (point)
 
with 3 component(s).
 
|-
|'''SelectInputVectors''' (SelectInputVectors)
|
This property specifies the name of the input vector
array to process.
|
 
|
An array of vectors is required.
|-
|'''ConstraintMode''' (ConstraintMode)
|
This property specifies whether the vectors will be
parallel or perpendicular to the surface. If the value is set to
PerpendicularScale (2), then the output will contain a scalar array
with the dot product of the surface normal and the vector at each
point.
|
0
|
The value(s) is an enumeration of the following:
* Parallel (0)
* Perpendicular (1)
* PerpendicularScale (2)
 
|}
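
The three ConstraintMode behaviors reduce to simple vector algebra on each point vector v and the local unit surface normal n. A minimal sketch, assuming the normal is already known (the filter computes it from the surface):

```python
def constrain_to_surface(v, n, mode="Parallel"):
    """Surface Vectors constraint for one vector.

    Parallel removes the component of v along the unit normal n,
    Perpendicular keeps only that component, and PerpendicularScale
    returns the scalar dot product (v . n)."""
    dot = sum(vi * ni for vi, ni in zip(v, n))
    if mode == "Parallel":
        return tuple(vi - dot * ni for vi, ni in zip(v, n))
    if mode == "Perpendicular":
        return tuple(dot * ni for ni in n)
    return dot  # PerpendicularScale

# Flatten a vector onto the z = const surface (normal along +z):
tangent = constrain_to_surface((1.0, 2.0, 3.0), (0.0, 0.0, 1.0))
```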
 
==Table FFT==
 
Performs the Fast Fourier Transform on the columns of a
table.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input of the
filter.
|
 
|
Accepts input of following types:
* vtkTable
The dataset must contain a field array (row)
 
with 1 component(s).
 
 
|}
 
==Table To Points==
 
Converts a table to a set of points. The
TableToPolyData filter converts a vtkTable to a set of
points in a vtkPolyData. One must specify the columns in
the input table to use as the X, Y and Z coordinates for
the points in the output.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input.
|
 
|
Accepts input of following types:
* vtkTable
The dataset must contain a field array (row)
 
with 1 component(s).
 
|-
|'''XColumn''' (XColumn)
|
This property specifies which data array is going to be
used as the X coordinate in the generated polydata
dataset.
|
 
|
 
|-
|'''YColumn''' (YColumn)
|
This property specifies which data array is going to be
used as the Y coordinate in the generated polydata
dataset.
|
 
|
 
|-
|'''ZColumn''' (ZColumn)
|
This property specifies which data array is going to be
used as the Z coordinate in the generated polydata
dataset.
|
 
|
 
|-
|'''2D Points''' (Create2DPoints)
|
Specify whether the points of the polydata are 3D or 2D.
If this is set to true then the Z Column will be ignored and the z
value of each point on the polydata will be set to 0. By default this
will be off.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''KeepAllDataArrays''' (KeepAllDataArrays)
|
Allow user to keep columns specified as X,Y,Z as Data
arrays. By default this will be off.
|
0
|
Accepts boolean values (0 or 1).
 
|}
 
==Table To Structured Grid==
 
Converts a table to a structured grid. The
TableToStructuredGrid filter converts a vtkTable to a
vtkStructuredGrid. One must specify the columns in the
input table to use as the X, Y and Z coordinates for the
points in the output, as well as the whole
extent.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input.
|
 
|
Accepts input of following types:
* vtkTable
The dataset must contain a field array (row)
 
with 1 component(s).
 
|-
|'''WholeExtent''' (WholeExtent)
|
 
|
0 0 0 0 0 0
|
 
|-
|'''XColumn''' (XColumn)
|
This property specifies which data array is going to be
used as the X coordinate in the generated structured grid
dataset.
|
 
|
 
|-
|'''YColumn''' (YColumn)
|
This property specifies which data array is going to be
used as the Y coordinate in the generated structured grid
dataset.
|
 
|
 
|-
|'''ZColumn''' (ZColumn)
|
This property specifies which data array is going to be
used as the Z coordinate in the generated structured grid
dataset.
|
 
|
 
 
|}
 
==Temporal Cache==
 
Saves a copy of the data set for a fixed number of time steps. The Temporal Cache
can be used to save multiple copies of a data set at
different time steps to prevent thrashing in the pipeline
caused by downstream filters that adjust the requested
time step. For example, assume that there is a downstream
Temporal Interpolator filter. This filter will (usually)
request two time steps from the upstream filters, which in
turn (usually) causes the upstream filters to run twice,
once for each time step. The next time the interpolator
requests the same two time steps, they might force the
upstream filters to re-evaluate the same two time steps.
The Temporal Cache can keep copies of both of these time
steps and provide the requested data without having to run
upstream filters.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input of the Temporal Cache
filter.
|
 
|
Accepts input of following types:
* vtkDataObject
|-
|'''CacheSize''' (CacheSize)
|
The cache size determines the number of time steps that
can be cached at one time. The maximum number is 10. The minimum is 2
(since it makes little sense to cache less than that).
|
2
|
 
|-
|'''TimestepValues''' (TimestepValues)
|
 
|
 
|
 
 
|}
 
==Temporal Interpolator==
 
Interpolate between time steps. The Temporal
Interpolator converts data that is defined at discrete
time steps to one that is defined over a continuum of time
by linearly interpolating the data's field data between
two adjacent time steps. The interpolated values are a
simple approximation and should not be interpreted as
anything more. The Temporal Interpolator assumes that the
topology between adjacent time steps does not
change.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input of the Temporal
Interpolator.
|
 
|
Accepts input of following types:
* vtkDataObject
|-
|'''DiscreteTimeStepInterval''' (DiscreteTimeStepInterval)
|
If Discrete Time Step Interval is set to 0, then the
Temporal Interpolator will provide a continuous region of time on its
output. If set to anything else, then the output will define a finite
set of time points on its output, each spaced by the Discrete Time Step
Interval. The output will have (time range)/(discrete time step
interval) time steps. (Note that the time range is defined by the time
range of the data of the input filter, which may be different from
other pipeline objects or the range defined in the animation
inspector.) This is a useful option to use if you have a dataset with
one missing time step and wish to 'fill in' the missing data with an
interpolated value from the steps on either side.
|
0.0
|
 
|-
|'''TimestepValues''' (TimestepValues)
|
 
|
 
|
 
|-
|'''TimeRange''' (TimeRange)
|
 
|
 
|
 
 
|}
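
The interpolation the filter performs per point or cell value is ordinary linear interpolation between the two bracketing time steps. A minimal sketch for a single scalar value (topology is assumed constant between steps, as the filter requires):

```python
def interpolate_in_time(steps, t):
    """Linearly interpolate a value between the two discrete time steps
    that bracket time `t`.  `steps` is a sorted list of (time, value)
    pairs; requests outside the range clamp to the end values."""
    if t <= steps[0][0]:
        return steps[0][1]
    if t >= steps[-1][0]:
        return steps[-1][1]
    for (t0, v0), (t1, v1) in zip(steps, steps[1:]):
        if t0 <= t <= t1:
            w = (t - t0) / (t1 - t0)
            return v0 * (1.0 - w) + v1 * w

# "pressure" at one point, sampled at t = 0 and t = 1; request t = 0.25
p = interpolate_in_time([(0.0, 10.0), (1.0, 30.0)], 0.25)
```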
 
==Temporal Particles To Pathlines==
 
Creates polylines representing pathlines of animating particles.
Particle Pathlines takes any dataset as input and extracts the
point locations of all cells over time to build up a polyline
trail. The point number (index) is used as the 'key'; if the points
randomly change their respective order in the points list,
then you should specify a scalar array that represents the unique
ID. This is intended to handle the output of a filter such as the
TemporalStreamTracer.
 
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
 
The input cells to create pathlines for.
 
|
 
|
Accepts input of following types:
* vtkPointSet
The dataset must contain a field array (point)
 
|-
|'''Selection''' (Selection)
|
 
Set a second input, which is a selection. Particles with the same
Id in the selection as the primary input will be chosen for
pathlines. Note that you must have the same IdChannelArray in the
selection as in the input.
 
|
 
|
Accepts input of following types:
* vtkDataSet
|-
|'''MaskPoints''' (MaskPoints)
|
 
Set the number of particles to track as a ratio of the input.
Example: setting MaskPoints to 10 will track every 10th point.
 
|
100
|
 
|-
|'''MaxTrackLength''' (MaxTrackLength)
|
 
If the particles being traced animate for a long time, the trails
or traces will become long and stringy. Setting the
MaxTrackLength will limit how much of the trace is
displayed. Tracks longer than the Max will disappear and the
trace will appear like a snake of fixed length which progresses
as the particle moves. This length is given with respect to
timesteps.
 
|
25
|
 
|-
|'''MaxStepDistance''' (MaxStepDistance)
|
 
If a particle disappears from one end of a simulation and
reappears on the other side, the track left will be
unrepresentative. Set a MaxStepDistance{x,y,z} which acts as a
threshold above which, if a step larger than the value occurs (for
the dimension), the track will be dropped and restarted after the
step (i.e., the part before the wrap-around will be dropped and the
newer part kept).
 
|
1.0 1.0 1.0
|
 
|-
|'''IdChannelArray''' (IdChannelArray)
|
 
Specify the name of a scalar array which will be used to fetch
the index of each point. This is necessary only if the particles
change position (Id order) on each time step. The Id can be used
to identify particles at each step and hence track them properly.
If this array is set to "Global or Local IDs", the global point
ids are used if they exist, or the point index otherwise.
 
|
Global or Local IDs
|
An array of scalars is required.
 
|}
 
==Temporal Shift Scale==
 
Shift and scale time values. The Temporal
Shift Scale filter linearly transforms the time values of
a pipeline object by applying a shift and then a scale.
Given data at time t on the input, it will be
transformed to time t*Scale + Shift on the output.
Inversely, if this filter receives a request for time t, it
will request time (t-Shift)/Scale on its
input.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
The input to the Temporal Shift Scale
filter.
|
 
|
Accepts input of following types:
* vtkDataObject
|-
|'''PreShift''' (PreShift)
|
Apply a translation to the data before scaling. To
convert T{5,100} to T{0,1} use PreShift=-5, Scale=1/95, PostShift=0. To
convert T{5,105} to T{5,10} use PreShift=-5, Scale=5/100,
PostShift=5.
|
0.0
|
 
|-
|'''PostShift''' (PostShift)
|
The amount of time the input is shifted.
|
0.0
|
 
|-
|'''Scale''' (Scale)
|
The factor by which the input time is
scaled.
|
1.0
|
 
|-
|'''Periodic''' (Periodic)
|
If Periodic is true, requests for time will be wrapped
around so that the source appears to be a periodic time source. If data
exists for times {0,N-1}, setting Periodic to true will cause time 0 to
be produced when time N, 2N, 3N, etc. is requested. This effectively
gives the source the ability to generate time data indefinitely in a
loop. When combined with Shift/Scale, the time becomes periodic in the
shifted and scaled time frame of reference. Note: since the input time
may not start at zero, the wrapping of time from the end of one period
to the start of the next will subtract the initial time - a source
with T{5..6} repeated periodically will have output time {5..6..7..8}
etc.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''PeriodicEndCorrection''' (PeriodicEndCorrection)
|
If Periodic time is enabled, this flag determines if the
last time step is the same as the first. If PeriodicEndCorrection is
true, then it is assumed that the input data goes from 0-1 (or whatever
scaled/shifted actual time) and time 1 is the same as time 0 so that
steps will be 0,1,2,3...N,1,2,3...N,1,2,3 where step N is the same as 0
and step 0 is not repeated. When this flag is false the data is assumed
to be literal and the output is of the form 0,1,2,3...N,0,1,2,3... By
default this flag is ON.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''MaximumNumberOfPeriods''' (MaximumNumberOfPeriods)
|
If Periodic time is enabled, this controls how many time
periods time is reported for. A filter cannot output an infinite number
of time steps and therefore a finite number of periods is generated
when reporting time.
|
1.0
|
 
|-
|'''TimestepValues''' (TimestepValues)
|
 
|
 
|
 
 
|}
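
The forward transform with PreShift, Scale, and PostShift is a one-liner, which makes the worked examples from the PreShift description easy to verify. A minimal sketch of the arithmetic only, not of the pipeline mechanics:

```python
def shift_scale_time(t, pre_shift=0.0, scale=1.0, post_shift=0.0):
    """The Temporal Shift Scale time transform: pre-shift the input
    time, then scale it, then post-shift the result."""
    return (t + pre_shift) * scale + post_shift

# The PreShift description's first example: map T{5,100} onto T{0,1}.
lo = shift_scale_time(5.0, pre_shift=-5.0, scale=1.0 / 95.0)
hi = shift_scale_time(100.0, pre_shift=-5.0, scale=1.0 / 95.0)
```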
 
==Temporal Snap-to-Time-Step==
 
Modifies the time range/steps of temporal data.
This filter modifies the time range or time steps of the
data without changing the data itself. The data is not
resampled by this filter, only the information
accompanying the data is modified.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input of the
filter.
|
 
|
Accepts input of following types:
* vtkDataObject
|-
|'''SnapMode''' (SnapMode)
|
Determine which time step to snap to.
|
0
|
The value(s) is an enumeration of the following:
* Nearest (0)
* NextBelowOrEqual (1)
* NextAboveOrEqual (2)
|-
|'''TimestepValues''' (TimestepValues)
|
 
|
 
|
 
 
|}
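
The three SnapMode behaviors can be sketched over a sorted list of time step values. This is an illustrative stand-in, not the VTK implementation; requests outside the range are simply clamped to the nearest end step here.

```python
import bisect

def snap_to_time_step(t, steps, mode="Nearest"):
    """Snap a requested time `t` to one of the discrete `steps`
    (sorted ascending) using the SnapMode enumeration above."""
    i_above = min(bisect.bisect_left(steps, t), len(steps) - 1)
    i_below = max(bisect.bisect_right(steps, t) - 1, 0)
    below, above = steps[i_below], steps[i_above]
    if mode == "NextBelowOrEqual":
        return below
    if mode == "NextAboveOrEqual":
        return above
    # Nearest: pick whichever bracketing step is closer
    return below if abs(t - below) <= abs(above - t) else above

steps = [0.0, 1.0, 2.0]
```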
 
==Temporal Statistics==
 
Loads in all time steps of a data set and computes some statistics about how each point and cell variable changes over time. Given an input
that changes over time, vtkTemporalStatistics looks at the
data for each time step and computes some statistical
information of how a point or cell variable changes over
time. For example, vtkTemporalStatistics can compute the
average value of "pressure" over time of each point. Note
that this filter will require the upstream filter to be
run on every time step that it reports that it can
compute. This may be a time consuming operation.
vtkTemporalStatistics ignores the temporal spacing. Each
timestep will be weighted the same regardless of how long
of an interval it is to the next timestep. Thus, the
average statistic may be quite different from an
integration of the variable if the time spacing
varies.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
Set the input to the Temporal Statistics
filter.
|
 
|
Accepts input of following types:
* vtkDataSet
|-
|'''ComputeAverage''' (ComputeAverage)
|
Compute the average of each point and cell variable over
time.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''ComputeMinimum''' (ComputeMinimum)
|
Compute the minimum of each point and cell variable over
time.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''ComputeMaximum''' (ComputeMaximum)
|
Compute the maximum of each point and cell variable over
time.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''ComputeStandardDeviation''' (ComputeStandardDeviation)
|
Compute the standard deviation of each point and cell
variable over time.
|
1
|
Accepts boolean values (0 or 1).
 
|}
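
The equal-weighting behavior described above (each time step counts the same, regardless of spacing) can be sketched for one variable at one point. Using the population standard deviation here is an assumption of this sketch, not something the table specifies.

```python
import math

def temporal_statistics(samples_per_step):
    """Average, minimum, maximum, and standard deviation of one
    variable across time steps, each step weighted equally."""
    n = len(samples_per_step)
    avg = sum(samples_per_step) / n
    var = sum((v - avg) ** 2 for v in samples_per_step) / n
    return {
        "average": avg,
        "minimum": min(samples_per_step),
        "maximum": max(samples_per_step),
        "stddev": math.sqrt(var),
    }

# "pressure" at one point over four time steps
stats = temporal_statistics([1.0, 3.0, 5.0, 7.0])
```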
 
==Tensor Glyph==
 
This filter generates an ellipsoid, cuboid, cylinder or superquadric glyph at each point of the input data set. The glyphs are oriented and scaled according to eigenvalues and eigenvectors of tensor point data of the input data set.
 
The Tensor Glyph filter generates an ellipsoid, cuboid, cylinder or superquadric glyph at every point in
the input data set. The glyphs are oriented and scaled according to eigenvalues and eigenvectors of tensor
point data of the input data set. The Tensor Glyph filter operates on any type of data set. Its output is
polygonal.
 
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
 
This property specifies the input to the Glyph filter.
 
|
 
|
Accepts input of following types:
* vtkDataSet
The dataset must contain a field array (point)
 
with 9 component(s).
 
The dataset must contain a field array (point)
 
with 1 component(s).
 
|-
|'''Tensors''' (SelectInputTensors)
|
 
This property indicates the name of the tensor array on which to operate. The indicated array's
eigenvalues and eigenvectors are used for scaling and orienting the glyphs.
 
|
 
|
 
|-
|'''Glyph Type''' (Source)
|
 
This property determines which type of glyph will be placed at the points in the input dataset.
 
|
 
|
Accepts input of following types:
* vtkPolyData
The value can be one of the following:
* SphereSource (sources)
 
* CylinderSource (sources)
 
* CubeSource (sources)
 
* SuperquadricSource (sources)
 
|-
|'''ExtractEigenvalues''' (ExtractEigenvalues)
|
 
Toggle whether to extract eigenvalues from tensor. If false, eigenvalues/eigenvectors are not extracted and
the columns of the tensor are taken as the eigenvectors (the norm of column, always positive, is the eigenvalue).
If true, the glyph is scaled and oriented according to eigenvalues and eigenvectors; additionally, the eigenvalues
are provided as a new data array.
 
|
1
|
Accepts boolean values (0 or 1).
|-
|'''ColorGlyphs''' (ColorGlyphs)
|
 
This property determines whether or not to color the glyphs.
 
|
1
|
Accepts boolean values (0 or 1).
|-
|'''Scalars''' (SelectInputScalars)
|
 
This property indicates the name of the scalar array to use for coloring.
 
|
1
|
An array of scalars is required.
|-
|'''Color by''' (ColorMode)
|
 
This property determines whether input scalars or computed eigenvalues at the point should be used
to color the glyphs. If ThreeGlyphs is set and the eigenvalues are chosen for coloring, then each glyph
is colored by the corresponding eigenvalue; if not set, the color corresponding to the largest
eigenvalue is chosen.
 
|
0
|
The value(s) is an enumeration of the following:
* input scalars (0)
* eigenvalues (1)
|-
|'''ScaleFactor''' (ScaleFactor)
|
 
This property specifies the scale factor to scale every glyph by.
 
|
1
|
 
|-
|'''LimitScalingByEigenvalues''' (LimitScalingByEigenvalues)
|
 
This property determines whether scaling of glyphs by ScaleFactor times eigenvalue should be limited.
This is useful to prevent uncontrolled scaling near singularities.
 
|
0
|
Accepts boolean values (0 or 1).
|-
|'''MaxScaleFactor''' (MaxScaleFactor)
|
 
If scaling by eigenvalues should be limited, this value sets an upper limit for scale factor times
eigenvalue.
 
|
10
|
 
|-
|'''Symmetric''' (Symmetric)
|
 
This property determines whether or not to draw a mirror of each glyph.
 
|
0
|
Accepts boolean values (0 or 1).
|-
|'''ThreeGlyphs''' (ThreeGlyphs)
|
 
Toggle whether to produce three glyphs, each of which is oriented along an eigenvector and scaled according
to the corresponding eigenvalue.
 
|
0
|
Accepts boolean values (0 or 1).
 
|}
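
The scale-and-orient step rests on an eigen decomposition of the symmetric tensor at each point. A minimal closed-form 2x2 sketch (the filter works on 3x3 tensors; the 2x2 case is chosen only because it is short):

```python
import math

def symmetric_eigen_2x2(a, b, c):
    """Eigenvalues and unit eigenvectors of the symmetric tensor
    [[a, b], [b, c]]: each glyph axis is scaled by an eigenvalue and
    oriented along the matching eigenvector."""
    mean = 0.5 * (a + c)
    root = math.hypot(0.5 * (a - c), b)
    lam1, lam2 = mean + root, mean - root    # lam1 >= lam2
    if b != 0.0:
        v1 = (lam1 - c, b)                   # eigenvector for lam1
    elif a >= c:
        v1 = (1.0, 0.0)
    else:
        v1 = (0.0, 1.0)
    n = math.hypot(*v1)
    v1 = (v1[0] / n, v1[1] / n)
    v2 = (-v1[1], v1[0])                     # perpendicular partner
    return (lam1, lam2), (v1, v2)

# Diagonal tensor: eigenvalues 2 and 1, axes along x and y.
vals, vecs = symmetric_eigen_2x2(2.0, 0.0, 1.0)
```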
 
==Tessellate==
 
Tessellate nonlinear curves, surfaces, and volumes with lines, triangles, and tetrahedra. The Tessellate filter
tessellates cells with nonlinear geometry and/or scalar
fields into a simplicial complex with linearly
interpolated field values that more closely approximate
the original field. This is useful for datasets containing
quadratic cells.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input to the Tessellate
filter.
|
 
|
Accepts input of following types:
* vtkPolyData
* vtkDataSet
* vtkUnstructuredGrid
|-
|'''OutputDimension''' (OutputDimension)
|
The value of this property sets the maximum
dimensionality of the output tessellation. When the value of this
property is 3, 3D cells produce tetrahedra, 2D cells produce triangles,
and 1D cells produce line segments. When the value is 2, 3D cells will
have their boundaries tessellated with triangles. When the value is 1,
all cells except points produce line segments.
|
3
|
 
|-
|'''ChordError''' (ChordError)
|
This property controls the maximum chord error allowed
at any edge midpoint in the output tessellation. The chord error is
measured as the distance between the midpoint of any output edge and
the original nonlinear geometry.
|
1e-3
|
 
|-
|'''Field Error''' (FieldError2)
|
This property controls the maximum field error allowed
at any edge midpoint in the output tessellation. The field error is
measured as the difference between a field value at the midpoint of an
output edge and the value of the corresponding field in the original
nonlinear geometry.
|
 
|
 
|-
|'''Maximum Number of Subdivisions''' (MaximumNumberOfSubdivisions)
|
This property specifies the maximum number of times an
edge may be subdivided. Increasing this number allows further
refinement but can drastically increase the computational and storage
requirements, especially when the value of the OutputDimension property
is 3.
|
3
|
 
|-
|'''MergePoints''' (MergePoints)
|
If the value of this property is set to 1, coincident
vertices will be merged after tessellation has occurred. Only geometry
is considered during the merge and the first vertex encountered is the
one whose point attributes will be used. Any discontinuities in point
fields will be lost. On the other hand, many operations, such as
streamline generation, require coincident vertices to be merged. Toggle
whether to merge coincident vertices.
|
1
|
Accepts boolean values (0 or 1).
 
|}
 
==Tessellate Generic Dataset==
 
Tessellate a higher-order dataset.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
Set the input to the Generic Tessellator
filter.
|
 
|
Accepts input of following types:
* vtkGenericDataSet
 
|}
 
==Tetrahedralize==
 
This filter converts 3D cells to tetrahedra and polygons to triangles. The output is always of type unstructured grid. The
Tetrahedralize filter converts the 3D cells of any type of
dataset to tetrahedrons and the 2D ones to triangles. This
filter always produces unstructured grid
output.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input to the Tetrahedralize
filter.
|
 
|
Accepts input of following types:
* vtkDataSet
 
|}
 
==Texture Map to Cylinder==
 
Generate texture coordinates by mapping points to cylinder.
This is a filter that generates 2D texture coordinates by
mapping input dataset points onto a cylinder. The cylinder
is generated automatically by computing its axis. Note
that the generated texture coordinates for the
s-coordinate range from 0 to 1 (corresponding to an angle of
0-&gt;360 around the axis), while the mapping of the
t-coordinate is controlled by the projection of points
along the axis.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
Set the input to the Texture Map to Cylinder
filter.
|
 
|
Accepts input of following types:
* vtkDataSet
|-
|'''PreventSeam''' (PreventSeam)
|
Control how the texture coordinates are generated. If
Prevent Seam is set, the s-coordinate ranges from 0-&gt;1 and 1-&gt;0
corresponding to the theta angle variation between 0-&gt;180 and
180-&gt;0 degrees. Otherwise, the s-coordinate ranges from 0-&gt;1
between 0-&gt;360 degrees.
|
1
|
Accepts boolean values (0 or 1).
 
|}
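
The PreventSeam behavior for the s-coordinate can be sketched for a point around a fixed axis. Assuming the z-axis as the cylinder axis is a simplification (the filter computes the axis automatically):

```python
import math

def cylinder_s_coordinate(point, prevent_seam=True):
    """The s texture coordinate for a point around a cylinder whose
    axis is the z-axis.  With PreventSeam on, s runs 0->1 and back
    1->0 as the angle sweeps 0->180 and 180->360 degrees; otherwise
    s runs 0->1 over the full 360 degrees."""
    theta = math.atan2(point[1], point[0]) % (2.0 * math.pi)
    if prevent_seam:
        s = theta / math.pi
        return s if s <= 1.0 else 2.0 - s
    return theta / (2.0 * math.pi)

# 90 degrees around the axis: halfway up the 0->180 ramp.
s = cylinder_s_coordinate((0.0, 1.0, 0.5))
```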
 
==Texture Map to Plane==
 
Generate texture coordinates by mapping points to plane.
TextureMapToPlane is a filter that generates 2D texture
coordinates by mapping input dataset points onto a plane.
The plane is generated automatically using a least
squares method.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
Set the input to the Texture Map to Plane
filter.
|
 
|
Accepts input of following types:
* vtkDataSet
 
|}
 
==Texture Map to Sphere==
 
Generate texture coordinates by mapping points to sphere.
This is a filter that generates 2D texture coordinates by
mapping input dataset points onto a sphere. The sphere is
generated automatically by computing its center, i.e. the
average of the point coordinates. Note that the generated
texture coordinates range between (0,1). The s-coordinate
lies in the angular direction around the z-axis, measured
counter-clockwise from the x-axis. The t-coordinate lies
in the angular direction measured down from the north pole
towards the south pole.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
Set the input to the Texture Map to Sphere
filter.
|
 
|
Accepts input of following types:
* vtkDataSet
|-
|'''PreventSeam''' (PreventSeam)
|
Control how the texture coordinates are generated. If
Prevent Seam is set, the s-coordinate ranges from 0-&gt;1 and 1-&gt;0
corresponding to the theta angle variation between 0-&gt;180 and
180-&gt;0 degrees. Otherwise, the s-coordinate ranges from 0-&gt;1
between 0-&gt;360 degrees.
|
1
|
Accepts boolean values (0 or 1).
 
|}
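The spherical mapping described above can be sketched in a few lines of plain Python. This is an illustrative sketch, not VTK or ParaView code; the function name <code>sphere_texture_coords</code> is hypothetical. It computes (s, t) per point, including the PreventSeam behavior.

```python
import math

def sphere_texture_coords(points, prevent_seam=True):
    """Map 3D points to (s, t) texture coordinates on an automatic sphere.
    Illustrative sketch of the mapping described above (not VTK code)."""
    n = len(points)
    # The sphere center is the average of the point coordinates.
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    cz = sum(p[2] for p in points) / n
    coords = []
    for x, y, z in points:
        dx, dy, dz = x - cx, y - cy, z - cz
        r = math.sqrt(dx * dx + dy * dy + dz * dz) or 1.0
        # t: angle down from the north pole (+z) toward the south pole,
        # normalized to [0, 1].
        t = math.acos(max(-1.0, min(1.0, dz / r))) / math.pi
        # theta: angle around the z axis, counter-clockwise from +x, in [0, 2*pi).
        theta = math.atan2(dy, dx) % (2.0 * math.pi)
        if prevent_seam:
            # s runs 0->1 for theta in [0, 180] degrees and back 1->0 beyond.
            s = theta / math.pi if theta <= math.pi else 2.0 - theta / math.pi
        else:
            # s runs 0->1 over the full 0->360 degree sweep.
            s = theta / (2.0 * math.pi)
        coords.append((s, t))
    return coords
```

With Prevent Seam off, a point at 270 degrees around the z axis gets s = 0.75; with it on, the same point maps back to s = 0.5, avoiding the texture seam at the 0/360 boundary.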
 
==Threshold==
 
This filter extracts cells that have point or cell scalars in the specified range.
The Threshold filter extracts the portions of the input
dataset whose scalars lie within the specified range. This
filter operates on either point-centered or cell-centered
data. This filter operates on any type of dataset and
produces unstructured grid output. To select between these
two options, select either Point Data or Cell Data from
the Attribute Mode menu. Once the Attribute Mode has been
selected, choose the scalar array from which to threshold
the data from the Scalars menu. The Lower Threshold and
Upper Threshold sliders determine the range of the scalars
to retain in the output. The All Scalars check box only
takes effect when the Attribute Mode is set to Point Data.
If the All Scalars option is checked, then a cell will
only be passed to the output if the scalar values of all
of its points lie within the range indicated by the Lower
Threshold and Upper Threshold sliders. If unchecked, then
a cell will be added to the output if the specified scalar
value for any of its points is within the chosen
range.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input to the Threshold
filter.
|
 
|
Accepts input of following types:
* vtkDataSet
The dataset must contain a field array ()
 
with 1 component(s).
 
|-
|'''Scalars''' (SelectInputScalars)
|
The value of this property contains the name of the
scalar array from which to perform thresholding.
|
 
|
An array of scalars is required. The value must be a field array name.
|-
|'''Threshold Range''' (ThresholdBetween)
|
The values of this property specify the upper and lower
bounds of the thresholding operation.
|
0 0
|
The value must lie within the range of the selected data array.
|-
|'''AllScalars''' (AllScalars)
|
If the value of this property is 1, then a cell is only
included in the output if the value of the selected array for all its
points is within the threshold. This is only relevant when thresholding
by a point-centered array.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''UseContinuousCellRange''' (UseContinuousCellRange)
|
 
If off, the vertex scalars are treated as a discrete set. If on, they
are treated as a continuous interval over the minimum and maximum. One
important "on" use case: When setting lower and upper threshold
equal to some value and turning AllScalars off, the results are
cells containing the iso-surface for that value. WARNING: Whether on
or off, for higher order input, the filter will not give accurate
results.
 
|
0
|
Accepts boolean values (0 or 1).
 
|}
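The effect of the All Scalars option on point-data thresholding can be sketched as follows. This is plain Python with a hypothetical <code>threshold_cells</code> helper, not the actual VTK implementation: a cell survives when all (or, if All Scalars is unchecked, any) of its point scalars fall within the range.

```python
def threshold_cells(cells, point_scalars, lower, upper, all_scalars=True):
    """Return the indices of cells kept by a point-data threshold.
    Sketch of the AllScalars behavior described above (not VTK code)."""
    inside = lambda v: lower <= v <= upper
    kept = []
    for i, cell in enumerate(cells):  # each cell is a tuple of point ids
        values = [point_scalars[pid] for pid in cell]
        # AllScalars on: every point must pass; off: one passing point suffices.
        test = all if all_scalars else any
        if test(inside(v) for v in values):
            kept.append(i)
    return kept
```

For example, with point scalars [0, 1, 2, 3] and range [0.5, 2.5], a two-point cell (0, 1) is rejected when All Scalars is on (the value 0 lies outside) but kept when it is off.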
 
==Transform==
 
This filter applies a transformation to the polygons. The Transform
filter allows you to specify the position, size, and
orientation of polygonal, unstructured grid, and
curvilinear data sets.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input to the Transform
filter.
|
 
|
Accepts input of following types:
* vtkPointSet
* vtkImageData
* vtkRectilinearGrid
|-
|'''Transform''' (Transform)
|
The values in this property allow you to specify the
transform (translation, rotation, and scaling) to apply to the input
dataset.
|
 
|
The value can be one of the following:
* Transform3 (extended_sources)
 
 
|}
 
==Transpose Table==
 
Transpose a table.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input to the
filter.
|
 
|
Accepts input of following types:
* vtkTable
The dataset must contain a field array (row)
 
with 1 component(s).
 
|-
|'''Variables of Interest''' (SelectArrays)
|
Choose arrays whose entries will be used to form
observations for statistical analysis.
|
 
|
 
|-
|'''Add a column with original columns name''' (AddIdColumn)
|
This flag indicates if a column must be inserted
at index 0 with the names (ids) of the input columns.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''Use the column with original columns name''' (UseIdColumn)
|
This flag indicates if the output column must be
named using the names listed in the index 0 column.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''Only extract selected columns''' (DoNotTranspose)
|
This flag indicates if the sub-table must be
effectively transposed or not.
|
0
|
Accepts boolean values (0 or 1).
 
|}
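The transposition and the id-column option can be sketched in plain Python. The helper name <code>transpose_table</code> and the column-dict representation are illustrative assumptions, not ParaView's API; each original column becomes a row, and with the id column enabled, column 0 lists the original column names.

```python
def transpose_table(columns, add_id_column=True):
    """Transpose a table given as a dict of column name -> list of values.
    Sketch of the AddIdColumn behavior described above (not ParaView code)."""
    names = list(columns)
    # Each original row becomes one output column.
    rows = list(zip(*(columns[n] for n in names)))
    out = {}
    if add_id_column:
        # Column 0 holds the names (ids) of the input columns.
        out["id"] = names
    for i, row in enumerate(rows):
        out[str(i)] = list(row)
    return out
```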
 
==Triangle Strips==
 
This filter uses a greedy algorithm to convert triangles into triangle strips. The
Triangle Strips filter converts triangles into triangle
strips and lines into polylines. This filter operates on
polygonal data sets and produces polygonal
output.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input to the Triangle Strips
filter.
|
 
|
Accepts input of following types:
* vtkPolyData
|-
|'''MaximumLength''' (MaximumLength)
|
This property specifies the maximum number of
triangles/lines to include in a triangle strip or
polyline.
|
1000
|
 
 
|}
 
==Triangulate==
 
This filter converts polygons and triangle strips to basic triangles. The
Triangulate filter decomposes polygonal data into only
triangles, points, and lines. It separates triangle strips
and polylines into individual triangles and lines,
respectively. The output is polygonal data. Some filters
that take polygonal data as input require that the data be
composed of triangles rather than other polygons, so
passing your data through this filter first is useful in
such situations. You should use this filter in these cases
rather than the Tetrahedralize filter because they produce
different output dataset types. The filters referenced
require polygonal input, and the Tetrahedralize filter
produces unstructured grid output.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input to the Triangulate
filter.
|
 
|
Accepts input of following types:
* vtkPolyData
 
|}
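For the simplest case, a convex polygon, the decomposition into triangles can be done by fanning from the first vertex. The sketch below shows only that convex case and is not the general VTK triangulation code, which also handles concave polygons, strips, and lines.

```python
def fan_triangulate(polygon):
    """Decompose a convex polygon (a list of point ids) into triangles by
    fanning from the first vertex. Illustrative sketch for convex cells only."""
    return [(polygon[0], polygon[i], polygon[i + 1])
            for i in range(1, len(polygon) - 1)]
```

A quad (0, 1, 2, 3) yields the two triangles (0, 1, 2) and (0, 2, 3); an n-gon yields n - 2 triangles.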
 
==Tube==
 
Convert lines into tubes. Normals are used to avoid cracks between tube segments. The Tube filter
creates tubes around the lines in the input polygonal
dataset. The output is also polygonal.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input to the Tube
filter.
|
 
|
Accepts input of following types:
* vtkPolyData
The dataset must contain a field array (point)
 
with 1 component(s).
 
The dataset must contain a field array (point)
 
with 3 component(s).
 
|-
|'''Scalars''' (SelectInputScalars)
|
This property indicates the name of the scalar array on
which to operate. The indicated array may be used for scaling the
tubes. (See the VaryRadius property.)
|
 
|
An array of scalars is required.
|-
|'''Vectors''' (SelectInputVectors)
|
This property indicates the name of the vector array on
which to operate. The indicated array may be used for scaling and/or
orienting the tubes. (See the VaryRadius property.)
|
1
|
An array of vectors is required.
|-
|'''Number of Sides''' (NumberOfSides)
|
The value of this property indicates the number of faces
around the circumference of the tube.
|
6
|
 
|-
|'''Capping''' (Capping)
|
If this property is set to 1, endcaps will be drawn on
the tube. Otherwise the ends of the tube will be open.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''Radius''' (Radius)
|
The value of this property sets the radius of the tube.
If the radius is varying (VaryRadius property), then this value is the
minimum radius.
|
1.0
|
 
The value must be less than the largest dimension of the
dataset multiplied by a scale factor of
0.01.
 
|-
|'''VaryRadius''' (VaryRadius)
|
The property determines whether/how to vary the radius
of the tube. If varying by scalar (1), the tube radius is based on the
point-based scalar values in the dataset. If it is varied by vector,
the vector magnitude is used in varying the radius.
|
0
|
The value(s) is an enumeration of the following:
* Off (0)
* By Scalar (1)
* By Vector (2)
* By Absolute Scalar (3)
|-
|'''RadiusFactor''' (RadiusFactor)
|
If varying the radius (VaryRadius property), the
property sets the maximum tube radius in terms of a multiple of the
minimum radius. If not varying the radius, this value has no
effect.
|
10
|
 
|-
|'''UseDefaultNormal''' (UseDefaultNormal)
|
If this property is set to 0, and the input contains no
vector array, then default ribbon normals will be generated
(DefaultNormal property); if a vector array has been set
(SelectInputVectors property), the ribbon normals will be set from the
specified array. If this property is set to 1, the default normal
(DefaultNormal property) will be used, regardless of whether the
SelectInputVectors property has been set.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''DefaultNormal''' (DefaultNormal)
|
The value of this property specifies the normal to use
when the UseDefaultNormal property is set to 1 or the input contains no
vector array (SelectInputVectors property).
|
0 0 1
|
 
 
|}
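One plausible reading of the Radius, RadiusFactor, and VaryRadius descriptions above is a linear interpolation of the per-point radius between Radius (minimum) and RadiusFactor * Radius (maximum) over the scalar range. The sketch below encodes that reading; the exact VTK formula may differ, and the function name is hypothetical.

```python
def tube_radius(scalar, smin, smax, radius=1.0, radius_factor=10.0, vary=True):
    """Per-point tube radius when varying by scalar: Radius is the minimum
    radius and RadiusFactor * Radius the maximum, with the point's scalar
    interpolating linearly between them. Assumed reading, not VTK code."""
    if not vary or smax == smin:
        return radius
    frac = (scalar - smin) / (smax - smin)
    return radius * (1.0 + (radius_factor - 1.0) * frac)
```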
 
==UpdateSuppressor2==
 
 
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
Set the input to the Update Suppressor
filter.
|
 
|
 
|-
|'''Enabled''' (Enabled)
|
Toggle whether the update suppressor is
enabled.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''UpdateTime''' (UpdateTime)
|
 
|
none
|
 
 
|}
 
==Warp By Scalar==
 
This filter moves point coordinates along a vector scaled by a point attribute. It can be used to produce carpet plots.
The Warp (scalar) filter translates the points of the
input data set along a vector by a distance determined by
the specified scalars. This filter operates on polygonal,
curvilinear, and unstructured grid data sets containing
single-component scalar arrays. Because it only changes
the positions of the points, the output data set type is
the same as that of the input. Any scalars in the input
dataset are copied to the output, so the data can be
colored by them.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input to the Warp (scalar)
filter.
|
 
|
Accepts input of following types:
* vtkPointSet
* vtkImageData
* vtkRectilinearGrid
The dataset must contain a field array (point)
 
with 1 component(s).
 
|-
|'''Scalars''' (SelectInputScalars)
|
This property contains the name of the scalar array by
which to warp the dataset.
|
 
|
An array of scalars is required.
|-
|'''ScaleFactor''' (ScaleFactor)
|
The scalar value at a given point is multiplied by the
value of this property to determine the magnitude of the change vector
for that point.
|
1.0
|
 
|-
|'''Normal''' (Normal)
|
The values of this property specify the direction along
which to warp the dataset if any normals contained in the input dataset
are not being used for this purpose. (See the UseNormal
property.)
|
0 0 1
|
 
|-
|'''UseNormal''' (UseNormal)
|
If point normals are present in the dataset, the value
of this property toggles whether to use a single normal value (value =
1) or the normals from the dataset (value = 0).
|
0
|
Accepts boolean values (0 or 1).
|-
|'''XY Plane''' (XYPlane)
|
If the value of this property is 1, then the
Z-coordinates from the input are considered to be the scalar values,
and the displacement is along the Z axis. This is useful for creating
carpet plots.
|
0
|
Accepts boolean values (0 or 1).
 
|}
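The displacement described above — each point moved along a direction by a distance proportional to its scalar — can be sketched as follows. This plain-Python sketch assumes the single-normal case (UseNormal = 1); the function name is hypothetical and this is not the VTK implementation.

```python
def warp_by_scalar(points, scalars, normal=(0.0, 0.0, 1.0), scale_factor=1.0):
    """Displace each point along `normal` by scalar * scale_factor,
    as the Warp (scalar) filter does with a single normal. Sketch only."""
    nx, ny, nz = normal
    return [(x + s * scale_factor * nx,
             y + s * scale_factor * ny,
             z + s * scale_factor * nz)
            for (x, y, z), s in zip(points, scalars)]
```

With the default normal (0, 0, 1) and ScaleFactor 1, a point with scalar value 2 simply moves up 2 units in z — which is how carpet plots of a 2D scalar field are produced.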
 
==Warp By Vector==
 
This filter displaces point coordinates along a vector attribute. It is useful for showing mechanical deformation.
The Warp (vector) filter translates the points of the
input dataset using a specified vector array. The vector
array chosen specifies a vector per point in the input.
Each point is translated along its vector by a given scale
factor. This filter operates on polygonal, curvilinear,
and unstructured grid datasets. Because this filter only
changes the positions of the points, the output dataset
type is the same as that of the input.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input to the Warp (vector)
filter.
|
 
|
Accepts input of following types:
* vtkPointSet
* vtkImageData
* vtkRectilinearGrid
The dataset must contain a field array (point)
 
with 3 component(s).
 
|-
|'''Vectors''' (SelectInputVectors)
|
The value of this property contains the name of the
vector array by which to warp the dataset's point
coordinates.
|
 
|
An array of vectors is required.
|-
|'''ScaleFactor''' (ScaleFactor)
|
Each component of the selected vector array will be
multiplied by the value of this property before being used to compute
new point coordinates.
|
1.0
|
 
 
|}
 
==Youngs Material Interface==
 
Computes linear material interfaces in 2D or 3D mixed
cells produced by Eulerian or ALE simulation
codes.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
 
|
 
|
Accepts input of following types:
* vtkCompositeDataSet
The dataset must contain a field array (cell)
 
with 1 component(s).
 
The dataset must contain a field array (cell)
 
with 3 component(s).
 
|-
|'''InverseNormal''' (InverseNormal)
|
 
|
0
|
Accepts boolean values (0 or 1).
|-
|'''ReverseMaterialOrder''' (ReverseMaterialOrder)
|
 
|
0
|
Accepts boolean values (0 or 1).
|-
|'''OnionPeel''' (OnionPeel)
|

|
1
|
Accepts boolean values (0 or 1).
|-
|'''AxisSymetric''' (AxisSymetric)
|

|
1
|
Accepts boolean values (0 or 1).
|-
|'''FillMaterial''' (FillMaterial)
|

|
1
|
Accepts boolean values (0 or 1).
|-
|'''UseFractionAsDistance''' (UseFractionAsDistance)
|

|
0
|
Accepts boolean values (0 or 1).
|-
|'''VolumeFractionRange''' (VolumeFractionRange)
|

|
0.01 0.99
|

|-
|'''NumberOfDomainsInformation''' (NumberOfDomainsInformation)
|

|

|

|-
|'''VolumeFractionArrays''' (VolumeFractionArrays)
|

|

|
An array of scalars is required. The value must be a field array name.
|-
|'''NormalArrays''' (NormalArrays)
|

|

|
An array of vectors is required. The value must be a field array name.
|-
|'''OrderingArrays''' (OrderingArrays)
|

|

|
An array of scalars is required. The value must be a field array name.

|}

Latest revision as of 18:07, 26 January 2016

AMR Connectivity

Fragment Identification

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the volume input of the filter.

Accepts input of following types:

  • vtkNonOverlappingAMR

The dataset must contain a field array (cell)

with 1 component(s).

SelectMaterialArrays (SelectMaterialArrays)

This property specifies the cell arrays from which the analysis will determine fragments

An array of scalars is required.

Volume Fraction Value (VolumeFractionSurfaceValue)

This property specifies the values at which to compute the isosurface.

0.1

Resolve Blocks (Resolve Blocks)

Resolve the fragments between blocks.

1

Accepts boolean values (0 or 1).

Propagate Ghosts (Propagate Ghosts)

Propagate regionIds into the ghosts.

0

Accepts boolean values (0 or 1).

AMR Contour

Iso surface cell array.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input of the filter.

Accepts input of following types:

  • vtkCompositeDataSet

The dataset must contain a field array (cell)

with 1 component(s).

SelectMaterialArrays (SelectMaterialArrays)

This property specifies the cell arrays from which the contour filter will compute contour cells.

An array of scalars is required.

Volume Fraction Value (VolumeFractionSurfaceValue)

This property specifies the values at which to compute the isosurface.

0.1

Capping (Capping)

If this property is on, the boundary of the data set is capped.

1

Accepts boolean values (0 or 1).

DegenerateCells (DegenerateCells)

If this property is on, a transition mesh between levels is created.

1

Accepts boolean values (0 or 1).

MultiprocessCommunication (MultiprocessCommunication)

If this property is off, each process executes independently.

1

Accepts boolean values (0 or 1).

SkipGhostCopy (SkipGhostCopy)

A simple test to see if ghost values are already set properly.

1

Accepts boolean values (0 or 1).

Triangulate (Triangulate)

Use triangles instead of quads on capping surfaces.

1

Accepts boolean values (0 or 1).

MergePoints (MergePoints)

Use more memory to merge points on the boundaries of blocks.

1

Accepts boolean values (0 or 1).

AMR CutPlane

Planar cut of an AMR grid dataset. This filter creates a cut-plane of the AMR dataset.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input for this filter.

Accepts input of following types:

  • vtkOverlappingAMR
UseNativeCutter (UseNativeCutter)

This property specifies whether ParaView's generic dataset cutter is used instead of the specialized AMR cutter.

0

Accepts boolean values (0 or 1).

LevelOfResolution (LevelOfResolution)

Set maximum slice resolution.

0

Center (Center)

0.5 0.5 0.5

Normal (Normal)

0 0 1


AMR Dual Clip

Clip with scalars. Tetrahedra.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input of the filter.

Accepts input of following types:

  • vtkCompositeDataSet

The dataset must contain a field array (cell)

with 1 component(s).

SelectMaterialArrays (SelectMaterialArrays)

This property specifies the cell arrays from which the clip filter will compute clipped cells.

An array of scalars is required.

Volume Fraction Value (VolumeFractionSurfaceValue)

This property specifies the values at which to compute the isosurface.

0.1

InternalDecimation (InternalDecimation)

If this property is on, internal tetrahedra are decimated.

1

Accepts boolean values (0 or 1).

MultiprocessCommunication (MultiprocessCommunication)

If this property is off, each process executes independently.

1

Accepts boolean values (0 or 1).

MergePoints (MergePoints)

Use more memory to merge points on the boundaries of blocks.

1

Accepts boolean values (0 or 1).

AMR Fragment Integration

Fragment Integration

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the volume input of the filter.

Accepts input of following types:

  • vtkNonOverlappingAMR

The dataset must contain a field array (cell)

with 1 component(s).

SelectMaterialArrays (SelectMaterialArrays)

This property specifies the cell arrays from which the analysis will determine fragments

An array of scalars is required.

SelectMassArrays (SelectMassArrays)

This property specifies the cell arrays from which the analysis will determine fragment mass

An array of scalars is required.

SelectVolumeWeightedArrays (SelectVolumeWeightedArrays)

This property specifies the cell arrays from which the analysis will determine volume weighted average values

An array of scalars is required.

SelectMassWeightedArrays (SelectMassWeightedArrays)

This property specifies the cell arrays from which the analysis will determine mass weighted average values

An array of scalars is required.

AMR Fragments Filter

Meta Fragment filter. Combines the running of AMRContour, AMRFragmentIntegration, AMRDualContour and ExtractCTHParts.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the volume input of the filter.

Accepts input of following types:

  • vtkNonOverlappingAMR

The dataset must contain a field array (cell)

with 1 component(s).

SelectMaterialArrays (SelectMaterialArrays)

This property specifies the cell arrays from which the analysis will determine fragments

An array of scalars is required.

SelectMassArrays (SelectMassArrays)

This property specifies the cell arrays from which the analysis will determine fragment mass

An array of scalars is required.

SelectVolumeWeightedArrays (SelectVolumeWeightedArrays)

This property specifies the cell arrays from which the analysis will determine volume weighted average values

An array of scalars is required.

SelectMassWeightedArrays (SelectMassWeightedArrays)

This property specifies the cell arrays from which the analysis will determine mass weighted average values

An array of scalars is required.

Volume Fraction Value (VolumeFractionSurfaceValue)

This property specifies the values at which to compute the isosurface.

0.1

Extract Surface (Extract Surface)

Whether or not to extract a surface from this data

0

Accepts boolean values (0 or 1).

Use Watertight Surface (Use Watertight Surface)

Whether the extracted surface should be watertight or not

0

Accepts boolean values (0 or 1).

Integrate Fragments (Integrate Fragments)

Whether or not to integrate fragments in this data

1

Accepts boolean values (0 or 1).

Add Field Arrays

Reads arrays from a file and adds them to the input data object. Takes in an input data object and a filename. Opens the file and adds any arrays it sees there to the input data.


Property Description Default Value(s) Restrictions
Input (Input)

The input.

FileName (FileName)

This property specifies the file to read to get arrays

The value(s) must be a filename (or filenames).

Angular Periodic Filter

This filter generates a periodic multiblock dataset.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Periodic filter.

Accepts input of following types:

  • vtkDataSet
BlockIndices (BlockIndices)

This property lists the ids of the blocks to make periodic from the input multiblock dataset.

IterationMode (IterationMode)

This property specifies the mode of iteration, either a user-provided number of periods, or the maximum number of periods to rotate to 360°.

1

The value(s) is an enumeration of the following:

  • Manual (0)
  • Maximum (1)
NumberOfPeriods (NumberOfPeriods)

This property specifies the number of iterations.

3

RotationMode (RotationMode)

This property specifies the mode of rotation, either from a user provided angle or from an array in the data.

0

The value(s) is an enumeration of the following:

  • Direct Angle (0)
  • Array Value (1)
RotationAngle (RotationAngle)

Rotation angle in degrees.

10

RotationArrayName (RotationArrayName)

Field array name that contains the rotation angle in radians.

periodic angle

Axis (Axis)

This property specifies the axis of rotation

0

The value(s) is an enumeration of the following:

  • Axis X (0)
  • Axis Y (1)
  • Axis Z (2)
Center (Center)

This property specifies the 3D coordinates for the center of the rotation.

0.0 0.0 0.0


Annotate Attribute Data

Adds a text annotation to a Render View. This filter can be used to add a text annotation to a Render View (or similar) using a tuple from any attribute array (point/cell/field/row etc.) from a specific rank (when running in parallel). Use the **ArrayName** property to select the array association and array name. Use the **ElementId** property to set the element number to extract the value to label with. When running on multiple ranks, use the **ProcessId** property to select the rank of interest. The **Prefix** property can be used to specify a string that will be used as the prefix to the generated annotation text.


Property Description Default Value(s) Restrictions
Input (Input)

Set the input of the filter. To avoid the complications/confusion when identifying elements in a composite dataset, this filter doesn't support composite datasets currently.

Accepts input of following types:

  • vtkDataSet
  • vtkTable

The dataset must contain a field array (any)

with 1 component(s).

ArrayAssociation (ArrayAssociation)

Select the attribute to use to populate array names from.

2

The value(s) is an enumeration of the following:

  • Point Data (0)
  • Cell Data (1)
  • Field Data (2)
  • Row Data (6)
ArrayName (ArrayName)

Choose the array that is going to be displayed.

ElementId (ElementId)

Set the element index to annotate with.

0

ProcessId (ProcessId)

Set the process rank to extract element from.

0

Prefix (Prefix)

Text that is used as a prefix to the field value

Value is:


Annotate Global Data

Filter for annotating with global data (designed for the ExodusII reader). Annotate Global Data provides a simpler API for creating text annotations using vtkPythonAnnotationFilter. Instead of users specifying the annotation expression, this filter determines the expression based on the array selected, limiting the scope of the functionality. This filter only allows the user to annotate using "global data", i.e. field data, and to specify the string prefix to use. If the field array chosen has as many elements as the number of timesteps, the array is assumed to be "temporal" and indexed using the current timestep.


Property Description Default Value(s) Restrictions
Input (Input)

Set the input of the filter.

Accepts input of following types:

  • vtkDataSet

The dataset must contain a field array (none)

with 1 component(s).

SelectArrays (SelectArrays)

Choose arrays that is going to be displayed

Prefix (Prefix)

Text that is used as a prefix to the field value

Value is:

Suffix (Suffix)

Text that is used as a suffix to the field value


Annotate Time Filter

Shows input data time as a text annotation in the view. The Annotate Time filter can be used to show the data time in a text annotation.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input dataset for which to display the time.

Format (Format)

The value of this property is a format string used to display the input time. The format string is specified using printf style.

Time: %f

Shift (Shift)

The amount of time the input is shifted (after scaling).

0.0

Scale (Scale)

The factor by which the input time is scaled.

1.0
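The interaction of the Format, Shift, and Scale properties can be sketched in plain Python: the displayed time is the input time multiplied by Scale, then offset by Shift, rendered through the printf-style format string. A sketch of the rule only; the helper name is an assumption.

```python
def annotate_time(input_time, fmt="Time: %f", shift=0.0, scale=1.0):
    # The input time is scaled first, then shifted, then formatted
    # with a printf-style format string.
    return fmt % (input_time * scale + shift)

text = annotate_time(1.25, fmt="Time: %.2f s", shift=0.5, scale=2.0)
```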


Append Attributes

Copies geometry from first input. Puts all of the arrays into the output. The Append Attributes filter takes multiple input data sets with the same geometry and merges their point and cell attributes to produce a single output containing all the point and cell attributes of the inputs. Any inputs without the same number of points and cells as the first input are ignored. The input data sets must already be collected together, either as a result of a reader that loads multiple parts (e.g., EnSight reader) or because the Group Parts filter has been run to form a collection of data sets.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Append Attributes filter.

Accepts input of following types:

  • vtkDataSet

Append Datasets

Takes an input of multiple datasets and produces a single unstructured grid as output. The Append Datasets filter operates on multiple data sets of any type (polygonal, structured, etc.). It merges their geometry into a single data set. Only the point and cell attributes that all of the input data sets have in common will appear in the output. The input data sets must already be collected together, either as a result of a reader that loads multiple parts (e.g., EnSight reader) or because the Group Parts filter has been run to form a collection of data sets.
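The "attributes in common" rule above amounts to a set intersection over the inputs' array names, sketched here in plain Python (the dictionary layout is illustrative, not a VTK structure):

```python
# Only point/cell arrays present on every input survive in the appended
# output; this sketches that rule as a set intersection of array names.

def common_arrays(inputs):
    names = set(inputs[0])
    for arrays in inputs[1:]:
        names &= set(arrays)
    return sorted(names)

kept = common_arrays([
    {"Temperature": [1.0], "Pressure": [2.0]},
    {"Temperature": [3.0], "Velocity": [4.0]},
])
```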

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the datasets to be merged into a single dataset by the Append Datasets filter.

Accepts input of following types:

  • vtkDataSet

Append Geometry

Takes an input of multiple poly data parts and produces a single output part. The Append Geometry filter operates on multiple polygonal data sets. It merges their geometry into a single data set. Only the point and cell attributes that all of the input data sets have in common will appear in the output.

Property Description Default Value(s) Restrictions
Input (Input)

Set the input to the Append Geometry filter.

Accepts input of following types:

  • vtkPolyData

Block Scalars

The Level Scalars filter uses colors to show the levels of a multiblock dataset.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Level Scalars filter.

Accepts input of following types:

  • vtkMultiBlockDataSet

CTH Surface

Not finished yet.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input of the filter.

Accepts input of following types:

  • vtkCompositeDataSet

CacheKeeper

vtkPVCacheKeeper manages the data cache for flip-book animations. When caching is disabled, this simply acts as a pass-through filter. When caching is enabled, if the current time step has been previously cached then this filter short-circuits the update request; otherwise it propagates the update and then caches the result for later use. The current time step is set using SetCacheTime().

Property Description Default Value(s) Restrictions
Input (Input)

Set the input to the Update Suppressor filter.

CacheTime (CacheTime)

0.0

CachingEnabled (CachingEnabled)

Toggle whether the caching is enabled.

1

Accepts boolean values (0 or 1).

Calculator

Compute new attribute arrays as functions of existing arrays. The Calculator filter computes a new data array or new point coordinates as a function of existing scalar or vector arrays. If point-centered arrays are used in the computation of a new data array, the resulting array will also be point-centered. Similarly, computations using cell-centered arrays will produce a new cell-centered array. If the function is computing point coordinates, the result of the function must be a three-component vector.

The Calculator interface operates similarly to a scientific calculator. In creating the function to evaluate, the standard order of operations applies. Each of the calculator functions is described below. Unless otherwise noted, enclose the operand in parentheses using the ( and ) buttons.

  • Clear: Erase the current function (displayed in the read-only text box above the calculator buttons).
  • /: Divide one scalar by another. The operands for this function are not required to be enclosed in parentheses.
  • *: Multiply two scalars, or multiply a vector by a scalar (scalar multiple). The operands for this function are not required to be enclosed in parentheses.
  • -: Negate a scalar or vector (unary minus), or subtract one scalar or vector from another. The operands for this function are not required to be enclosed in parentheses.
  • +: Add two scalars or two vectors. The operands for this function are not required to be enclosed in parentheses.
  • sin: Compute the sine of a scalar.
  • cos: Compute the cosine of a scalar.
  • tan: Compute the tangent of a scalar.
  • asin: Compute the arcsine of a scalar.
  • acos: Compute the arccosine of a scalar.
  • atan: Compute the arctangent of a scalar.
  • sinh: Compute the hyperbolic sine of a scalar.
  • cosh: Compute the hyperbolic cosine of a scalar.
  • tanh: Compute the hyperbolic tangent of a scalar.
  • min: Compute the minimum of two scalars.
  • max: Compute the maximum of two scalars.
  • x^y: Raise one scalar to the power of another scalar. The operands for this function are not required to be enclosed in parentheses.
  • sqrt: Compute the square root of a scalar.
  • e^x: Raise e to the power of a scalar.
  • log: Compute the logarithm of a scalar (deprecated; same as log10).
  • log10: Compute the logarithm of a scalar to the base 10.
  • ln: Compute the logarithm of a scalar to the base 'e'.
  • ceil: Compute the ceiling of a scalar.
  • floor: Compute the floor of a scalar.
  • abs: Compute the absolute value of a scalar.
  • v1.v2: Compute the dot product of two vectors. The operands for this function are not required to be enclosed in parentheses.
  • cross: Compute the cross product of two vectors.
  • mag: Compute the magnitude of a vector.
  • norm: Normalize a vector.

The operands are described below. The digits 0 - 9 and the decimal point are used to enter constant scalar values. **iHat**, **jHat**, and **kHat** are vector constants representing unit vectors in the X, Y, and Z directions, respectively. The scalars menu lists the names of the scalar arrays and the components of the vector arrays of either the point-centered or cell-centered data. The vectors menu lists the names of the point-centered or cell-centered vector arrays. The function will be computed for each point (or cell) using the scalar or vector value of the array at that point (or cell). The filter operates on any type of data set, but the input data set must have at least one scalar or vector array. The arrays can be either point-centered or cell-centered. The Calculator filter's output is of the same data set type as the input.
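A few of the vector operations listed above (mag, norm, dot, cross) can be sketched in plain Python for one point-centered vector array. This is an illustration of the math only, not ParaView's expression parser.

```python
import math

# Plain-Python equivalents of a few Calculator vector operations.

def mag(v):
    # Magnitude (Euclidean length) of a vector.
    return math.sqrt(sum(c * c for c in v))

def norm(v):
    # Normalize a vector to unit length.
    m = mag(v)
    return [c / m for c in v]

def dot(a, b):
    # Dot product of two vectors (the "v1.v2" button).
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    # Cross product of two 3-component vectors.
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

# Evaluate "mag(V)" for each point of a point-centered vector array.
velocity = [[3.0, 4.0, 0.0], [0.0, 0.0, 2.0]]
speeds = [mag(v) for v in velocity]
```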

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input dataset to the Calculator filter. The scalar and vector variables may be chosen from this dataset's arrays.

Accepts input of following types:

  • vtkDataSet

The dataset must contain a field array ()

AttributeMode (AttributeMode)

This property determines whether the computation is to be performed on point-centered or cell-centered data.

1

The value(s) is an enumeration of the following:

  • Point Data (1)
  • Cell Data (2)
CoordinateResults (CoordinateResults)

The value of this property determines whether the results of this computation should be used as point coordinates or as a new array.

0

Accepts boolean values (0 or 1).

ResultNormals (ResultNormals)

Set whether to output results as point/cell normals. Outputting as normals is only valid with vector results. Point or cell normals are selected using AttributeMode.

0

Accepts boolean values (0 or 1).

ResultTCoords (ResultTCoords)

Set whether to output results as point/cell texture coordinates. Point or cell texture coordinates are selected using AttributeMode. 2-component texture coordinates cannot be generated at this time.

0

Accepts boolean values (0 or 1).

ResultArrayName (ResultArrayName)

This property contains the name for the output array containing the result of this computation.

Result

Function (Function)

This property contains the equation for computing the new array.

Replace Invalid Results (ReplaceInvalidValues)

This property determines whether invalid values in the computation will be replaced with a specific value. (See the ReplacementValue property.)

1

Accepts boolean values (0 or 1).

ReplacementValue (ReplacementValue)

If invalid values in the computation are to be replaced with another value, this property contains that value.

0.0


Cell Centers

Create a point (no geometry) at the center of each input cell. The Cell Centers filter places a point at the center of each cell in the input data set. The center computed is the parametric center of the cell, not necessarily the geometric or bounding box center. The cell attributes of the input will be associated with these newly created points of the output. You have the option of creating a vertex cell per point in the output. This is useful because vertex cells are rendered, but points are not. The points themselves could be used for placing glyphs (using the Glyph filter). The Cell Centers filter takes any type of data set as input and produces a polygonal data set as output.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Cell Centers filter.

Accepts input of following types:

  • vtkDataSet
VertexCells (VertexCells)

If set to 1, a vertex cell will be generated per point in the output. Otherwise only points will be generated.

0

Accepts boolean values (0 or 1).

Cell Data to Point Data

Create point attributes by averaging cell attributes. The Cell Data to Point Data filter averages the values of the cell attributes of the cells surrounding a point to compute point attributes. The Cell Data to Point Data filter operates on any type of data set, and the output data set is of the same type as the input.
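The averaging described above can be sketched in plain Python, with cells given as lists of point ids. This is an illustration of the rule only, not VTK's implementation.

```python
# Each point receives the mean of the values of the cells that
# reference it (a sketch of what Cell Data to Point Data computes).

def cell_data_to_point_data(cells, cell_values, num_points):
    sums = [0.0] * num_points
    counts = [0] * num_points
    for cell, value in zip(cells, cell_values):
        for pid in cell:
            sums[pid] += value
            counts[pid] += 1
    return [s / c if c else 0.0 for s, c in zip(sums, counts)]

# Two line cells sharing point 1, with cell values 2.0 and 4.0:
# point 1 gets the average (3.0), points 0 and 2 keep their cell's value.
point_values = cell_data_to_point_data([[0, 1], [1, 2]], [2.0, 4.0], 3)
```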

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Cell Data to Point Data filter.

Accepts input of following types:

  • vtkDataSet

The dataset must contain a field array (cell)

PassCellData (PassCellData)

If this property is set to 1, then the input cell data is passed through to the output; otherwise, only the generated point data will be available in the output.

0

Accepts boolean values (0 or 1).

PieceInvariant (PieceInvariant)

If the value of this property is set to 1, this filter will request ghost levels so that the values at boundary points match across processes. NOTE: Enabling this option might cause multiple executions of the data source because more information is needed to remove internal surfaces.

0

Accepts boolean values (0 or 1).

Clean

Merge coincident points if they do not meet a feature edge criterion. The Clean filter takes polygonal data as input and generates polygonal data as output. This filter can merge duplicate points, remove unused points, and transform degenerate cells into their appropriate forms (e.g., a triangle is converted into a line if two of its points are merged).

Property Description Default Value(s) Restrictions
Input (Input)

Set the input to the Clean filter.

Accepts input of following types:

  • vtkPolyData
PieceInvariant (PieceInvariant)

If this property is set to 1, the whole data set will be processed at once so that cleaning the data set always produces the same results. If it is set to 0, the data set can be processed one piece at a time, so it is not necessary for the entire data set to fit into memory; however the results are not guaranteed to be the same as they would be if the Piece invariant option was on. Setting this option to 0 may produce seams in the output dataset when ParaView is run in parallel.

1

Accepts boolean values (0 or 1).

Tolerance (Tolerance)

If merging nearby points (see PointMerging property) and not using absolute tolerance (see ToleranceIsAbsolute property), this property specifies the tolerance for performing merging as a fraction of the length of the diagonal of the bounding box of the input data set.

0.0

AbsoluteTolerance (AbsoluteTolerance)

If merging nearby points (see PointMerging property) and using absolute tolerance (see ToleranceIsAbsolute property), this property specifies the tolerance for performing merging in the spatial units of the input data set.

1.0

ToleranceIsAbsolute (ToleranceIsAbsolute)

This property determines whether to use absolute or relative (a percentage of the bounding box) tolerance when performing point merging.

0

Accepts boolean values (0 or 1).
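How the relative versus absolute tolerance translates into an actual merge distance can be sketched as follows: the relative Tolerance is a fraction of the bounding-box diagonal of the input. A sketch of the rule only; the helper name is an assumption.

```python
import math

# Relative tolerance -> merge distance = tolerance * bounding-box diagonal;
# absolute tolerance -> merge distance used as-is (in spatial units).

def merge_distance(points, tolerance, absolute=False):
    if absolute:
        return tolerance
    mins = [min(p[i] for p in points) for i in range(3)]
    maxs = [max(p[i] for p in points) for i in range(3)]
    diagonal = math.sqrt(sum((hi - lo) ** 2 for lo, hi in zip(mins, maxs)))
    return tolerance * diagonal

# Bounding box (0,0,0)-(3,4,0) has diagonal length 5, so a relative
# tolerance of 0.1 merges points closer than 0.5 spatial units.
pts = [(0.0, 0.0, 0.0), (3.0, 4.0, 0.0)]
dist = merge_distance(pts, 0.1)
```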

ConvertLinesToPoints (ConvertLinesToPoints)

If this property is set to 1, degenerate lines (a "line" whose endpoints are at the same spatial location) will be converted to points.

1

Accepts boolean values (0 or 1).

ConvertPolysToLines (ConvertPolysToLines)

If this property is set to 1, degenerate polygons (a "polygon" with only two distinct point coordinates) will be converted to lines.

1

Accepts boolean values (0 or 1).

ConvertStripsToPolys (ConvertStripsToPolys)

If this property is set to 1, degenerate triangle strips (a triangle "strip" containing only one triangle) will be converted to triangles.

1

Accepts boolean values (0 or 1).

PointMerging (PointMerging)

If this property is set to 1, then points will be merged if they are within the specified Tolerance or AbsoluteTolerance (see the Tolerance and AbsoluteTolerance properties), depending on the value of the ToleranceIsAbsolute property. If this property is set to 0, points will not be merged.

1

Accepts boolean values (0 or 1).

Clean Cells to Grid

This filter merges cells and converts the data set to an unstructured grid. Merges degenerate cells. Assumes the input grid does not contain duplicate points; you may want to run vtkCleanUnstructuredGrid first to ensure this. If duplicate cells are found, they are removed from the output. The filter also handles the case where a cell contains degenerate nodes (i.e., one and the same node is referenced by a cell more than once).

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Clean Cells to Grid filter.

Accepts input of following types:

  • vtkUnstructuredGrid

Clean to Grid

This filter merges points and converts the data set to an unstructured grid. The Clean to Grid filter merges points that are exactly coincident. It also converts the data set to an unstructured grid. You may wish to do this if you want to apply a filter to your data set that is available for unstructured grids but not for the initial type of your data set (e.g., applying warp vector to volumetric data). The Clean to Grid filter operates on any type of data set.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Clean to Grid filter.

Accepts input of following types:

  • vtkDataSet

ClientServerMoveData

Property Description Default Value(s) Restrictions
Input (Input)

Set the input to the Client Server Move Data filter.

OutputDataType (OutputDataType)

0

WholeExtent (WholeExtent)

0 -1 0 -1 0 -1


Clip

Clip with an implicit plane. Clipping does not reduce the dimensionality of the data set. The output data type of this filter is always an unstructured grid. The Clip filter cuts away a portion of the input data set using an implicit plane. This filter operates on all types of data sets, and it returns unstructured grid data on output.
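The point-level test behind plane clipping can be sketched as a signed-distance check: a point is kept when it lies on the requested side of the plane, and InsideOut flips which side is kept. This is an illustration only; the real filter also splits the cells the plane cuts.

```python
# Signed distance of a point to an implicit plane defined by an
# origin and a normal; points with a non-negative value are "inside".

def keep_point(p, origin, normal, inside_out=False):
    signed = sum(n * (c - o) for n, c, o in zip(normal, p, origin))
    kept = signed >= 0.0
    # InsideOut returns the portion that lies on the other side.
    return not kept if inside_out else kept

origin, normal = (0.0, 0.0, 0.0), (1.0, 0.0, 0.0)
kept = [keep_point(p, origin, normal) for p in [(1, 0, 0), (-1, 0, 0)]]
```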

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the dataset on which the Clip filter will operate.

Accepts input of following types:

  • vtkDataSet

The dataset must contain a field array ()

with 1 component(s).

Clip Type (ClipFunction)

This property specifies the parameters of the clip function (an implicit plane) used to clip the dataset.

The value can be one of the following:

  • Plane (implicit_functions)
  • Box (implicit_functions)
  • Sphere (implicit_functions)
  • Cylinder (implicit_functions)
  • Scalar (implicit_functions)
InputBounds (InputBounds)
Scalars (SelectInputScalars)

If clipping with scalars, this property specifies the name of the scalar array on which to perform the clip operation.

An array of scalars is required.The value must be field array name.

Value (Value)

If clipping with scalars, this property sets the scalar value about which to clip the dataset based on the scalar array chosen. (See SelectInputScalars.) If clipping with a clip function, this property specifies an offset from the clip function to use in the clipping operation. Neither functionality is currently available in ParaView's user interface.

0.0

The value must lie within the range of the selected data array.

InsideOut (InsideOut)

If this property is set to 0, the clip filter will return that portion of the dataset that lies within the clip function. If set to 1, the portions of the dataset that lie outside the clip function will be returned instead.

0

Accepts boolean values (0 or 1).

UseValueAsOffset (UseValueAsOffset)

If UseValueAsOffset is true, Value is used as an offset parameter to the implicit function. Otherwise, Value is used only when clipping using a scalar array.

0

Accepts boolean values (0 or 1).

Crinkle clip (PreserveInputCells)

This parameter controls whether to extract entire cells in the given region or to clip those cells so that the output stays entirely inside that region.

0

Accepts boolean values (0 or 1).

Clip Closed Surface

Clip a polygonal dataset with a plane to produce closed surfaces. This clip filter cuts away a portion of the input polygonal dataset using a plane to generate a new polygonal dataset.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the dataset on which the Clip filter will operate.

Accepts input of following types:

  • vtkPolyData

The dataset must contain a field array (point)

with 1 component(s).

Clipping Plane (ClippingPlane)

This property specifies the parameters of the clipping plane used to clip the polygonal data.

The value can be one of the following:

  • Plane (implicit_functions)
GenerateFaces (GenerateFaces)

Generate polygonal faces in the output.

1

Accepts boolean values (0 or 1).

GenerateOutline (GenerateOutline)

Generate clipping outlines in the output wherever an input face is cut by the clipping plane.

0

Accepts boolean values (0 or 1).

Generate Cell Origins (ScalarMode)

Generate (cell) data for coloring purposes such that the newly generated cells (including capping faces and clipping outlines) can be distinguished from the input cells.

0

The value(s) is an enumeration of the following:

  • None (0)
  • Color (1)
  • Label (2)
InsideOut (InsideOut)

If this flag is turned off, the clipper will return the portion of the data that lies within the clipping plane. Otherwise, the clipper will return the portion of the data that lies outside the clipping plane.

0

Accepts boolean values (0 or 1).

Clipping Tolerance (Tolerance)

Specify the tolerance for creating new points. A small value might incur degenerate triangles.

0.000001

Base Color (BaseColor)

Specify the color for the faces from the input.

0.10 0.10 1.00

Clip Color (ClipColor)

Specify the color for the capping faces (generated on the clipping interface).

1.00 0.11 0.10


Clip Generic Dataset

Clip with an implicit plane, sphere or with scalars. Clipping does not reduce the dimensionality of the data set. The output data type of this filter is always an unstructured grid. The Generic Clip filter cuts away a portion of the input data set using a plane, a sphere, a box, or a scalar value. The menu in the Clip Function portion of the interface allows the user to select which implicit function to use or whether to clip using a scalar value. Making this selection loads the appropriate user interface. For the implicit functions, the appropriate 3D widget (plane, sphere, or box) is also displayed. The use of these 3D widgets, including their user interface components, is discussed in section 7.4. If an implicit function is selected, the clip filter returns that portion of the input data set that lies inside the function. If Scalars is selected, then the user must specify a scalar array to clip according to. The clip filter will return the portions of the data set whose value in the selected Scalars array is larger than the Clip value. Regardless of the selection from the Clip Function menu, if the Inside Out option is checked, the opposite portions of the data set will be returned. This filter operates on all types of data sets, and it returns unstructured grid data on output.

Property Description Default Value(s) Restrictions
Input (Input)

Set the input to the Generic Clip filter.

Accepts input of following types:

  • vtkGenericDataSet

The dataset must contain a field array (point)

Clip Type (ClipFunction)

Set the parameters of the clip function.

The value can be one of the following:

  • Plane (implicit_functions)
  • Box (implicit_functions)
  • Sphere (implicit_functions)
  • Scalar (implicit_functions)
InputBounds (InputBounds)
Scalars (SelectInputScalars)

If clipping with scalars, this property specifies the name of the scalar array on which to perform the clip operation.

An array of scalars is required.The value must be field array name.

InsideOut (InsideOut)

Choose which portion of the dataset should be clipped away.

0

Accepts boolean values (0 or 1).

Value (Value)

If clipping with a scalar array, choose the clipping value.

0.0

The value must lie within the range of the selected data array.

Color By Array

This filter generates color-mapped image data based on a selected scalar array.

Property Description Default Value(s) Restrictions
Input (Input)

Accepts input of following types:

  • vtkImageData

The dataset must contain a field array (point)

with 1 component(s).

LookupTable (LookupTable)
Color By (SelectInputScalars)

This property specifies the name of the scalar array by which to color.

An array of scalars is required.The value must be field array name.

RGBA NaN Color (NaNColor)

0 0 0 255

OutputFormat (OutputFormat)

3

The value(s) is an enumeration of the following:

  • Luminance (1)
  • Luminance Alpha (2)
  • RGB (3)
  • RGBA (4)

Compute Derivatives

This filter computes derivatives of scalars and vectors. CellDerivatives is a filter that computes derivatives of scalars and vectors at the center of cells. You can choose to generate different output including the scalar gradient (a vector), computed tensor vorticity (a vector), gradient of input vectors (a tensor), and strain matrix of the input vectors (a tensor); or you may choose to pass data through to the output.
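The vorticity and strain outputs mentioned above can be sketched from a single cell's gradient tensor, where `J[i][j]` holds d(v_i)/d(x_j): vorticity is the curl of the vector field and strain is the symmetric part of the gradient. An illustration of the math only, not the filter's code.

```python
# Vorticity (curl) and strain from a 3x3 vector-gradient tensor J,
# where J[i][j] = d(v_i)/d(x_j).

def vorticity(J):
    return [J[2][1] - J[1][2],
            J[0][2] - J[2][0],
            J[1][0] - J[0][1]]

def strain(J):
    # Symmetric part of the gradient tensor.
    return [[0.5 * (J[i][j] + J[j][i]) for j in range(3)] for i in range(3)]

# Rigid rotation about z, velocity (-y, x, 0): its gradient tensor is
J = [[0.0, -1.0, 0.0],
     [1.0,  0.0, 0.0],
     [0.0,  0.0, 0.0]]
w = vorticity(J)   # twice the angular velocity; strain is zero for rigid rotation
```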

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the filter.

Accepts input of following types:

  • vtkDataSet

The dataset must contain a field array (point)

with 1 component(s).

The dataset must contain a field array (point)

with 3 component(s).

Scalars (SelectInputScalars)

This property indicates the name of the scalar array to differentiate.

An array of scalars is required.

Vectors (SelectInputVectors)

This property indicates the name of the vector array to differentiate.

1

An array of vectors is required.

OutputVectorType (OutputVectorType)

This property controls how the filter works to generate vector cell data. You can choose to compute the gradient of the input scalars, or extract the vorticity of the computed vector gradient tensor. By default, the filter will take the gradient of the input scalar data.

1

The value(s) is an enumeration of the following:

  • Nothing (0)
  • Scalar Gradient (1)
  • Vorticity (2)
OutputTensorType (OutputTensorType)

This property controls how the filter works to generate tensor cell data. You can choose to compute the gradient of the input vectors, or compute the strain tensor of the vector gradient tensor. By default, the filter will take the gradient of the vector data to construct a tensor.

1

The value(s) is an enumeration of the following:

  • Nothing (0)
  • Vector Gradient (1)
  • Strain (2)

Compute Quartiles

Compute the quartiles table from a dataset or table.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the filter.

Accepts input of following types:

  • vtkDataObject

Connectivity

Mark connected components with an integer point attribute array. The Connectivity filter assigns a region id to connected components of the input data set. (The region id is assigned as a point scalar value.) This filter takes any data set type as input and produces unstructured grid output.
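The region labeling described above can be sketched as a breadth-first search over cells that share points, after which each point receives its component's region id. An illustration of the idea only, not VTK's implementation.

```python
from collections import deque

# Label connected components of a cell mesh (cells = lists of point ids)
# and assign each point its component's region id.

def region_ids(cells, num_points):
    # Map each point to the cells touching it.
    point_cells = [[] for _ in range(num_points)]
    for ci, cell in enumerate(cells):
        for pid in cell:
            point_cells[pid].append(ci)
    cell_region = [-1] * len(cells)
    region = 0
    for seed in range(len(cells)):
        if cell_region[seed] != -1:
            continue
        queue = deque([seed])
        cell_region[seed] = region
        while queue:
            ci = queue.popleft()
            for pid in cells[ci]:
                for nb in point_cells[pid]:   # cells sharing a point connect
                    if cell_region[nb] == -1:
                        cell_region[nb] = region
                        queue.append(nb)
        region += 1
    # Store the region id as a per-point value.
    ids = [0] * num_points
    for ci, cell in enumerate(cells):
        for pid in cell:
            ids[pid] = cell_region[ci]
    return ids

# Two triangles with no shared points form two regions.
ids = region_ids([[0, 1, 2], [3, 4, 5]], 6)
```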

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Connectivity filter.

Accepts input of following types:

  • vtkDataSet
ExtractionMode (ExtractionMode)

Controls the extraction of connected surfaces.

5

The value(s) is an enumeration of the following:

  • Extract Point Seeded Regions (1)
  • Extract Cell Seeded Regions (2)
  • Extract Specified Regions (3)
  • Extract Largest Region (4)
  • Extract All Regions (5)
  • Extract Closest Point Region (6)
ColorRegions (ColorRegions)

Controls the coloring of the connected regions.

1

Accepts boolean values (0 or 1).

Contingency Statistics

Compute a statistical model of a dataset and/or assess the dataset with a statistical model. This filter either computes a statistical model of a dataset or takes such a model as its second input. Then, the model (however it is obtained) may optionally be used to assess the input dataset. This filter computes contingency tables between pairs of attributes. This result is a tabular bivariate probability distribution which serves as a Bayesian-style prior model. Data is assessed by computing:

  • the probability of observing both variables simultaneously;
  • the probability of each variable conditioned on the other (the two values need not be identical); and
  • the pointwise mutual information (PMI).

Finally, the summary statistics include the information entropy of the observations.
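The pointwise mutual information mentioned above is pmi(x, y) = log2(p(x, y) / (p(x) p(y))), which can be sketched directly from a list of paired observations. An illustration of the statistic only, not the filter's implementation.

```python
import math
from collections import Counter

# PMI of one (x, y) pair from observed co-occurrences:
# pmi(x, y) = log2( p(x, y) / (p(x) * p(y)) ).

def pmi(pairs, x, y):
    n = len(pairs)
    joint = Counter(pairs)
    px = sum(1 for a, _ in pairs if a == x) / n
    py = sum(1 for _, b in pairs if b == y) / n
    pxy = joint[(x, y)] / n
    return math.log2(pxy / (px * py))

# Perfectly correlated observations give positive PMI.
pairs = [("hot", "dry"), ("hot", "dry"), ("cold", "wet"), ("cold", "wet")]
value = pmi(pairs, "hot", "dry")
```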

Property Description Default Value(s) Restrictions
Input (Input)

The input to the filter. Arrays from this dataset will be used for computing statistics and/or assessed by a statistical model.

Accepts input of following types:

  • vtkImageData
  • vtkStructuredGrid
  • vtkPolyData
  • vtkUnstructuredGrid
  • vtkTable
  • vtkGraph

The dataset must contain a field array ()

ModelInput (ModelInput)

A previously-calculated model with which to assess a separate dataset. This input is optional.

Accepts input of following types:

  • vtkTable
  • vtkMultiBlockDataSet
AttributeMode (AttributeMode)

Specify which type of field data the arrays will be drawn from.

0

The value must be field array name.

Variables of Interest (SelectArrays)

Choose arrays whose entries will be used to form observations for statistical analysis.

Task (Task)

Specify the task to be performed: modeling and/or assessment.

  1. "Detailed model of input data" creates a set of output tables containing a calculated statistical model of the **entire** input dataset.
  2. "Model a subset of the data" creates an output table (or tables) summarizing a **randomly-chosen subset** of the input dataset.
  3. "Assess the data with a model" adds attributes to the first input dataset using a model provided on the second input port.
  4. "Model and assess the same data" is really just operations 2 and 3 above applied to the same input dataset. The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.

When the task includes creating a model (i.e., tasks 2 and 4), you may adjust the fraction of the input dataset used for training. You should avoid using a large fraction of the input data for training as you will then not be able to detect overfitting. The *Training fraction* setting will be ignored for tasks 1 and 3.

3

The value(s) is an enumeration of the following:

  • Detailed model of input data (0)
  • Model a subset of the data (1)
  • Assess the data with a model (2)
  • Model and assess the same data (3)
TrainingFraction (TrainingFraction)

Specify the fraction of values from the input dataset to be used for model fitting. The exact set of values is chosen at random from the dataset.

0.1


Contour

Generate isolines or isosurfaces using point scalars. The Contour filter computes isolines or isosurfaces using a selected point-centered scalar array. The Contour filter operates on any type of data set, but the input is required to have at least one point-centered scalar (single-component) array. The output of this filter is polygonal.
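The core geometric step of contouring can be sketched as follows: where a cell edge's scalar values straddle the contour value, a point is placed on the edge by linear interpolation. A sketch of that rule only, not VTK's marching-cubes case tables.

```python
# Linear interpolation of the contour crossing on one cell edge.

def contour_crossing(p0, p1, s0, s1, value):
    """Return the point on edge (p0, p1) where the scalar equals value,
    or None if the edge is not crossed."""
    if (s0 - value) * (s1 - value) > 0:
        return None                       # both endpoints on one side
    t = (value - s0) / (s1 - s0)
    return tuple(a + t * (b - a) for a, b in zip(p0, p1))

# Scalar goes from 0 to 4 along a unit edge; the isovalue 1 sits a
# quarter of the way along it.
pt = contour_crossing((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), 0.0, 4.0, 1.0)
```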

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input dataset to be used by the contour filter.

Accepts input of following types:

  • vtkDataSet

The dataset must contain a field array (point)

with 1 component(s).

Contour By (SelectInputScalars)

This property specifies the name of the scalar array from which the contour filter will compute isolines and/or isosurfaces.

An array of scalars is required.The value must be field array name.

ComputeNormals (ComputeNormals)

If this property is set to 1, an array containing a normal value at each point in the isosurface or isoline will be created by the contour filter; otherwise an array of normals will not be computed. This operation is fairly expensive both in terms of computation time and memory required, so if the output dataset produced by the contour filter will be processed by filters that modify the dataset's topology or geometry, it may be wise to set the value of this property to 0.

1

Accepts boolean values (0 or 1).

ComputeGradients (ComputeGradients)

If this property is set to 1, a scalar array containing a gradient value at each point in the isosurface or isoline will be created by this filter; otherwise an array of gradients will not be computed. This operation is fairly expensive both in terms of computation time and memory required, so if the output dataset produced by the contour filter will be processed by filters that modify the dataset's topology or geometry, it may be wise to set the value of this property to 0. Note that if ComputeNormals is set to 1, then gradients will have to be calculated, but they will only be stored in the output dataset if ComputeGradients is also set to 1.

0

Accepts boolean values (0 or 1).

ComputeScalars (ComputeScalars)

If this property is set to 1, an array of scalars (containing the contour value) will be added to the output dataset. If set to 0, the output will not contain this array.

0

Accepts boolean values (0 or 1).

OutputPointsPrecision (OutputPointsPrecision)

Select the output precision of the coordinates. **Single** sets the output to single-precision floating-point (i.e., float), **Double** sets it to double-precision floating-point (i.e., double), and **Default** sets it to the same precision as the points in the input. Defaults to **Single**.

0

The value(s) is an enumeration of the following:

  • Single (0)
  • Double (1)
  • Same as input (2)
GenerateTriangles (GenerateTriangles)

This parameter controls whether to produce triangles in the output. Warning: Many filters do not properly handle non-triangular polygons.

1

Accepts boolean values (0 or 1).

Isosurfaces (ContourValues)

This property specifies the values at which to compute isosurfaces/isolines and also the number of such values.

The value must lie within the range of the selected data array.

Point Merge Method (Locator)

This property specifies an incremental point locator for merging duplicate / coincident points.

The value can be one of the following:

  • MergePoints (incremental_point_locators)
  • IncrementalOctreeMergePoints (incremental_point_locators)
  • NonMergingPointLocator (incremental_point_locators)
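The core of any such contour filter is finding where the isovalue crosses each cell edge by linear interpolation between the endpoint scalars. A minimal plain-Python sketch of that step (the function name is hypothetical; this is not the ParaView API):

```python
def edge_crossing(p0, p1, s0, s1, isovalue):
    """Linearly interpolate the point where a contour at `isovalue`
    crosses the edge from p0 (scalar s0) to p1 (scalar s1).
    Returns None when the edge is not crossed."""
    if (s0 - isovalue) * (s1 - isovalue) > 0:
        return None  # both endpoints on the same side of the isovalue
    if s0 == s1:
        return None  # degenerate: edge lies entirely on the isovalue
    t = (isovalue - s0) / (s1 - s0)
    return tuple(a + t * (b - a) for a, b in zip(p0, p1))

# Scalars 0 and 10 at the edge endpoints, isovalue 2.5 -> crossing at t = 0.25:
print(edge_crossing((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), 0.0, 10.0, 2.5))  # (0.25, 0.0, 0.0)
```

Marching-squares/cubes-style algorithms then assemble these per-edge crossings into the output isolines or isosurface.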


Contour Generic Dataset

Generate isolines or isosurfaces using point scalars.The Generic Contour filter computes isolines or isosurfaces using a selected point-centered scalar array. The available scalar arrays are listed in the Scalars menu. The scalar range of the selected array will be displayed. The interface for adding contour values is very similar to the one for selecting cut offsets (in the Cut filter). To add a single contour value, select the value from the New Value slider in the Add value portion of the interface and click the Add button, or press Enter. To instead add several evenly spaced contours, use the controls in the Generate range of values section. Select the number of contour values to generate using the Number of Values slider. The Range slider controls the interval in which to generate the contour values. Once the number of values and range have been selected, click the Generate button. The new values will be added to the Contour Values list. To delete a value from the Contour Values list, select the value and click the Delete button. (If no value is selected, the last value in the list will be removed.) Clicking the Delete All button removes all the values in the list. If no values are in the Contour Values list when Accept is pressed, the current value of the New Value slider will be used. In addition to selecting contour values, you can also select additional computations to perform. If any of Compute Normals, Compute Gradients, or Compute Scalars is selected, the appropriate computation will be performed, and a corresponding point-centered array will be added to the output. The Generic Contour filter operates on a generic data set, but the input is required to have at least one point-centered scalar (single-component) array. The output of this filter is polygonal.

Property Description Default Value(s) Restrictions
Input (Input)

Set the input to the Generic Contour filter.

Accepts input of following types:

  • vtkGenericDataSet

The dataset must contain a field array (point)

with 1 component(s).

Contour By (SelectInputScalars)

This property specifies the name of the scalar array from which the contour filter will compute isolines and/or isosurfaces.

An array of scalars is required. The value must be a field array name.

ComputeNormals (ComputeNormals)

Select whether to compute normals.

1

Accepts boolean values (0 or 1).

ComputeGradients (ComputeGradients)

Select whether to compute gradients.

0

Accepts boolean values (0 or 1).

ComputeScalars (ComputeScalars)

Select whether to compute scalars.

0

Accepts boolean values (0 or 1).

Isosurfaces (ContourValues)

This property specifies the values at which to compute isosurfaces/isolines and also the number of such values.

The value must lie within the range of the selected data array.

Point Merge Method (Locator)

This property specifies an incremental point locator for merging duplicate / coincident points.

The value can be one of the following:

  • MergePoints (incremental_point_locators)
  • IncrementalOctreeMergePoints (incremental_point_locators)
  • NonMergingPointLocator (incremental_point_locators)


Convert AMR dataset to Multi-block

Convert AMR to Multiblock

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input for this filter.

Accepts input of following types:

  • vtkOverlappingAMR

ConvertSelection

Converts a selection from one type to another.

Property Description Default Value(s) Restrictions
DataInput (DataInput)

Set the vtkDataObject input used to convert the selection.

Accepts input of following types:

  • vtkDataObject
Input (Input)

Set the selection to convert.

Accepts input of following types:

  • vtkSelection
OutputType (OutputType)

Set the ContentType for the output.

5

The value(s) is an enumeration of the following:

  • SELECTIONS (0)
  • GLOBALIDs (1)
  • PEDIGREEIDS (2)
  • VALUES (3)
  • INDICES (4)
  • FRUSTUM (5)
  • LOCATION (6)
  • THRESHOLDS (7)
ArrayNames (ArrayNames)
MatchAnyValues (MatchAnyValues)

0

Accepts boolean values (0 or 1).

Crop

Efficiently extract an area/volume of interest from a 2-d image or 3-d volume.The Crop filter extracts an area/volume of interest from a 2D image or a 3D volume by allowing the user to specify the minimum and maximum extents of each dimension of the data. Both the input and output of this filter are uniform rectilinear data.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Crop filter.

Accepts input of following types:

  • vtkImageData
OutputWholeExtent (OutputWholeExtent)

This property gives the minimum and maximum point index (extent) in each dimension for the output dataset.

0 0 0 0 0 0

The value(s) must lie within the structured-extents of the input dataset.
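Structured extents are inclusive point-index ranges, so the number of points along each axis is max − min + 1; an output extent of (10, 19, 0, 4, 0, 0) describes a 10 x 5 x 1 slab. A small sketch of that arithmetic (hypothetical helper, not the ParaView API):

```python
def extent_dims(extent):
    """Point counts per axis for a VTK-style inclusive extent
    (imin, imax, jmin, jmax, kmin, kmax)."""
    imin, imax, jmin, jmax, kmin, kmax = extent
    return (imax - imin + 1, jmax - jmin + 1, kmax - kmin + 1)

# Cropping to extent (10, 19, 0, 4, 0, 0) keeps a 10 x 5 x 1 slab of points:
print(extent_dims((10, 19, 0, 4, 0, 0)))  # (10, 5, 1)
```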

Curvature

This filter will compute the Gaussian or mean curvature of the mesh at each point.The Curvature filter computes the curvature at each point in a polygonal data set. This filter supports both Gaussian and mean curvatures; the type can be selected from the Curvature type menu button.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Curvature filter.

Accepts input of following types:

  • vtkPolyData
InvertMeanCurvature (InvertMeanCurvature)

If this property is set to 1, the mean curvature calculation will be inverted. This is useful for meshes with inward-pointing normals.

0

Accepts boolean values (0 or 1).

CurvatureType (CurvatureType)

This property specifies which type of curvature to compute.

0

The value(s) is an enumeration of the following:

  • Gaussian (0)
  • Mean (1)
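A common discrete form of Gaussian curvature, and the intuition behind this filter, is the angle deficit at a vertex: 2π minus the sum of the triangle angles meeting there (usually normalized by an associated area, which this simplified sketch omits; the function is hypothetical, not the VTK implementation):

```python
import math

def angle_deficit(apex, fan):
    """Discrete Gaussian-curvature indicator at `apex`: 2*pi minus the
    sum of the triangle angles meeting there. `fan` lists the ring of
    neighbor vertices in order; consecutive pairs form triangles."""
    total = 0.0
    for a, b in zip(fan, fan[1:]):
        u = [x - y for x, y in zip(a, apex)]
        v = [x - y for x, y in zip(b, apex)]
        cos_theta = (sum(x * y for x, y in zip(u, v))
                     / (math.hypot(*u) * math.hypot(*v)))
        total += math.acos(cos_theta)
    return 2 * math.pi - total

# A flat fan of four right angles has zero deficit (zero curvature):
flat = [(1, 0, 0), (0, 1, 0), (-1, 0, 0), (0, -1, 0), (1, 0, 0)]
print(round(angle_deficit((0, 0, 0), flat), 6))  # 0.0
```

A convex corner (angles summing to less than 2π) yields a positive deficit, i.e. positive Gaussian curvature.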

D3

Repartition a data set into load-balanced spatially convex regions. Create ghost cells if requested.The D3 filter is available when ParaView is run in parallel. It operates on any type of data set to evenly divide it across the processors into spatially contiguous regions. The output of this filter is of type unstructured grid.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the D3 filter.

Accepts input of following types:

  • vtkDataSet
BoundaryMode (BoundaryMode)

This property determines how cells that lie on processor boundaries are handled. The "Assign cells uniquely" option assigns each boundary cell to exactly one process, which is useful for isosurfacing. Selecting "Duplicate cells" causes the cells on the boundaries to be copied to each process that shares that boundary. The "Divide cells" option breaks cells across process boundary lines so that pieces of the cell lie in different processes. This option is useful for volume rendering.

0

The value(s) is an enumeration of the following:

  • Assign cells uniquely (0)
  • Duplicate cells (1)
  • Divide cells (2)
Minimal Memory (UseMinimalMemory)

If this property is set to 1, the D3 filter requires its communication routines to use less memory than they would without this restriction.

0

Accepts boolean values (0 or 1).

Decimate

Simplify a polygonal model using an adaptive edge collapse algorithm. This filter works with triangles only. The Decimate filter reduces the number of triangles in a polygonal data set. Because this filter only operates on triangles, first run the Triangulate filter on a dataset that contains polygons other than triangles.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Decimate filter.

Accepts input of following types:

  • vtkPolyData
TargetReduction (TargetReduction)

This property specifies the desired reduction in the total number of polygons in the output dataset. For example, if the TargetReduction value is 0.9, the Decimate filter will attempt to produce an output dataset that is 10% the size of the input.

0.9

PreserveTopology (PreserveTopology)

If this property is set to 1, decimation will not split the dataset or produce holes, but it may keep the filter from reaching the reduction target. If it is set to 0, better reduction can occur (reaching the reduction target), but holes in the model may be produced.

0

Accepts boolean values (0 or 1).

FeatureAngle (FeatureAngle)

The value of this property is used in determining where the data set may be split. If the angle between two adjacent triangles is greater than or equal to the FeatureAngle value, then their boundary is considered a feature edge where the dataset can be split.

15.0

BoundaryVertexDeletion (BoundaryVertexDeletion)

If this property is set to 1, then vertices on the boundary of the dataset can be removed. Setting the value of this property to 0 preserves the boundary of the dataset, but it may cause the filter not to reach its reduction target.

1

Accepts boolean values (0 or 1).
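As the TargetReduction description above notes, the reduction value is the fraction of triangles to remove; a hypothetical helper makes the arithmetic explicit:

```python
def target_triangle_count(input_triangles, target_reduction):
    """Triangle count the Decimate filter aims for: TargetReduction is
    the fraction of input triangles to remove (0.9 -> keep ~10%)."""
    return round(input_triangles * (1.0 - target_reduction))

print(target_triangle_count(10000, 0.9))  # 1000
```

The actual filter treats this as a goal, not a guarantee; PreserveTopology and BoundaryVertexDeletion can keep it from reaching the target.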

Delaunay 2D

Create 2D Delaunay triangulation of input points. It expects a vtkPointSet as input and produces vtkPolyData as output. The points are expected to be in a mostly planar distribution. Delaunay2D is a filter that constructs a 2D Delaunay triangulation from a list of input points. These points may be represented by any dataset of type vtkPointSet and subclasses. The output of the filter is a polygonal dataset containing a triangle mesh. The 2D Delaunay triangulation is defined as the triangulation that satisfies the Delaunay criterion for n-dimensional simplexes (in this case n=2 and the simplexes are triangles). This criterion states that a circumsphere of each simplex in a triangulation contains only the n+1 defining points of the simplex. In two dimensions, this translates into an optimal triangulation. That is, the maximum interior angle of any triangle is less than or equal to that of any possible triangulation. Delaunay triangulations are used to build topological structures from unorganized (or unstructured) points. The input to this filter is a list of points specified in 3D, even though the triangulation is 2D. Thus the triangulation is constructed in the x-y plane, and the z coordinate is ignored (although carried through to the output). You can use the ProjectionPlaneMode option to compute the best-fitting plane to the set of points, project the points onto that plane, and then perform the triangulation using their projected positions. The Delaunay triangulation can be numerically sensitive in some cases. To prevent problems, try to avoid injecting points that will result in triangles with bad aspect ratios (1000:1 or greater). In practice this means inserting points that are "widely dispersed", which enables a smooth transition of triangle sizes throughout the mesh. (You may even want to add extra points to create a better point distribution.)
If numerical problems are present, you will see a warning message to this effect at the end of the triangulation process. Warning: Points arranged on a regular lattice (termed degenerate cases) can be triangulated in more than one way (at least according to the Delaunay criterion). The choice of triangulation (as implemented by this algorithm) depends on the order of the input points. The first three points will form a triangle; other degenerate points will not break this triangle. Points that are coincident (or nearly so) may be discarded by the algorithm. This is because the Delaunay triangulation requires unique input points. The output of the Delaunay triangulation is supposedly a convex hull. In certain cases this implementation may not generate the convex hull.
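The Delaunay criterion described above can be tested directly with the classic in-circumcircle determinant; a plain-Python sketch for the 2D case (illustrative only, not the VTK implementation):

```python
def in_circumcircle(a, b, c, d):
    """True if 2D point d lies strictly inside the circumcircle of the
    counterclockwise triangle (a, b, c), i.e. the configuration would
    violate the Delaunay criterion."""
    m = [[px - d[0], py - d[1], (px - d[0]) ** 2 + (py - d[1]) ** 2]
         for px, py in (a, b, c)]
    det = (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
           - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
           + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    return det > 0

# The circumcircle of the unit right triangle is centered at (0.5, 0.5):
print(in_circumcircle((0, 0), (1, 0), (0, 1), (0.5, 0.5)))  # True
print(in_circumcircle((0, 0), (1, 0), (0, 1), (2.0, 2.0)))  # False
```

A determinant of exactly zero corresponds to the degenerate (co-circular lattice) cases the warning above describes, where more than one triangulation satisfies the criterion.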

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input dataset to the Delaunay 2D filter.

Accepts input of following types:

  • vtkPointSet
ProjectionPlaneMode (ProjectionPlaneMode)

This property determines type of projection plane to use in performing the triangulation.

0

The value(s) is an enumeration of the following:

  • XY Plane (0)
  • Best-Fitting Plane (2)
Alpha (Alpha)

The value of this property controls the output of this filter. For a non-zero alpha value, only edges or triangles contained within a sphere centered at mesh vertices will be output. Otherwise, only triangles will be output.

0.0

Tolerance (Tolerance)

This property specifies a tolerance to control discarding of closely spaced points. This tolerance is specified as a fraction of the diagonal length of the bounding box of the points.

0.00001

Offset (Offset)

This property is a multiplier to control the size of the initial, bounding Delaunay triangulation.

1.0

BoundingTriangulation (BoundingTriangulation)

If this property is set to 1, bounding triangulation points (and associated triangles) are included in the output. These are introduced as an initial triangulation to begin the triangulation process. This feature is nice for debugging output.

0

Accepts boolean values (0 or 1).

Delaunay 3D

Create a 3D Delaunay triangulation of input points. It expects a vtkPointSet as input and produces vtkUnstructuredGrid as output.Delaunay3D is a filter that constructs a 3D Delaunay triangulation from a list of input points. These points may be represented by any dataset of type vtkPointSet and subclasses. The output of the filter is an unstructured grid dataset. Usually the output is a tetrahedral mesh, but if a non-zero alpha distance value is specified (called the "alpha" value), then only tetrahedra, triangles, edges, and vertices lying within the alpha radius are output. In other words, non-zero alpha values may result in arbitrary combinations of tetrahedra, triangles, lines, and vertices. (The notion of alpha value is derived from Edelsbrunner's work on "alpha shapes".) The 3D Delaunay triangulation is defined as the triangulation that satisfies the Delaunay criterion for n-dimensional simplexes (in this case n=3 and the simplexes are tetrahedra). This criterion states that a circumsphere of each simplex in a triangulation contains only the n+1 defining points of the simplex. (See text for more information.) While in two dimensions this translates into an "optimal" triangulation, this is not true in 3D, since a measurement for optimality in 3D is not agreed on. Delaunay triangulations are used to build topological structures from unorganized (or unstructured) points. The input to this filter is a list of points specified in 3D. (If you wish to create 2D triangulations see Delaunay2D.) The output is an unstructured grid. The Delaunay triangulation can be numerically sensitive. To prevent problems, try to avoid injecting points that will result in triangles with bad aspect ratios (1000:1 or greater). In practice this means inserting points that are "widely dispersed", and enables smooth transition of triangle sizes throughout the mesh. (You may even want to add extra points to create a better point distribution.) 
If numerical problems are present, you will see a warning message to this effect at the end of the triangulation process. Warning: Points arranged on a regular lattice (termed degenerate cases) can be triangulated in more than one way (at least according to the Delaunay criterion). The choice of triangulation (as implemented by this algorithm) depends on the order of the input points. The first four points will form a tetrahedron; other degenerate points (relative to this initial tetrahedron) will not break it. Points that are coincident (or nearly so) may be discarded by the algorithm. This is because the Delaunay triangulation requires unique input points. You can control the definition of coincidence with the "Tolerance" instance variable. The output of the Delaunay triangulation is supposedly a convex hull. In certain cases this implementation may not generate the convex hull. This behavior can be controlled by the Offset instance variable. Offset is a multiplier used to control the size of the initial triangulation. The larger the offset value, the more likely you will generate a convex hull; and the more likely you are to see numerical problems. The implementation of this algorithm varies from the 2D Delaunay algorithm (i.e., Delaunay2D) in an important way. When points are injected into the triangulation, the search for the enclosing tetrahedron is quite different. In the 3D case, the closest previously inserted point is found, and then the connected tetrahedra are searched to find the containing one. (In 2D, a "walk" towards the enclosing triangle is performed.) If the triangulation is Delaunay, then an enclosing tetrahedron will be found. However, in degenerate cases an enclosing tetrahedron may not be found and the point will be rejected.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input dataset to the Delaunay 3D filter.

Accepts input of following types:

  • vtkPointSet
Alpha (Alpha)

This property specifies the alpha (or distance) value to control the output of this filter. For a non-zero alpha value, only edges, faces, or tetra contained within the circumsphere (of radius alpha) will be output. Otherwise, only tetrahedra will be output.

0.0

Tolerance (Tolerance)

This property specifies a tolerance to control discarding of closely spaced points. This tolerance is specified as a fraction of the diagonal length of the bounding box of the points.

0.001

Offset (Offset)

This property specifies a multiplier to control the size of the initial, bounding Delaunay triangulation.

2.5

BoundingTriangulation (BoundingTriangulation)

This boolean controls whether bounding triangulation points (and associated triangles) are included in the output. (These are introduced as an initial triangulation to begin the triangulation process. This feature is nice for debugging output.)

0

Accepts boolean values (0 or 1).

AlphaTets (AlphaTets)

This boolean controls whether tetrahedra which satisfy the alpha criterion are output when alpha is non-zero.

1

Accepts boolean values (0 or 1).

AlphaTris (AlphaTris)

This boolean controls whether triangles which satisfy the alpha criterion are output when alpha is non-zero.

1

Accepts boolean values (0 or 1).

AlphaLines (AlphaLines)

This boolean controls whether lines which satisfy the alpha criterion are output when alpha is non-zero.

0

Accepts boolean values (0 or 1).

AlphaVerts (AlphaVerts)

This boolean controls whether vertices which satisfy the alpha criterion are output when alpha is non-zero.

0

Accepts boolean values (0 or 1).
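The alpha criterion behind the Alpha* options above keeps a simplex only when it fits inside a sphere of radius alpha; for a triangular face this means its circumradius (abc / 4A, with the area A from Heron's formula) must be below alpha. A hypothetical sketch, not the VTK code:

```python
import math

def triangle_circumradius(p, q, r):
    """Circumradius of a triangle: R = abc / (4 * area), with the
    area computed from the side lengths via Heron's formula."""
    a, b, c = math.dist(q, r), math.dist(p, r), math.dist(p, q)
    s = (a + b + c) / 2.0
    area = math.sqrt(s * (s - a) * (s - b) * (s - c))
    return a * b * c / (4.0 * area)

def passes_alpha(p, q, r, alpha):
    """Keep the face only if it fits in a sphere of radius alpha."""
    return triangle_circumradius(p, q, r) < alpha

# 3-4-5 right triangle: the circumradius is half the hypotenuse, 2.5.
tri = ((0.0, 0.0, 0.0), (3.0, 0.0, 0.0), (0.0, 4.0, 0.0))
print(triangle_circumradius(*tri))    # 2.5
print(passes_alpha(*tri, alpha=3.0))  # True
```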

Descriptive Statistics

Compute a statistical model of a dataset and/or assess the dataset with a statistical model. This filter either computes a statistical model of a dataset or takes such a model as its second input. Then, the model (however it is obtained) may optionally be used to assess the input dataset.

This filter computes the min, max, mean, raw moments M2 through M4, standard deviation, skewness, and kurtosis for each array you select.

The model is simply a univariate Gaussian distribution with the mean and standard deviation provided. Data is assessed using this model by detrending the data (i.e., subtracting the mean) and then dividing by the standard deviation. Thus the assessment is an array whose entries are the number of standard deviations from the mean that each input point lies.

Property Description Default Value(s) Restrictions
Input (Input)

The input to the filter. Arrays from this dataset will be used for computing statistics and/or assessed by a statistical model.

Accepts input of following types:

  • vtkImageData
  • vtkStructuredGrid
  • vtkPolyData
  • vtkUnstructuredGrid
  • vtkTable
  • vtkGraph

The dataset must contain a field array ()

ModelInput (ModelInput)

A previously-calculated model with which to assess a separate dataset. This input is optional.

Accepts input of following types:

  • vtkTable
  • vtkMultiBlockDataSet
AttributeMode (AttributeMode)

Specify which type of field data the arrays will be drawn from.

0

The value must be a field array name.

Variables of Interest (SelectArrays)

Choose arrays whose entries will be used to form observations for statistical analysis.

Task (Task)

Specify the task to be performed: modeling and/or assessment.

  1. "Detailed model of input data" creates a set of output tables containing a calculated statistical model of the entire input dataset.
  2. "Model a subset of the data" creates an output table (or tables) summarizing a randomly-chosen subset of the input dataset.
  3. "Assess the data with a model" adds attributes to the first input dataset using a model provided on the second input port.
  4. "Model and assess the same data" is really just operations 2 and 3 above applied to the same input dataset. The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.

When the task includes creating a model (i.e., tasks 2 and 4), you may adjust the fraction of the input dataset used for training. You should avoid using a large fraction of the input data for training, as you will then not be able to detect overfitting. The Training fraction setting will be ignored for tasks 1 and 3.

3

The value(s) is an enumeration of the following:

  • Detailed model of input data (0)
  • Model a subset of the data (1)
  • Assess the data with a model (2)
  • Model and assess the same data (3)
TrainingFraction (TrainingFraction)

Specify the fraction of values from the input dataset to be used for model fitting. The exact set of values is chosen at random from the dataset.

0.1

Deviations should be (SignedDeviations)

Should the assessed values be signed deviations or unsigned?

0

The value(s) is an enumeration of the following:

  • Unsigned (0)
  • Signed (1)
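For a single array, the model and assessment described above reduce to computing a mean and standard deviation and then reporting each value's (signed or unsigned) deviation in standard-deviation units. A plain-Python sketch, not the VTK statistics engine (this uses the population standard deviation; the filter's exact conventions may differ):

```python
import math

def descriptive_model(values):
    """Univariate Gaussian model: (mean, standard deviation)."""
    n = len(values)
    mean = sum(values) / n
    variance = sum((v - mean) ** 2 for v in values) / n
    return mean, math.sqrt(variance)

def assess(values, mean, std, signed=False):
    """Deviation of each value from the model, in standard deviations."""
    deviations = [(v - mean) / std for v in values]
    return deviations if signed else [abs(d) for d in deviations]

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
mean, std = descriptive_model(data)
print(mean, std)                         # 5.0 2.0
print(assess(data, mean, std, True)[0])  # -1.5
```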

Elevation

Create point attribute array by projecting points onto an elevation vector. The Elevation filter generates point scalar values for an input dataset along a specified direction vector. The Input menu allows the user to select the data set to which this filter will be applied. Use the Scalar range entry boxes to specify the minimum and maximum scalar value to be generated. The Low Point and High Point define a line onto which each point of the data set is projected. The minimum scalar value is associated with the Low Point, and the maximum scalar value is associated with the High Point. The scalar value for each point in the data set is determined by the location along the line to which that point projects.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input dataset to the Elevation filter.

Accepts input of following types:

  • vtkDataSet
ScalarRange (ScalarRange)

This property determines the range into which scalars will be mapped.

0 1

Low Point (LowPoint)

This property defines one end of the direction vector (small scalar values).

0 0 0

The value must lie within the bounding box of the dataset.

It will default to the min in each dimension.

High Point (HighPoint)

This property defines the other end of the direction vector (large scalar values).

0 0 1

The value must lie within the bounding box of the dataset.

It will default to the max in each dimension.
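The scalar for each point comes from projecting it onto the Low Point to High Point line and mapping the (clamped) parametric coordinate into the scalar range; a hypothetical sketch of that mapping, not the VTK implementation:

```python
def elevation_scalar(point, low, high, scalar_range=(0.0, 1.0)):
    """Project `point` onto the low->high vector and map the clamped
    parametric coordinate into `scalar_range`."""
    direction = [h - l for h, l in zip(high, low)]
    length2 = sum(d * d for d in direction)
    t = sum((p - l) * d for p, l, d in zip(point, low, direction)) / length2
    t = min(1.0, max(0.0, t))  # points beyond the endpoints are clamped
    lo, hi = scalar_range
    return lo + t * (hi - lo)

# Low (0,0,0) to high (0,0,1): a point at z = 0.5 gets the mid-range scalar.
print(elevation_scalar((0.3, 0.2, 0.5), (0.0, 0.0, 0.0), (0.0, 0.0, 1.0)))  # 0.5
```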


Environment Annotation

Annotates the output with the user name, date/time, operating system, and optionally the filename. Apply this filter to any source; the GUI allows manual selection of the desired annotation options. If the source is a file, the filter can display its filename.


Property Description Default Value(s) Restrictions
Input (Input)

Set the input of the filter.

Accepts input of following types:

  • vtkDataObject
DisplayUserName (DisplayUserName)

Toggle User Name Visibility.

0

Accepts boolean values (0 or 1).

DisplaySystemName (DisplaySystemName)

Toggle System Name Visibility.

0

Accepts boolean values (0 or 1).

DisplayDate (DisplayDate)

Toggle Date/Time Visibility.

0

Accepts boolean values (0 or 1).

DisplayFileName (DisplayFileName)

Toggle File Name Visibility.

0

Accepts boolean values (0 or 1).

FileName (FileName)

Annotation of file name.


Extract AMR Blocks

This filter extracts a list of datasets from hierarchical datasets.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Extract Datasets filter.

Accepts input of following types:

  • vtkUniformGridAMR
SelectedDataSets (SelectedDataSets)

This property provides a list of datasets to extract.


Extract Attributes

Extract attribute data as a table. This is a filter that produces a vtkTable from the chosen attribute in the input data object. This filter can accept composite datasets. If the input is a composite dataset, the output is a multiblock with vtkTable leaves.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input of the filter.

Accepts input of following types:

  • vtkDataObject
FieldAssociation (FieldAssociation)

Select the attribute data to pass.

0

The value(s) is an enumeration of the following:

  • Points (0)
  • Cells (1)
  • Field Data (2)
  • Vertices (4)
  • Edges (5)
  • Rows (6)
AddMetaData (AddMetaData)

This filter can add additional meta-data to the field data, such as point coordinates (when point attributes are selected and the input is a pointset) or structured coordinates. To enable this addition of extra information, turn this flag on. Off by default.

0

Accepts boolean values (0 or 1).

Extract Bag Plots

Extract Bag Plots.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the filter.

Accepts input of following types:

  • vtkTable

The dataset must contain a field array (row)

with 1 component(s).

Variables of Interest (SelectArrays)

Choose arrays whose entries will be used to form observations for statistical analysis.

Process the transposed of the input table (TransposeTable)

This flag indicates whether the input table must be transposed first.

1

Accepts boolean values (0 or 1).

RobustPCA (RobustPCA)

This flag indicates if the PCA should be run in robust mode or not.

0

Accepts boolean values (0 or 1).

HDR smoothing parameter (Sigma)

Specify the smoothing parameter of the HDR.

1

GridSize (GridSize)

Width and height of the grid image to perform the PCA on.

100


Extract Block

This filter extracts a range of blocks from a multiblock dataset.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Extract Block filter.

Accepts input of following types:

  • vtkMultiBlockDataSet
BlockIndices (BlockIndices)

This property lists the ids of the blocks to extract from the input multiblock dataset.

PruneOutput (PruneOutput)

When set, the output multiblock dataset will be pruned to remove empty nodes. On by default.

1

Accepts boolean values (0 or 1).

MaintainStructure (MaintainStructure)

This is used only when PruneOutput is ON. By default, when pruning the output (i.e., removing empty blocks), if a node has only one non-null child block, that node is removed. To preserve these parent nodes, set this flag to true.

0

Accepts boolean values (0 or 1).

Extract CTH Parts

Create a surface from a CTH volume fraction.Extract CTH Parts is a specialized filter for visualizing the data from a CTH simulation. It first converts the selected cell-centered arrays to point-centered ones. It then contours each array at a value of 0.5. The user has the option of clipping the resulting surface(s) with a plane. This filter only operates on unstructured data. It produces polygonal output.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Extract CTH Parts filter.

Accepts input of following types:

  • vtkDataSet

The dataset must contain a field array (cell)

with 1 component(s).

Clip Type (ClipPlane)

This property specifies whether to clip the dataset, and if so, it also specifies the parameters of the plane with which to clip.

The value can be one of the following:

  • None (implicit_functions)
  • Plane (implicit_functions)
Volume Arrays (VolumeArrays)

This property specifies the name(s) of the volume fraction array(s) for generating parts.

An array of scalars is required.

Volume Fraction Value (VolumeFractionSurfaceValue)

The value of this property is the volume fraction value for the surface.

0.1

CapSurfaces (CapSurfaces)

When enabled, volume surfaces are capped to produce a visually closed surface.

1

Accepts boolean values (0 or 1).

RemoveGhostCells (RemoveGhostCells)

When set to false, the output surfaces will not hide contours extracted from ghost cells; this results in overlapping contours but avoids holes. Defaults to true.

1

Accepts boolean values (0 or 1).

GenerateTriangles (GenerateTriangles)

Triangulate results. When set to false, the internal cut and contour filters are told not to triangulate results if possible.

0

Accepts boolean values (0 or 1).

Extract Cells By Region

This filter extracts cells that are inside/outside a region or at a region boundary. This filter extracts from its input dataset all cells that are either completely inside or outside of a specified region (implicit function). On output, the filter generates an unstructured grid. To use this filter you must specify a region (implicit function). You must also specify whether to extract cells lying inside or outside of the region. An option exists to extract cells that are neither inside nor outside (i.e., boundary).

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Extract Cells By Region filter.

Accepts input of following types:

  • vtkDataSet
Intersect With (ImplicitFunction)

This property sets the region used to extract cells.

The value can be one of the following:

  • Plane (implicit_functions)
  • Box (implicit_functions)
  • Sphere (implicit_functions)
InputBounds (InputBounds)
Extraction Side (ExtractInside)

This parameter controls whether to extract cells that are inside or outside the region.

1

The value(s) is an enumeration of the following:

  • outside (0)
  • inside (1)
Extract only intersected (Extract only intersected)

This parameter controls whether to extract only cells that are on the boundary of the region. If this parameter is set, the Extraction Side parameter is ignored. If Extract Intersected is off, this parameter has no effect.

0

Accepts boolean values (0 or 1).

Extract intersected (Extract intersected)

This parameter controls whether to extract cells that are on the boundary of the region.

0

Accepts boolean values (0 or 1).
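The inside/outside/boundary decision described above reduces to evaluating the implicit function at each of a cell's points; a plain-Python sketch using a sphere region (hypothetical helper, not the ParaView API):

```python
def classify_cell(points, center, radius):
    """Classify a cell against a sphere implicit function: 'inside' if
    every cell point is inside, 'outside' if every point is outside,
    otherwise 'boundary' (the cell straddles the region surface)."""
    def f(p):  # implicit function value: <= 0 inside, > 0 outside
        return sum((a - b) ** 2 for a, b in zip(p, center)) - radius ** 2
    inside = [f(p) <= 0 for p in points]
    if all(inside):
        return "inside"
    if not any(inside):
        return "outside"
    return "boundary"

sphere = ((0.0, 0.0, 0.0), 1.0)
print(classify_cell([(0.0, 0.0, 0.0), (0.5, 0.0, 0.0)], *sphere))  # inside
print(classify_cell([(0.5, 0.0, 0.0), (2.0, 0.0, 0.0)], *sphere))  # boundary
```

The Extraction Side setting keeps one of the first two classes; the intersected options keep the "boundary" class.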

Extract Component

This filter extracts a component of a multi-component attribute array.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input of the Extract Component filter.

Accepts input of following types:

  • vtkDataSet

The dataset must contain a field array ()

Input Array (SelectInputArray)

This property indicates the name of the array to be extracted.

The value must be field array name.

Component (Component)

This property indicates the component of the array to be extracted.

0

Output Array Name (OutputArrayName)

This property indicates the name of the output scalar array.

Result
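In spirit, this filter does nothing more than the following sketch (plain Python, not ParaView's API; the array contents and names are made up for illustration):

```python
def extract_component(array, component, name="Result"):
    """Pull one component out of a multi-component attribute array."""
    return name, [tup[component] for tup in array]

# Hypothetical 3-component velocity array; extract component 0 (x).
velocity = [(1.0, 2.0, 3.0), (4.0, 5.0, 6.0)]
name, vx = extract_component(velocity, 0)
print(name, vx)  # Result [1.0, 4.0]
```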


Extract Edges

Extract edges of 2D and 3D cells as lines. The Extract Edges filter produces a wireframe version of the input dataset by extracting all the edges of the dataset's cells as lines. This filter operates on any type of data set and produces polygonal output.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Extract Edges filter.

Accepts input of following types:

  • vtkDataSet

Extract Generic Dataset Surface

Extract geometry from a higher-order dataset.

Property Description Default Value(s) Restrictions
Input (Input)

Set the input to the Generic Geometry Filter.

Accepts input of following types:

  • vtkGenericDataSet
PassThroughCellIds (PassThroughCellIds)

Select whether to forward original ids.

1

Accepts boolean values (0 or 1).

Extract Level

This filter extracts a range of levels from a hierarchical dataset.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Extract Group filter.

Accepts input of following types:

  • vtkUniformGridAMR
Levels (Levels)

This property lists the levels to extract from the input hierarchical dataset.


Extract Location

Sample or extract cells at a point. This filter allows you to specify a location and then either interpolate the data attributes from the input dataset at that location or extract the cell(s) at the location.


Property Description Default Value(s) Restrictions
Input (Input)

Set the input dataset producer

Accepts input of following types:

  • vtkDataSet
  • vtkCompositeDataSet

The dataset must contain a field array ()

Mode (Mode)

Select whether to interpolate (probe) data attributes at the specified location, or to extract cell(s) containing the specified location.

1

The value(s) is an enumeration of the following:

  • Interpolate At Location (0)
  • Extract Cell At Location (1)
Location (Location)

Select the location of interest in 3D space.

0.0 0.0 0.0

The value must lie within the bounding box of the dataset.

It defaults to the midpoint of the bounding box in each dimension.


Extract Region Surface

Extract a 2D boundary surface using neighbor relations to eliminate internal faces. The Extract Surface filter extracts the polygons forming the outer surface of the input dataset. This filter operates on any type of data and produces polygonal data as output.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Extract Surface filter.

Accepts input of following types:

  • vtkDataSet
PieceInvariant (PieceInvariant)

If the value of this property is set to 1, internal surfaces along process boundaries will be removed. NOTE: Enabling this option might cause multiple executions of the data source because more information is needed to remove internal surfaces.

1

Accepts boolean values (0 or 1).

NonlinearSubdivisionLevel (NonlinearSubdivisionLevel)

If the input is an unstructured grid with nonlinear faces, this parameter determines how many times the face is subdivided into linear faces. If 0, the output is the equivalent of its linear counterpart (and the midpoints determining the nonlinear interpolation are discarded). If 1, the nonlinear face is triangulated based on the midpoints. If greater than 1, the triangulated pieces are recursively subdivided to reach the desired subdivision. Setting the value to greater than 1 may cause some point data to not be passed even if no quadratic faces exist. This option has no effect if the input is not an unstructured grid.

1

RegionArrayName (RegionArrayName)

This property specifies the name of the material array for generating parts.

material

SingleSided (SingleSided)

If the value of this property is set to 1 (the default), surfaces along the boundary are 1 layer thick. Otherwise there is a surface for the material on each side.

1

Accepts boolean values (0 or 1).

MaterialPropertiesName (MaterialPropertiesName)

This is the name of the input material property field data array.

material_properties

MaterialIDsName (MaterialIDsName)

This is the name of the input and output material id field data array.

material_ids

MaterialPIDsName (MaterialPIDsName)

This is the name of the output material ancestry id field data array.

material_ancestors

InterfaceIDsName (InterfaceIDsName)

This is the name of the input and output interface id field data array.

interface_ids


Extract Selection

Extract different types of selections. This filter extracts a set of cells/points given a selection. The selection can be obtained from a rubber-band selection (either cell, visible or in a frustum) or threshold selection and passed to the filter or specified by providing an ID list.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input from which the selection is extracted.

Accepts input of following types:

  • vtkDataSet
  • vtkTable
Selection (Selection)

The input that provides the selection object.

Accepts input of following types:

  • vtkSelection
PreserveTopology (PreserveTopology)

If this property is set to 1 the output preserves the topology of its input and adds an insidedness array to mark which cells are inside or out. If 0 then the output is an unstructured grid which contains only the subset of cells that are inside.

0

Accepts boolean values (0 or 1).

ShowBounds (ShowBounds)

For frustum selection, if this property is set to 1 the output is the outline of the frustum instead of the contents of the input that lie within the frustum.

0

Accepts boolean values (0 or 1).

Extract Selection (internal)

This filter extracts a given set of cells or points given a selection. The selection can be obtained from a rubber-band selection (either point, cell, visible or in a frustum) and passed to the filter or specified by providing an ID list. This is an internal filter, use "ExtractSelection" instead.


Property Description Default Value(s) Restrictions
Input (Input)

The input from which the selection is extracted.

Accepts input of following types:

  • vtkDataSet
Selection (Selection)

The input that provides the selection object.

Accepts input of following types:

  • vtkSelection

Extract Subset

Extract a subgrid from a structured grid with the option of setting subsample strides. The Extract Grid filter returns a subgrid of a structured input data set (uniform rectilinear, curvilinear, or nonuniform rectilinear). The output data set type of this filter is the same as the input type.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Extract Grid filter.

Accepts input of following types:

  • vtkImageData
  • vtkRectilinearGrid
  • vtkStructuredPoints
  • vtkStructuredGrid
VOI (VOI)

This property specifies the minimum and maximum point indices along each of the I, J, and K axes; these values indicate the volume of interest (VOI). The output will have the (I,J,K) extent specified here.

0 0 0 0 0 0

The value(s) must lie within the structured-extents of the input dataset.

SampleRateI (SampleRateI)

This property indicates the sampling rate in the I dimension. A value greater than 1 results in subsampling; every nth index will be included in the output.

1

SampleRateJ (SampleRateJ)

This property indicates the sampling rate in the J dimension. A value greater than 1 results in subsampling; every nth index will be included in the output.

1

SampleRateK (SampleRateK)

This property indicates the sampling rate in the K dimension. A value greater than 1 results in subsampling; every nth index will be included in the output.

1

IncludeBoundary (IncludeBoundary)

If the value of this property is 1, then if the sample rate in any dimension is greater than 1, the boundary indices of the input dataset will be passed to the output even if the boundary extent is not an even multiple of the sample rate in a given dimension.

0

Accepts boolean values (0 or 1).
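The interaction of the VOI bounds, a per-axis sample rate, and IncludeBoundary can be modeled along a single axis with a short sketch (illustrative only; the VTK implementation handles all three axes and extents together):

```python
def kept_indices(lo, hi, stride, include_boundary=False):
    """Structured indices kept along one axis for a VOI [lo, hi] with a sample stride."""
    idx = list(range(lo, hi + 1, stride))
    if include_boundary and idx[-1] != hi:
        idx.append(hi)  # pass the boundary index even when it is off-stride
    return idx

print(kept_indices(0, 10, 3))                         # [0, 3, 6, 9]
print(kept_indices(0, 10, 3, include_boundary=True))  # [0, 3, 6, 9, 10]
```

With a stride of 1 (the default for all three sample rates) every index in the VOI is kept and IncludeBoundary has no effect.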

Extract Surface

Extract a 2D boundary surface using neighbor relations to eliminate internal faces. The Extract Surface filter extracts the polygons forming the outer surface of the input dataset. This filter operates on any type of data and produces polygonal data as output.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Extract Surface filter.

Accepts input of following types:

  • vtkDataSet
PieceInvariant (PieceInvariant)

If the value of this property is set to 1, internal surfaces along process boundaries will be removed. NOTE: Enabling this option might cause multiple executions of the data source because more information is needed to remove internal surfaces.

1

Accepts boolean values (0 or 1).

NonlinearSubdivisionLevel (NonlinearSubdivisionLevel)

If the input is an unstructured grid with nonlinear faces, this parameter determines how many times the face is subdivided into linear faces. If 0, the output is the equivalent of its linear counterpart (and the midpoints determining the nonlinear interpolation are discarded). If 1, the nonlinear face is triangulated based on the midpoints. If greater than 1, the triangulated pieces are recursively subdivided to reach the desired subdivision. Setting the value to greater than 1 may cause some point data to not be passed even if no quadratic faces exist. This option has no effect if the input is not an unstructured grid.

1


FFT Of Selection Over Time

Extracts the data of a selection (e.g. points or cells) over time, takes the FFT of it, and plots the result.

Property Description Default Value(s) Restrictions


Feature Edges

This filter will extract edges along sharp edges of surfaces or boundaries of surfaces. The Feature Edges filter extracts various subsets of edges from the input data set. This filter operates on polygonal data and produces polygonal output.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Feature Edges filter.

Accepts input of following types:

  • vtkPolyData
BoundaryEdges (BoundaryEdges)

If the value of this property is set to 1, boundary edges will be extracted. Boundary edges are defined as line cells or edges that are used by only one polygon.

1

Accepts boolean values (0 or 1).

FeatureEdges (FeatureEdges)

If the value of this property is set to 1, feature edges will be extracted. Feature edges are defined as edges that are used by two polygons whose dihedral angle is greater than the feature angle. (See the FeatureAngle property.)

1

Accepts boolean values (0 or 1).

Non-Manifold Edges (NonManifoldEdges)

If the value of this property is set to 1, non-manifold edges will be extracted. Non-manifold edges are defined as edges that are used by three or more polygons.

1

Accepts boolean values (0 or 1).

ManifoldEdges (ManifoldEdges)

If the value of this property is set to 1, manifold edges will be extracted. Manifold edges are defined as edges that are used by exactly two polygons.

0

Accepts boolean values (0 or 1).

Coloring (Coloring)

If the value of this property is set to 1, then the extracted edges are assigned a scalar value based on the type of the edge.

0

Accepts boolean values (0 or 1).

FeatureAngle (FeatureAngle)

The value of this property is used to define a feature edge. If the angle between the surface normals of two adjacent triangles is at least as large as this Feature Angle, a feature edge exists. (See the FeatureEdges property.)

30.0
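The feature-angle test above can be sketched directly from the definition: compare the angle between the unit normals of the two polygons sharing an edge against the threshold. This is an illustration of the criterion, not the vtkFeatureEdges implementation:

```python
import math

def is_feature_edge(n1, n2, feature_angle_deg=30.0):
    """True when the angle between two adjacent unit face normals
    meets or exceeds the feature angle (default 30 degrees)."""
    dot = sum(a * b for a, b in zip(n1, n2))
    dot = max(-1.0, min(1.0, dot))  # clamp against floating-point drift
    return math.degrees(math.acos(dot)) >= feature_angle_deg

print(is_feature_edge((0, 0, 1), (0, 0, 1)))  # False: coplanar faces
print(is_feature_edge((0, 0, 1), (0, 1, 0)))  # True: 90-degree crease
```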


FlattenFilter

Property Description Default Value(s) Restrictions
Input (Input)

Set the input to the Flatten Filter.

Accepts input of following types:

  • vtkPointSet
  • vtkGraph
  • vtkCompositeDataSet

Gaussian Resampling

Splat points into a volume with an elliptical, Gaussian distribution. vtkGaussianSplatter is a filter that injects input points into a structured points (volume) dataset. As each point is injected, it "splats" or distributes values to nearby voxels. Data is distributed using an elliptical, Gaussian distribution function. The distribution function is modified using scalar values (expands distribution) or normals (creates ellipsoidal distribution rather than spherical). Warning: results may be incorrect in parallel as points can't splat into other processors' cells.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the filter.

Accepts input of following types:

  • vtkDataSet

The dataset must contain a field array (point)

with 1 component(s).

Resample Field (SelectInputScalars)

Choose a scalar array to splat into the output cells. If ignore arrays is chosen, point density will be counted instead.

An array of scalars is required. The value must be a field array name.

Resampling Grid (SampleDimensions)

Set / get the dimensions of the sampling structured point set. Higher values produce better results but are much slower.

50 50 50

Extent to Resample (ModelBounds)

Set / get the (xmin,xmax, ymin,ymax, zmin,zmax) bounding box in which the sampling is performed. If any of the (min,max) bounds values are min >= max, then the bounds will be computed automatically from the input data. Otherwise, the user-specified bounds will be used.

0.0 0.0 0.0 0.0 0.0 0.0

Gaussian Splat Radius (Radius)

Set / get the radius of propagation of the splat. This value is expressed as a percentage of the length of the longest side of the sampling volume. Smaller numbers greatly reduce execution time.

0.1

Gaussian Exponent Factor (ExponentFactor)

Set / get the sharpness of decay of the splats. This is the exponent constant in the Gaussian equation. Normally this is a negative value.

-5.0

Scale Splats (ScalarWarping)

Turn on/off the scaling of splats by scalar value.

1

Accepts boolean values (0 or 1).

Scale Factor (ScaleFactor)

Multiply Gaussian splat distribution by this value. If ScalarWarping is on, then the Scalar value will be multiplied by the ScaleFactor times the Gaussian function.

1.0

Elliptical Splats (NormalWarping)

Turn on/off the generation of elliptical splats. If normal warping is on, then the input normals affect the distribution of the splat. This boolean is used in combination with the Eccentricity ivar.

1

Accepts boolean values (0 or 1).

Elliptical Eccentricity (Eccentricity)

Control the shape of elliptical splatting. Eccentricity is the ratio of the major axis (aligned along the normal) to the minor axes (aligned along the other two axes). Eccentricity &gt; 1 creates needles with the long axis in the direction of the normal; Eccentricity &lt; 1 creates pancakes perpendicular to the normal vector.

2.5

Fill Volume Boundary (Capping)

Turn on/off the capping of the outer boundary of the volume to a specified cap value. This can be used to close surfaces (after iso-surfacing) and create other effects.

1

Accepts boolean values (0 or 1).

Fill Value (CapValue)

Specify the cap value to use. (This instance variable only has effect if the ivar Capping is on.)

0.0

Splat Accumulation Mode (Accumulation Mode)

Specify the scalar accumulation mode. This mode expresses how scalar values are combined when splats are overlapped. The Max mode acts like a set union operation and is the most commonly used; the Min mode acts like a set intersection, and the sum is just weird.

1

The value(s) is an enumeration of the following:

  • Min (0)
  • Max (1)
  • Sum (2)
Empty Cell Value (NullValue)

Set the Null value for output points not receiving a contribution from the input points. (This is the initial value of the voxel samples.)

0.0
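The properties above fit together as a simple kernel-plus-accumulation scheme. The sketch below assumes the basic Gaussian form value = ScaleFactor · exp(ExponentFactor · (r/Radius)²) with optional scalar warping; it is an illustration of the parameters, not the full vtkGaussianSplatter implementation (which also handles elliptical warping and capping):

```python
import math

def splat_value(distance, radius, exponent_factor=-5.0,
                scale_factor=1.0, scalar=1.0, scalar_warping=True):
    """Gaussian splat contribution at a voxel `distance` away from a point.
    Radius is the cutoff; ExponentFactor sets decay sharpness (normally negative)."""
    if distance > radius:
        return None  # outside the radius of propagation: no contribution
    v = scale_factor * math.exp(exponent_factor * (distance / radius) ** 2)
    return scalar * v if scalar_warping else v

def accumulate(existing, contribution, mode="Max"):
    """Combine overlapping splats according to the accumulation mode."""
    if mode == "Max":
        return max(existing, contribution)
    if mode == "Min":
        return min(existing, contribution)
    return existing + contribution  # "Sum"

print(splat_value(0.0, 1.0))             # 1.0 at the splat center
print(accumulate(0.4, 0.9, mode="Max"))  # 0.9
```

Voxels that receive no contribution keep the Empty Cell Value (NullValue) they were initialized with.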


Generate Ids

Generate scalars from point and cell ids. This filter generates scalars using cell and point ids. That is, the point attribute data scalars are generated from the point ids, and the cell attribute data scalars or field data are generated from the cell ids.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Cell Data to Point Data filter.

Accepts input of following types:

  • vtkDataSet
ArrayName (ArrayName)

The name of the array that will contain ids.

Ids


Generate Quadrature Points

Create a point set with data at quadrature points.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input of the filter.

Accepts input of following types:

  • vtkUnstructuredGrid

The dataset must contain a field array (cell)

Quadrature Scheme Def (QuadratureSchemeDefinition)

Specifies the offset array from which we generate quadrature points.

An array of scalars is required.

Generate Quadrature Scheme Dictionary

Generate quadrature scheme dictionaries in data sets that do not have them.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input of the filter.

Accepts input of following types:

  • vtkUnstructuredGrid

Generate Surface Normals

This filter will produce surface normals used for smooth shading. Splitting is used to avoid smoothing across feature edges.This filter generates surface normals at the points of the input polygonal dataset to provide smooth shading of the dataset. The resulting dataset is also polygonal. The filter works by calculating a normal vector for each polygon in the dataset and then averaging the normals at the shared points.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Normals Generation filter.

Accepts input of following types:

  • vtkPolyData
FeatureAngle (FeatureAngle)

The value of this property defines a feature edge. If the surface normal between two adjacent triangles is at least as large as this Feature Angle, a feature edge exists. If Splitting is on, points are duplicated along these feature edges. (See the Splitting property.)

30

Splitting (Splitting)

This property controls the splitting of sharp edges. If sharp edges are split (property value = 1), then points are duplicated along these edges, and separate normals are computed for both sets of points to give crisp (rendered) surface definition.

1

Accepts boolean values (0 or 1).

Consistency (Consistency)

The value of this property controls whether consistent polygon ordering is enforced. Generally the normals for a data set should either all point inward or all point outward. If the value of this property is 1, then this filter will reorder the points of cells whose normal vectors are oriented in the opposite direction from the rest of those in the data set.

1

Accepts boolean values (0 or 1).

FlipNormals (FlipNormals)

If the value of this property is 1, this filter will reverse the normal direction (and reorder the points accordingly) for all polygons in the data set; this changes front-facing polygons to back-facing ones, and vice versa. You might want to do this if your viewing position will be inside the data set instead of outside of it.

0

Accepts boolean values (0 or 1).

Non-Manifold Traversal (NonManifoldTraversal)

Turn on/off traversal across non-manifold edges. Not traversing non-manifold edges will prevent problems where the consistency of polygonal ordering is corrupted due to topological loops.

1

Accepts boolean values (0 or 1).

ComputeCellNormals (ComputeCellNormals)

This filter computes the normals at the points in the data set. In the process of doing this it computes polygon normals too. If you want these normals to be passed to the output of this filter, set the value of this property to 1.

0

Accepts boolean values (0 or 1).

PieceInvariant (PieceInvariant)

Turn this option on to produce the same results regardless of the number of processors used (i.e., avoid seams along processor boundaries). Turn this off if you do want to process ghost levels and do not mind seams.

1

Accepts boolean values (0 or 1).
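The core of the description above, a normal per polygon averaged at shared points, can be sketched as follows. This illustrates only the averaging step; splitting, consistency enforcement, and flipping are omitted:

```python
def face_normal(p0, p1, p2):
    """Unit normal of a triangle via the cross product of two edge vectors."""
    u = [p1[i] - p0[i] for i in range(3)]
    v = [p2[i] - p0[i] for i in range(3)]
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    length = sum(c * c for c in n) ** 0.5
    return [c / length for c in n]

def point_normals(points, triangles):
    """Sum the normals of the faces sharing each point, then renormalize."""
    acc = [[0.0, 0.0, 0.0] for _ in points]
    for tri in triangles:
        n = face_normal(*(points[i] for i in tri))
        for i in tri:
            for k in range(3):
                acc[i][k] += n[k]
    out = []
    for n in acc:
        length = sum(c * c for c in n) ** 0.5 or 1.0
        out.append([c / length for c in n])
    return out

# Two coplanar triangles in the z=0 plane: every point normal is (0, 0, 1).
pts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)]
tris = [(0, 1, 2), (1, 3, 2)]
print(point_normals(pts, tris))
```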

GeometryFilter

Property Description Default Value(s) Restrictions
Input (Input)

Set the input to the Geometry Filter.

UseStrips (UseStrips)

Toggle whether to generate faces containing triangle strips. This should render faster and use less memory, but no cell data is copied.

0

Accepts boolean values (0 or 1).

ForceStrips (ForceStrips)

This makes UseStrips call Modified() after changing its setting to ensure that the filter's output is immediately changed.

0

Accepts boolean values (0 or 1).

UseOutline (UseOutline)

Toggle whether to generate an outline or a surface.

0

Accepts boolean values (0 or 1).

NonlinearSubdivisionLevel (NonlinearSubdivisionLevel)

Nonlinear faces are approximated with flat polygons. This parameter controls how many times to subdivide nonlinear surface cells. Higher subdivisions generate closer approximations but take more memory and rendering time. Subdivision is recursive, so the number of output polygons can grow exponentially with this parameter.

1

PassThroughIds (PassThroughIds)

If on, the output polygonal dataset will have a celldata array that holds the cell index of the original 3D cell that produced each output cell. This is useful for cell picking.

1

Accepts boolean values (0 or 1).

PassThroughPointIds (PassThroughPointIds)

If on, the output polygonal dataset will have a pointdata array that holds the point index of the original 3D vertex that produced each output vertex. This is useful for picking.

1

Accepts boolean values (0 or 1).

Ghost Cells Generator

Generate ghost cells for unstructured grids. The GhostCellGenerator filter is available when ParaView is run in parallel (i.e. with MPI). It operates on unstructured grids only. This filter does not redistribute the input data; it only generates ghost cells at processor boundaries by fetching topological and geometrical information of those cells from neighbor ranks. The filter can take advantage of global point ids if they are available; if so it will perform faster, otherwise point coordinates will be exchanged and processed.


Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the ghost cells generator.

Accepts input of following types:

  • vtkUnstructuredGrid
BuildIfRequired (BuildIfRequired)

Specify if the filter must generate the ghost cells only if required by the pipeline downstream.

1

Accepts boolean values (0 or 1).

UseGlobalIds (UseGlobalIds)

Specify whether the filter should take advantage of global point ids if they exist, or use point coordinates instead.

1

Accepts boolean values (0 or 1).

GlobalPointIdsArrayName (GlobalPointIdsArrayName)

This property provides the name of the input array containing the global point ids, used when the GlobalIds array of the point data is not set. Default is GlobalNodeIds.

GlobalNodeIds


Glyph

This filter produces a glyph at each point of the input data set. The glyphs can be oriented and scaled by point attributes of the input dataset. The Glyph filter generates a glyph (i.e., an arrow, cone, cube, cylinder, line, sphere, or 2D glyph) at each point in the input dataset. The glyphs can be oriented and scaled by the input point-centered scalars and vectors. The Glyph filter operates on any type of data set. Its output is polygonal.

To use this filter, you first select the arrays to use as the **Scalars** and **Vectors**, if any. To orient the glyphs using the selected **Vectors**, use the **Orient** property. To scale the glyphs using the selected **Scalars** or **Vectors**, use the **Scale Mode** property.

The **Glyph Mode** property controls which points in the input dataset are selected for glyphing (since in most cases, glyphing all points in the input dataset can be both performance impeding as well as visually cluttered).


Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Glyph filter. This is the dataset from which the points are selected to be glyphed.

Accepts input of following types:

  • vtkDataSet

The dataset must contain a field array ()

with 1 component(s).

The dataset must contain a field array ()

with 3 component(s).

Glyph Type (Source)

This property determines which type of glyph will be placed at the points in the input dataset.

Accepts input of following types:

  • vtkPolyData

The value can be one of the following:
  • ArrowSource (sources)
  • ConeSource (sources)
  • CubeSource (sources)
  • CylinderSource (sources)
  • LineSource (sources)
  • SphereSource (sources)
  • GlyphSource2D (sources)
Scalars (Scalars)

Select the input array to be treated as the active **Scalars**. You can scale the glyphs using the selected scalars by setting the **Scale Mode** property to **scalar**.

0

An array of scalars is required. The value must be a field array name.

Vectors (Vectors)

Select the input array to be treated as the active **Vectors**. You can scale the glyphs using the selected vectors by setting the **Scale Mode** property to **vector** or **vector_components**. You can orient the glyphs using the selected vectors by checking the **Orient** property.

1

An array of vectors is required. The value must be a field array name.

Orient (Orient)

If this property is set to 1, the glyphs will be oriented based on the vectors selected using the **Vectors** property.

1

Accepts boolean values (0 or 1).

ScaleMode (ScaleMode)

Select how to scale the glyphs. Set to **off** to disable scaling entirely. Set to **scalar** to scale the glyphs using the array selected using the **Scalars** property. Set to **vector** to scale the glyphs using the magnitude of the array selected using the **Vectors** property. Set to **vector_components** to scale using the **Vectors**, scaling each component individually.

3

The value(s) is an enumeration of the following:

  • scalar (0)
  • vector (1)
  • vector_components (2)
  • off (3)
ScaleFactor (ScaleFactor)

Specify the constant multiplier to use to scale the glyphs.

1.0

The value must lie within the range of the selected data array. The value must be less than the largest dimension of the dataset multiplied by a scale factor of 0.1.

GlyphMode (GlyphMode)

This property indicates the mode that will be used to generate glyphs from the dataset.

2

The value(s) is an enumeration of the following:

  • All Points (0)
  • Every Nth Point (1)
  • Uniform Spatial Distribution (2)
MaximumNumberOfSamplePoints (MaximumNumberOfSamplePoints)

This property specifies the maximum number of sample points to use when sampling the space when Uniform Spatial Distribution is used.

5000

Seed (Seed)

This property specifies the seed that will be used for generating a uniform distribution of glyph points when a Uniform Spatial Distribution is used.

10339

Stride (Stride)

This property specifies the stride that will be used when glyphing by Every Nth Point.

1

GlyphTransform (GlyphTransform)

The values in this property allow you to specify the transform (translation, rotation, and scaling) to apply to the glyph source.

The value can be one of the following:

  • Transform2 (extended_sources)
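The four Scale Mode settings can be summarized with a small sketch of the per-glyph scale they produce (illustrative logic only, not ParaView's API; the tuple return models a per-axis scale):

```python
def glyph_scale(scale_mode, scale_factor, scalar=None, vector=None):
    """Per-glyph scale under the off/scalar/vector/vector_components modes."""
    if scale_mode == "off":
        return (scale_factor,) * 3  # uniform: ScaleFactor only
    if scale_mode == "scalar":
        return (scale_factor * scalar,) * 3  # scaled by the selected scalar
    if scale_mode == "vector":
        mag = sum(c * c for c in vector) ** 0.5  # scaled by vector magnitude
        return (scale_factor * mag,) * 3
    if scale_mode == "vector_components":
        return tuple(scale_factor * c for c in vector)  # per-component scale
    raise ValueError(scale_mode)

print(glyph_scale("vector", 2.0, vector=(3.0, 4.0, 0.0)))  # (10.0, 10.0, 10.0)
```

The same logic applies to the Glyph With Custom Source filter below, which differs only in where the glyph geometry comes from.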


Glyph With Custom Source

This filter generates a glyph at each point of the input data set. The glyphs can be oriented and scaled by point attributes of the input dataset. The Glyph filter generates a glyph at each point in the input dataset. The glyphs can be oriented and scaled by the input point-centered scalars and vectors. The Glyph filter operates on any type of data set. Its output is polygonal. This filter is available on the Toolbar.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Glyph filter. This is the dataset from which the points are selected to be glyphed.

Accepts input of following types:

  • vtkDataSet

The dataset must contain a field array (point)

with 1 component(s).

The dataset must contain a field array (point)

with 3 component(s).

Glyph Type (Source)

This property determines which type of glyph will be placed at the points in the input dataset.

Accepts input of following types:

  • vtkPolyData
Scalars (Scalars)

Select the input array to be treated as the active "Scalars". You can scale the glyphs using the selected scalars by setting the "Scale Mode" property to "scalar".

An array of scalars is required.

Vectors (Vectors)

Select the input array to be treated as the active "Vectors". You can scale the glyphs using the selected vectors by setting the "Scale Mode" property to "vector" or "vector_components". You can orient the glyphs using the selected vectors by checking the "Orient" property.

1

An array of vectors is required.

Orient (Orient)

If this property is set to 1, the glyphs will be oriented based on the vectors selected using the "Vectors" property.

1

Accepts boolean values (0 or 1).

ScaleMode (ScaleMode)

Select how to scale the glyphs. Set to "off" to disable scaling entirely. Set to "scalar" to scale the glyphs using the array selected using the "Scalars" property. Set to "vector" to scale the glyphs using the magnitude of the array selected using the "Vectors" property. Set to "vector_components" to scale using the "Vectors", scaling each component individually.

3

The value(s) is an enumeration of the following:

  • scalar (0)
  • vector (1)
  • vector_components (2)
  • off (3)
ScaleFactor (ScaleFactor)

Specify the constant multiplier to use to scale the glyphs.

1.0

The value must lie within the range of the selected data array. The value must be less than the largest dimension of the dataset multiplied by a scale factor of 0.1.

GlyphMode (GlyphMode)

This property indicates the mode that will be used to generate glyphs from the dataset.

2

The value(s) is an enumeration of the following:

  • All Points (0)
  • Every Nth Point (1)
  • Uniform Spatial Distribution (2)
MaximumNumberOfSamplePoints (MaximumNumberOfSamplePoints)

This property specifies the maximum number of sample points to use when sampling the space when Uniform Spatial Distribution is used.

5000

Seed (Seed)

This property specifies the seed that will be used for generating a uniform distribution of glyph points when a Uniform Spatial Distribution is used.

10339

Stride (Stride)

This property specifies the stride that will be used when glyphing by Every Nth Point.

1

GlyphTransform (GlyphTransform)

The values in this property allow you to specify the transform (translation, rotation, and scaling) to apply to the glyph source.

The value can be one of the following:

  • Transform2 (extended_sources)


Gradient

This filter computes gradient vectors for an image/volume. The Gradient filter computes the gradient vector at each point in an image or volume. This filter uses central differences to compute the gradients. The Gradient filter operates on uniform rectilinear (image) data and produces image data output.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Gradient filter.

Accepts input of following types:

  • vtkImageData

The dataset must contain a field array (point)

with 1 component(s).

SelectInputScalars (SelectInputScalars)

This property lists the name of the array from which to compute the gradient.

An array of scalars is required.

Dimensionality (Dimensionality)

This property indicates whether to compute the gradient in two dimensions or in three. If the gradient is being computed in two dimensions, the X and Y dimensions are used.

3

The value(s) is an enumeration of the following:

  • Two (2)
  • Three (3)
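The central-differences scheme can be sketched in one dimension as follows (plain Python; using one-sided differences at the boundaries is an assumption here, and ParaView's exact boundary treatment may differ):

```python
def gradient_1d(values, spacing=1.0):
    """Central differences in the interior, one-sided at the ends."""
    n = len(values)
    g = [0.0] * n
    for i in range(n):
        if i == 0:
            # Forward difference at the first sample.
            g[i] = (values[1] - values[0]) / spacing
        elif i == n - 1:
            # Backward difference at the last sample.
            g[i] = (values[-1] - values[-2]) / spacing
        else:
            # Central difference: (f[i+1] - f[i-1]) / (2h).
            g[i] = (values[i + 1] - values[i - 1]) / (2.0 * spacing)
    return g
```

For image data the same stencil is applied independently along each of the X, Y, and (in 3D) Z axes using the point spacing of the grid.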

Gradient Magnitude

Compute the magnitude of the gradient vectors for an image/volume. The Gradient Magnitude filter computes the magnitude of the gradient vector at each point in an image or volume. This filter operates on uniform rectilinear (image) data and produces image data output.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Gradient Magnitude filter.

Accepts input of following types:

  • vtkImageData

The dataset must contain a field array (point)

with 1 component(s).

Dimensionality (Dimensionality)

This property indicates whether to compute the gradient magnitude in two or three dimensions. If computing the gradient magnitude in 2D, the gradients in X and Y are used for computing the gradient magnitude.

3

The value(s) is an enumeration of the following:

  • Two (2)
  • Three (3)

Gradient Of Unstructured DataSet

Estimate the gradient for each point or cell in any type of dataset. The Gradient (Unstructured) filter estimates the gradient vector at each point or cell. It operates on any type of vtkDataSet, and the output is the same type as the input. If the dataset is a vtkImageData, use the Gradient filter instead; it will be more efficient for this type of dataset.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Gradient (Unstructured) filter.

Accepts input of following types:

  • vtkDataSet

The dataset must contain a field array ()

Scalar Array (SelectInputScalars)

This property lists the name of the scalar array from which to compute the gradient.

An array of scalars is required. The value must be a field array name.

ComputeGradient (ComputeGradient)

When this flag is on, the gradient filter will compute the gradient of the input array.

1

Accepts boolean values (0 or 1).

ResultArrayName (ResultArrayName)

This property provides a name for the output array containing the gradient vectors.

Gradients

FasterApproximation (FasterApproximation)

When this flag is on, the gradient filter will use a less accurate (but close) algorithm that performs fewer derivative calculations (and is therefore faster). The approximation introduces some smoothing of the output data and possible errors on the boundary. This parameter has no effect when computing the gradient of cell data or when the input grid is not a vtkUnstructuredGrid.

0

Accepts boolean values (0 or 1).

ComputeDivergence (ComputeDivergence)

When this flag is on, the gradient filter will compute the divergence of a 3 component array.

0

Accepts boolean values (0 or 1).

DivergenceArrayName (DivergenceArrayName)

This property provides a name for the output array containing the divergence vector.

Divergence

ComputeVorticity (ComputeVorticity)

When this flag is on, the gradient filter will compute the vorticity/curl of a 3 component array.

0

Accepts boolean values (0 or 1).

VorticityArrayName (VorticityArrayName)

This property provides a name for the output array containing the vorticity vector.

Vorticity

ComputeQCriterion (ComputeQCriterion)

When this flag is on, the gradient filter will compute the Q-criterion of a 3 component array.

0

Accepts boolean values (0 or 1).

QCriterionArrayName (QCriterionArrayName)

This property provides a name for the output array containing Q criterion.

Q-criterion
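The vorticity and Q-criterion options above follow the standard definitions in terms of the gradient tensor J of a 3-component array, where J[i][j] = dv_i/dx_j. A sketch of those definitions in plain Python (illustrative only, not ParaView's code; Q = ½(‖Ω‖² − ‖S‖²) with S and Ω the symmetric and antisymmetric parts of J):

```python
def vorticity(J):
    """Curl of the vector field: w = (dv_z/dy - dv_y/dz, ...)."""
    return (J[2][1] - J[1][2], J[0][2] - J[2][0], J[1][0] - J[0][1])

def q_criterion(J):
    """Q = 0.5 * (||Omega||^2 - ||S||^2), Frobenius norms of the
    rotation-rate (Omega) and strain-rate (S) tensors."""
    S = [[0.5 * (J[i][j] + J[j][i]) for j in range(3)] for i in range(3)]
    O = [[0.5 * (J[i][j] - J[j][i]) for j in range(3)] for i in range(3)]
    frob2 = lambda M: sum(M[i][j] ** 2 for i in range(3) for j in range(3))
    return 0.5 * (frob2(O) - frob2(S))
```

A positive Q marks regions where rotation dominates strain, which is why the Q-criterion is commonly used to identify vortices.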


Grid Connectivity

Mass properties of connected fragments for unstructured grids. This filter works on multiblock unstructured grid inputs and also works in parallel. It ignores any cells with a cell data Status value of 0. It performs connectivity on distinct fragments separately and then integrates attributes of the fragments.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input of the filter.

Accepts input of following types:

  • vtkUnstructuredGrid
  • vtkCompositeDataSet

Group Datasets

Group data sets. Groups multiple datasets to create a multiblock dataset.

Property Description Default Value(s) Restrictions
Input (Input)

This property indicates the inputs to the Group Datasets filter.

Accepts input of following types:

  • vtkDataObject

Histogram

Extract a histogram from field data.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Histogram filter.

Accepts input of following types:

  • vtkDataSet

The dataset must contain a field array ()

SelectInputArray (SelectInputArray)

This property indicates the name of the array from which to compute the histogram.

An array of scalars is required. The value must be a field array name.

BinCount (BinCount)

The value of this property specifies the number of bins for the histogram.

10

Component (Component)

The value of this property specifies the array component from which the histogram should be computed.

0

CalculateAverages (CalculateAverages)

This option controls whether the algorithm calculates averages of variables other than the primary variable that fall into each bin.

0

Accepts boolean values (0 or 1).

UseCustomBinRanges (UseCustomBinRanges)

When set to true, CustomBinRanges will be used instead of using the full range for the selected array. By default, set to false.

0

Accepts boolean values (0 or 1).

CustomBinRanges (CustomBinRanges)

Set custom bin ranges to use. These are used only when UseCustomBinRanges is set to true.

0.0 100.0

The value must lie within the range of the selected data array.
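The binning behavior, including the UseCustomBinRanges/CustomBinRanges pair, can be sketched as follows (plain Python; clamping values that fall exactly on the upper edge into the last bin is an assumption of this sketch):

```python
def histogram(values, bin_count=10, bin_range=None):
    """Count values into bin_count equal-width bins.

    bin_range mirrors UseCustomBinRanges: when None, the full range
    of the data is used; otherwise the supplied (lo, hi) pair is."""
    lo, hi = bin_range if bin_range else (min(values), max(values))
    width = (hi - lo) / bin_count
    counts = [0] * bin_count
    for v in values:
        if lo <= v <= hi:
            # Clamp v == hi into the last bin rather than overflowing.
            i = min(int((v - lo) / width), bin_count - 1)
            counts[i] += 1
    return counts
```

With custom ranges, values outside [lo, hi] are simply not counted, which is the practical difference from using the array's full range.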

Image Data To AMR

Converts certain images to AMR.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Image Data To AMR filter.

Accepts input of following types:

  • vtkImageData
Number of levels (NumberOfLevels)

This property specifies the number of levels in the AMR data structure.

2

Maximum Number of Blocks (MaximumNumberOfLevels)

This property specifies the maximum number of blocks in the output AMR data structure.

100

Refinement Ratio (RefinementRatio)

This property specifies the refinement ratio between levels.

2


Image Data To Uniform Grid

Create a uniform grid from image data using specified blanking arrays. Create a vtkUniformGrid from a vtkImageData by passing in arrays to be used for point and/or cell blanking. By default, values of 0 in the specified array will result in a point or cell being blanked. Use Reverse to switch this.


Property Description Default Value(s) Restrictions
Input (Input)

Accepts input of following types:

  • vtkImageData

The dataset must contain a field array ()

with 1 component(s).

SelectInputScalars (SelectInputScalars)

Specify the array to use for blanking.

An array of scalars is required. The value must be a field array name.

Reverse (Reverse)

Reverse the interpretation of the array values when deciding whether a point or cell is blanked.

0

Accepts boolean values (0 or 1).

Image Data to Point Set

Converts image data to a point set. The Image Data to Point Set filter takes an image data (uniform rectilinear grid) object and outputs an equivalent structured grid (which is a type of point set). This brings the data to a broader category of data storage but only adds a small amount of overhead. This filter can be helpful in applying filters that expect or manipulate point coordinates.

Property Description Default Value(s) Restrictions
Input (Input)

Accepts input of following types:

  • vtkImageData

Image Shrink

Reduce the size of an image/volume by subsampling. The Image Shrink filter reduces the size of an image/volume dataset by subsampling it (i.e., extracting every nth pixel/voxel in integer multiples). The subsampling rate can be set separately for each dimension of the image/volume.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Image Shrink filter.

Accepts input of following types:

  • vtkImageData
ShrinkFactors (ShrinkFactors)

The value of this property indicates the amount by which to shrink along each axis.

1 1 1

Averaging (Averaging)

If the value of this property is 1, an average of neighborhood scalar values will be used as the output scalar value for each output point. If its value is 0, only subsampling will be performed, and the original scalar values at the points will be retained.

1

Accepts boolean values (0 or 1).
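Subsampling with and without the Averaging option can be illustrated in one dimension (plain Python sketch, not ParaView's implementation; in 3D the same idea applies per axis with a neighborhood instead of a block):

```python
def shrink_1d(values, factor, averaging=True):
    """Keep every factor-th sample. With averaging on, each kept
    sample is replaced by the mean of its factor-wide block; with
    averaging off, the original sample value is passed through."""
    out = []
    for start in range(0, len(values) - factor + 1, factor):
        block = values[start:start + factor]
        out.append(sum(block) / len(block) if averaging else block[0])
    return out
```

Averaging acts as a crude low-pass filter, which reduces aliasing artifacts compared to plain subsampling.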

ImageResampling

Sample data attributes using a 3D image as probing mesh.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the dataset whose data will be probed

Accepts input of following types:

  • vtkDataSet
  • vtkCompositeDataSet
SamplingDimension (SamplingDimension)

The number of resampling points along each axis.

10 10 10

UseInputBounds (UseInputBounds)

Whether to use the input bounds or the custom sampling bounds.

1

Accepts boolean values (0 or 1).

CustomSamplingBounds (CustomSamplingBounds)

Custom probing bounds, used when UseInputBounds is off.

0 1 0 1 0 1


InSituParticlePath

Trace Particle Paths through time in a vector field. The Particle Trace filter generates pathlines in a vector field from a collection of seed points. The vector field used is selected from the Vectors menu, so the input data set is required to have point-centered vectors. The Seed portion of the interface allows you to select whether the seed points for this integration lie in a point cloud or along a line. Depending on which is selected, the appropriate 3D widget (point or line widget) is displayed along with traditional user interface controls for positioning the point cloud or line within the data set. Instructions for using the 3D widgets and the corresponding manual controls can be found in section 7.4. This filter operates on any type of data set, provided it has point-centered vectors. The output is polygonal data containing polylines. This filter is available on the Toolbar.

Property Description Default Value(s) Restrictions
Restart Source (RestartSource)

Specify the restart dataset. This is optional and can be used to have particle histories that were computed previously be included in this filter's computation.

Accepts input of following types:

  • vtkDataSet
ClearCache (ClearCache)

Clear the particle cache from previous time steps.

0

Accepts boolean values (0 or 1).

FirstTimeStep (FirstTimeStep)

Set the first time step. Default is 0.

0

RestartedSimulation (RestartedSimulation)

Specify whether or not this is a restarted simulation.

0

Accepts boolean values (0 or 1).

DisableResetCache (DisableResetCache)

Prevents the cache from being reset so that new computations always start from previous results.

0

Accepts boolean values (0 or 1).

Integrate Variables

This filter integrates cell and point attributes. The Integrate Attributes filter integrates point and cell data over lines and surfaces. It also computes length of lines, area of surface, or volume.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Integrate Attributes filter.

Accepts input of following types:

  • vtkDataSet

Interpolate to Quadrature Points

Create scalar/vector data arrays interpolated to quadrature points.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input of the filter.

Accepts input of following types:

  • vtkUnstructuredGrid

The dataset must contain a field array (cell)

Quadrature Scheme Def (QuadratureSchemeDefinition)

Specifies the offset array from which we interpolate values to quadrature points.

An array of scalars is required.

Intersect Fragments

The Intersect Fragments filter performs geometric intersections on sets of fragments. The filter takes two inputs, the first containing fragment geometry and the second containing fragment centers. The filter has two outputs. The first is geometry that results from the intersection. The second is a set of points that is an approximation of the center of where each fragment has been intersected.

Property Description Default Value(s) Restrictions
Input (Input)

This input must contain fragment geometry.

Accepts input of following types:

  • vtkMultiBlockDataSet
Source (Source)

This input must contain fragment centers.

Accepts input of following types:

  • vtkMultiBlockDataSet
Slice Type (CutFunction)

This property sets the type of intersecting geometry, and associated parameters.

The value can be one of the following:

  • Plane (implicit_functions)
  • Box (implicit_functions)
  • Sphere (implicit_functions)


Iso Volume

This filter extracts cells by clipping cells that have point scalars not in the specified range. It clips away cells using lower and upper thresholds.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Iso Volume filter.

Accepts input of following types:

  • vtkDataSet

The dataset must contain a field array ()

with 1 component(s).

Input Scalars (SelectInputScalars)

The value of this property contains the name of the scalar array from which to perform thresholding.

An array of scalars is required. The value must be a field array name.

Threshold Range (ThresholdBetween)

The values of this property specify the upper and lower bounds of the thresholding operation.

0 0

The value must lie within the range of the selected data array.
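The thresholding operation amounts to a simple range test on the selected point scalars (illustrative Python sketch; in the filter, cells touching the range are clipped rather than whole points being selected):

```python
def threshold_points(scalars, lower, upper):
    """Return indices of points whose scalar lies within [lower, upper]."""
    return [i for i, s in enumerate(scalars) if lower <= s <= upper]
```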

K Means

Compute a statistical model of a dataset and/or assess the dataset with a statistical model. This filter either computes a statistical model of a dataset or takes such a model as its second input. Then, the model (however it is obtained) may optionally be used to assess the input dataset. This filter iteratively computes the center of k clusters in a space whose coordinates are specified by the arrays you select. The clusters are chosen as local minima of the sum of square Euclidean distances from each point to its nearest cluster center. The model is then a set of cluster centers. Data is assessed by assigning a cluster center and distance to the cluster to each point in the input data set.

Property Description Default Value(s) Restrictions
Input (Input)

The input to the filter. Arrays from this dataset will be used for computing statistics and/or assessed by a statistical model.

Accepts input of following types:

  • vtkImageData
  • vtkStructuredGrid
  • vtkPolyData
  • vtkUnstructuredGrid
  • vtkTable
  • vtkGraph

The dataset must contain a field array ()

ModelInput (ModelInput)

A previously-calculated model with which to assess a separate dataset. This input is optional.

Accepts input of following types:

  • vtkTable
  • vtkMultiBlockDataSet
AttributeMode (AttributeMode)

Specify which type of field data the arrays will be drawn from.

0

The value must be field array name.

Variables of Interest (SelectArrays)

Choose arrays whose entries will be used to form observations for statistical analysis.

Task (Task)

Specify the task to be performed: modeling and/or assessment. (1) "Detailed model of input data" creates a set of output tables containing a calculated statistical model of the entire input dataset. (2) "Model a subset of the data" creates an output table (or tables) summarizing a randomly-chosen subset of the input dataset. (3) "Assess the data with a model" adds attributes to the first input dataset using a model provided on the second input port. (4) "Model and assess the same data" is really just operations 2 and 3 above applied to the same input dataset; the model is first trained using a fraction of the input data and then the entire dataset is assessed using that model. When the task includes creating a model (i.e., tasks 2 and 4), you may adjust the fraction of the input dataset used for training. You should avoid using a large fraction of the input data for training as you will then not be able to detect overfitting. The Training fraction setting will be ignored for tasks 1 and 3.

3

The value(s) is an enumeration of the following:

  • Detailed model of input data (0)
  • Model a subset of the data (1)
  • Assess the data with a model (2)
  • Model and assess the same data (3)
TrainingFraction (TrainingFraction)

Specify the fraction of values from the input dataset to be used for model fitting. The exact set of values is chosen at random from the dataset.

0.1

k (K)

Specify the number of clusters.

5

Max Iterations (MaxNumIterations)

Specify the maximum number of iterations in which cluster centers are moved before the algorithm terminates.

50

Tolerance (Tolerance)

Specify the relative tolerance that will cause early termination.

0.01
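The iteration controlled by k, Max Iterations, and Tolerance is the classic Lloyd's algorithm. A one-dimensional sketch in plain Python (ParaView's implementation is multidimensional and more elaborate; the seed argument here is hypothetical, added only to make the sketch deterministic):

```python
import random

def kmeans(points, k, max_iter=50, tol=0.01, seed=0):
    """Lloyd's algorithm on 1-D points: assign each point to its
    nearest center, move each center to its cluster mean, and stop
    when centers move less than tol or max_iter is reached."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)          # initial centers
    for _ in range(max_iter):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign p to the nearest center.
            clusters[min(range(k), key=lambda c: abs(p - centers[c]))].append(p)
        # Move each center to its cluster mean (keep it if empty).
        new = [sum(c) / len(c) if c else centers[i]
               for i, c in enumerate(clusters)]
        moved = max(abs(a - b) for a, b in zip(new, centers))
        centers = new
        if moved < tol:
            break
    return sorted(centers)
```

On well-separated data the centers converge to the cluster means; with two groups near 0.1 and 10.1, the returned centers land on those means.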


Legacy Glyph

This filter generates an arrow, cone, cube, cylinder, line, sphere, or 2D glyph at each point of the input data set. The glyphs can be oriented and scaled by point attributes of the input dataset. This is the implementation of the Glyph filter available in ParaView version 4.1 and earlier.


Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Glyph filter. This is the dataset to which the glyphs will be applied.

Accepts input of following types:

  • vtkDataSet

The dataset must contain a field array (point)

with 1 component(s).

The dataset must contain a field array (point)

with 3 component(s).

Scalars (SelectInputScalars)

This property indicates the name of the scalar array on which to operate. The indicated array may be used for scaling the glyphs. (See the SetScaleMode property.)

An array of scalars is required.

Vectors (SelectInputVectors)

This property indicates the name of the vector array on which to operate. The indicated array may be used for scaling and/or orienting the glyphs. (See the SetScaleMode and SetOrient properties.)

1

An array of vectors is required.

Glyph Type (Source)

This property determines which type of glyph will be placed at the points in the input dataset.

Accepts input of following types:

  • vtkPolyData

The value can be one of the following:
  • ArrowSource (sources)
  • ConeSource (sources)
  • CubeSource (sources)
  • CylinderSource (sources)
  • LineSource (sources)
  • SphereSource (sources)
  • GlyphSource2D (sources)
GlyphTransform (GlyphTransform)

The values in this property allow you to specify the transform (translation, rotation, and scaling) to apply to the glyph source.

The value can be one of the following:

  • Transform2 (extended_sources)
Orient (SetOrient)

If this property is set to 1, the glyphs will be oriented based on the selected vector array.

1

Accepts boolean values (0 or 1).

Scale Mode (SetScaleMode)

The value of this property specifies how/if the glyphs should be scaled based on the point-centered scalars/vectors in the input dataset.

1

The value(s) is an enumeration of the following:

  • scalar (0)
  • vector (1)
  • vector_components (2)
  • off (3)
SetScaleFactor (SetScaleFactor)

The value of this property will be used as a multiplier for scaling the glyphs before adding them to the output.

1.0

The value must lie within the range of the selected data array. The value must be less than the largest dimension of the dataset multiplied by a scale factor of 0.1.

Maximum Number of Points (MaximumNumberOfPoints)

The value of this property specifies the maximum number of glyphs that should appear in the output dataset if the value of the UseMaskPoints property is 1. (See the UseMaskPoints property.)

5000

Mask Points (UseMaskPoints)

If the value of this property is set to 1, limit the maximum number of glyphs to the value indicated by MaximumNumberOfPoints. (See the MaximumNumberOfPoints property.)

1

Accepts boolean values (0 or 1).

RandomMode (RandomMode)

If the value of this property is 1, then the points to glyph are chosen randomly. Otherwise the point ids chosen are evenly spaced.

1

Accepts boolean values (0 or 1).

KeepRandomPoints (KeepRandomPoints)

If the value of this property is 1 and RandomMode is 1, then the randomly chosen points to glyph are saved and reused for other timesteps. This is only useful if the coordinates are the same and in the same order between timesteps.

0

Accepts boolean values (0 or 1).

LegacyArbitrarySourceGlyph

This filter generates a glyph at each point of the input data set. The glyphs can be oriented and scaled by point attributes of the input dataset. The Glyph filter generates a glyph at each point in the input dataset. The glyphs can be oriented and scaled by the input point-centered scalars and vectors. The Glyph filter operates on any type of data set. Its output is polygonal. This filter is available on the Toolbar.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Glyph filter. This is the dataset to which the glyphs will be applied.

Accepts input of following types:

  • vtkDataSet

The dataset must contain a field array (point)

with 1 component(s).

The dataset must contain a field array (point)

with 3 component(s).

Glyph Type (Source)

This property determines which type of glyph will be placed at the points in the input dataset.

Accepts input of following types:

  • vtkPolyData
Scalars (SelectInputScalars)

This property indicates the name of the scalar array on which to operate. The indicated array may be used for scaling the glyphs. (See the SetScaleMode property.)

An array of scalars is required.

Vectors (SelectInputVectors)

This property indicates the name of the vector array on which to operate. The indicated array may be used for scaling and/or orienting the glyphs. (See the SetScaleMode and SetOrient properties.)

1

An array of vectors is required.

Orient (SetOrient)

If this property is set to 1, the glyphs will be oriented based on the selected vector array.

1

Accepts boolean values (0 or 1).

Scale Mode (SetScaleMode)

The value of this property specifies how/if the glyphs should be scaled based on the point-centered scalars/vectors in the input dataset.

1

The value(s) is an enumeration of the following:

  • scalar (0)
  • vector (1)
  • vector_components (2)
  • off (3)
SetScaleFactor (SetScaleFactor)

The value of this property will be used as a multiplier for scaling the glyphs before adding them to the output.

1.0

The value must lie within the range of the selected data array. The value must be less than the largest dimension of the dataset multiplied by a scale factor of 0.1.

Maximum Number of Points (MaximumNumberOfPoints)

The value of this property specifies the maximum number of glyphs that should appear in the output dataset if the value of the UseMaskPoints property is 1. (See the UseMaskPoints property.)

5000

Mask Points (UseMaskPoints)

If the value of this property is set to 1, limit the maximum number of glyphs to the value indicated by MaximumNumberOfPoints. (See the MaximumNumberOfPoints property.)

1

Accepts boolean values (0 or 1).

RandomMode (RandomMode)

If the value of this property is 1, then the points to glyph are chosen randomly. Otherwise the point ids chosen are evenly spaced.

1

Accepts boolean values (0 or 1).

KeepRandomPoints (KeepRandomPoints)

If the value of this property is 1 and RandomMode is 1, then the randomly chosen points to glyph are saved and reused for other timesteps. This is only useful if the coordinates are the same and in the same order between timesteps.

0

Accepts boolean values (0 or 1).

Level Scalars(Non-Overlapping AMR)

The Level Scalars filter uses colors to show levels of a hierarchical dataset.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Level Scalars filter.

Accepts input of following types:

  • vtkNonOverlappingAMR

Level Scalars(Overlapping AMR)

The Level Scalars filter uses colors to show levels of a hierarchical dataset.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Level Scalars filter.

Accepts input of following types:

  • vtkOverlappingAMR

Linear Extrusion

This filter creates a swept surface defined by translating the input along a vector. The Linear Extrusion filter creates a swept surface by translating the input dataset along a specified vector. This filter is intended to operate on 2D polygonal data. It operates on polygonal data and produces polygonal data output.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Linear Extrusion filter.

Accepts input of following types:

  • vtkPolyData
ScaleFactor (ScaleFactor)

The value of this property determines the distance along the vector the dataset will be translated. (A scale factor of 0.5 will move the dataset half the length of the vector, and a scale factor of 2 will move it twice the vector's length.)

1.0

Vector (Vector)

The value of this property indicates the X, Y, and Z components of the vector along which to sweep the input dataset.

0 0 1

Capping (Capping)

The value of this property indicates whether to cap the ends of the swept surface. Capping works by placing a copy of the input dataset on either end of the swept surface, so it behaves properly if the input is a 2D surface composed of filled polygons. If the input dataset is a closed solid (e.g., a sphere), then if capping is on (i.e., this property is set to 1), two copies of the data set will be displayed on output (the second translated from the first one along the specified vector). If instead capping is off (i.e., this property is set to 0), then an input closed solid will produce no output.

1

Accepts boolean values (0 or 1).

PieceInvariant (PieceInvariant)

The value of this property determines whether the output will be the same regardless of the number of processors used to compute the result. The difference is whether there are internal polygonal faces on the processor boundaries. A value of 1 will keep the results the same; a value of 0 will allow internal faces on processor boundaries.

0

Accepts boolean values (0 or 1).
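The sweep itself amounts to offsetting a copy of the input points by ScaleFactor times the Vector (illustrative Python sketch; the side faces that connect the two copies, and the Capping behavior, are omitted):

```python
def linear_extrude(points, vector=(0, 0, 1), scale=1.0):
    """Return the input points followed by a copy translated by
    scale * vector, the two rims of the swept surface."""
    offset = [scale * c for c in vector]
    return points + [tuple(p[i] + offset[i] for i in range(3))
                     for p in points]
```

This makes the ScaleFactor semantics concrete: a scale factor of 0.5 moves the copy half the vector's length, and a factor of 2 moves it twice the vector's length.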

Loop Subdivision

This filter iteratively divides each triangle into four triangles. New points are placed so the output surface is smooth. The Loop Subdivision filter increases the granularity of a polygonal mesh. It works by dividing each triangle in the input into four new triangles. It is named for Charles Loop, the person who devised this subdivision scheme. This filter only operates on triangles, so a data set that contains other types of polygons should be passed through the Triangulate filter before applying this filter to it. This filter only operates on polygonal data (specifically triangle meshes), and it produces polygonal output.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Loop Subdivision filter.

Accepts input of following types:

  • vtkPolyData
Number of Subdivisions (NumberOfSubdivisions)

Set the number of subdivision iterations to perform. Each subdivision divides single triangles into four new triangles.

1


MPIMoveData

Property Description Default Value(s) Restrictions
Input (Input)

Set the input to the MPI Move Data filter.

MoveMode (MoveMode)

Specify how the data is to be redistributed.

0

The value(s) is an enumeration of the following:

  • PassThrough (0)
  • Collect (1)
  • Clone (2)
OutputDataType (OutputDataType)

Specify the type of the dataset.

none

The value(s) is an enumeration of the following:

  • PolyData (0)
  • Unstructured Grid (4)
  • ImageData (6)

Mask Points

Reduce the number of points. This filter is often used before glyphing; generating vertices is an option. The Mask Points filter reduces the number of points in the dataset. It operates on any type of dataset, but produces only points/vertices as output.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Mask Points filter.

Accepts input of following types:

  • vtkDataSet
OnRatio (OnRatio)

The value of this property specifies that every OnRatio-th point will be retained in the output when not using Random mode (the skip or stride size for point ids). (For example, if the on ratio is 3, then the output will contain every 3rd point, up to the maximum number of points.)

2

Maximum Number of Points (MaximumNumberOfPoints)

The value of this property indicates the maximum number of points in the output dataset.

5000

Proportionally Distribute Maximum Number Of Points (ProportionalMaximumNumberOfPoints)

When this is off, the maximum number of points is taken per processor when running in parallel (total number of points = number of processors * maximum number of points). When this is on, the maximum number of points is distributed proportionally across processors depending on the number of points per processor ("total number of points" is the same as "maximum number of points"): maximum number of points per processor = number of points on a processor * maximum number of points / total number of points across all processors.

0

Accepts boolean values (0 or 1).

Offset (Offset)

The value of this property indicates the starting point id in the ordered list of input points from which to start masking.

0

Random Sampling (RandomMode)

If the value of this property is set to true, then the points in the output will be randomly selected from the input in various ways set by Random Mode; otherwise this filter will subsample point ids regularly.

0

Accepts boolean values (0 or 1).

Random Sampling Mode (RandomModeType)

Randomized Id Strides picks points with random id increments starting at Offset (the output probably isn't a statistically random sample). Random Sampling generates a statistically random sample of the input, ignoring Offset (fast - O(sample size)). Spatially Stratified Random Sampling is a variant of random sampling that splits the points into equal sized spatial strata before randomly sampling (slow - O(N log N)).

0

The value(s) is an enumeration of the following:

  • Randomized Id Strides (0)
  • Random Sampling (1)
  • Spatially Stratified Random Sampling (2)
GenerateVertices (GenerateVertices)

This property specifies whether to generate vertex cells as the topology of the output. If set to 1, the geometry (vertices) will be displayed in the rendering window; otherwise no geometry will be displayed.

0

Accepts boolean values (0 or 1).

SingleVertexPerCell (SingleVertexPerCell)

Tell the filter to generate only one vertex per cell instead of multiple vertices per cell.

0

Accepts boolean values (0 or 1).
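The interplay of OnRatio, Maximum Number of Points, Offset, and Random Sampling (on a single process) can be sketched as follows (plain Python; the seed parameter is hypothetical, used only to make the random branch deterministic, and the stratified variant is not shown):

```python
import random

def mask_points(n_points, on_ratio=2, max_points=5000, offset=0,
                random_mode=False, seed=1):
    """Return the ids of the points kept by the masking options."""
    ids = list(range(offset, n_points))
    if random_mode:
        # Statistically random sample of at most max_points ids.
        rng = random.Random(seed)
        return sorted(rng.sample(ids, min(max_points, len(ids))))
    # Regular subsampling: every on_ratio-th id, capped at max_points.
    return ids[::on_ratio][:max_points]
```

Downstream filters such as Glyph then operate only on the retained point ids.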

Material Interface Filter

The Material Interface filter finds volumes in the input data containing material above a certain material fraction. The Material Interface filter finds voxels inside of which a material fraction (or normalized amount of material) is higher than a given threshold. As these voxels are identified, surfaces enclosing adjacent voxels above the threshold are generated. The resulting volume and its surface are what we call a fragment. The filter has the ability to compute various volumetric attributes such as fragment volume, mass, and center of mass, as well as volume- and mass-weighted averages for any of the fields present. Any field selected for such computation will also be copied into the fragment surface's point data for visualization. The filter also has the ability to generate Oriented Bounding Boxes (OBB) for each fragment. The data generated by the filter is organized in three outputs. The "geometry" output contains the fragment surfaces. The "statistics" output contains a point set of the centers of mass. The "obb representation" output contains OBB representations (poly data). All computed attributes are copied into the statistics and geometry outputs. The obb representation output is used for validation and debugging purposes and is turned off by default. To measure the size of craters, the filter can invert a volume fraction and clip the volume fraction with a sphere and/or a plane.

Property Description Default Value(s) Restrictions
Input (Input)

Input to the filter can be a hierarchical box data set containing image data or a multi-block of rectilinear grids.

Accepts input of following types:

  • vtkNonOverlappingAMR

The dataset must contain a field array (cell)

Select Material Fraction Arrays (SelectMaterialArray)

Material fraction is defined as the normalized amount of material per voxel. It is expected that arrays containing material fraction data have been down-converted to an unsigned char.

An array of scalars is required.

Material Fraction Threshold (MaterialFractionThreshold)

Material fraction is defined as the normalized amount of material per voxel. Any voxel in the input data set with a material fraction greater than this value is included in the output data set.

0.5

InvertVolumeFraction (InvertVolumeFraction)

Inverting the volume fraction generates the negative of the material. It is useful for analyzing craters.

0

Accepts boolean values (0 or 1).

Clip Type (ClipFunction)

This property sets the type of clip geometry, and associated parameters.

The value can be one of the following:

  • None (implicit_functions)
  • Plane (implicit_functions)
  • Sphere (implicit_functions)
Select Mass Arrays (SelectMassArray)

Mass arrays are paired with material fraction arrays: the first selected material fraction array is paired with the first selected mass array, and so on sequentially. As the filter identifies voxels meeting the minimum material fraction threshold, these voxels' masses are used in the fragment center-of-mass and mass calculations. A warning is generated if no mass array is selected for an individual material fraction array; however, in that case the filter will still run, because the statistics output can be generated using fragment centers computed from axis-aligned bounding boxes.

An array of scalars is required.

Compute volume weighted average over: (SelectVolumeWtdAvgArray)

Specifies the arrays over which to compute a volume-weighted average. For each selected array a volume-weighted average is computed. The values of these arrays are also copied into fragment geometry cell data as the fragment surfaces are generated.

Compute mass weighted average over: (SelectMassWtdAvgArray)

For each selected array a mass-weighted average is computed. These arrays are also copied into fragment geometry cell data as the fragment surfaces are generated.

ComputeOBB (ComputeOBB)

Compute Object-Oriented Bounding Boxes (OBBs). When active, the result of this computation is copied into the statistics output. If the filter is built in its validation mode, the OBBs are rendered.

0

Accepts boolean values (0 or 1).

WriteGeometryOutput (WriteGeometryOutput)

If this property is set, then the geometry output is written to a text file. The file name will be constructed using the path in the "Output Base Name" widget.

0

Accepts boolean values (0 or 1).

WriteStatisticsOutput (WriteStatisticsOutput)

If this property is set, then the statistics output is written to a text file. The file name will be constructed using the path in the "Output Base Name" widget.

0

Accepts boolean values (0 or 1).

OutputBaseName (OutputBaseName)

This property specifies the base name, including the path, of where to write the statistics and geometry output text files. It follows the pattern "/path/to/folder/and/file", where file has no extension, as the filter will generate a unique extension.


Median

Compute the median scalar values in a specified neighborhood for image/volume datasets. The Median filter operates on uniform rectilinear (image or volume) data and produces uniform rectilinear output. It replaces the scalar value at each pixel / voxel with the median scalar value in the specified surrounding neighborhood. Since the median operation removes outliers, this filter is useful for removing high-intensity, low-probability noise (shot noise).
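A minimal 1D sketch of the median operation described above, using only the Python standard library (not ParaView's implementation; the real filter operates on 2D/3D image data with a per-dimension kernel size):

```python
import statistics

def median_filter_1d(signal, kernel=3):
    """Replace each sample with the median of its neighborhood (edges clamped)."""
    half = kernel // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(statistics.median(signal[lo:hi]))
    return out

# The shot-noise spike at index 2 is removed; every output value equals 1.
print(median_filter_1d([1, 1, 99, 1, 1]))
```

Because the median discards outliers rather than averaging them in, a single extreme value cannot leak into its neighbors, which is exactly why this filter suits shot noise better than a mean filter.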

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Median filter.

Accepts input of following types:

  • vtkImageData

The dataset must contain a field array (point)

with 1 component(s).

SelectInputScalars (SelectInputScalars)

The value of this property lists the name of the scalar array to use in computing the median.

An array of scalars is required.

KernelSize (KernelSize)

The value of this property specifies the number of pixels/voxels in each dimension to use in computing the median to assign to each pixel/voxel. If the kernel size in a particular dimension is 1, then the median will not be computed in that direction.

1 1 1


Merge Blocks

Appends vtkCompositeDataSet leaves into a single vtkUnstructuredGrid. vtkCompositeDataToUnstructuredGridFilter appends all vtkDataSet leaves of the input composite dataset into a single unstructured grid. The subtree to be combined can be chosen using the SubTreeCompositeIndex. If the SubTreeCompositeIndex is a leaf node, then no appending is required.

Property Description Default Value(s) Restrictions
Input (Input)

Set the input composite dataset.

Accepts input of following types:

  • vtkCompositeDataSet
SubTreeCompositeIndex (SubTreeCompositeIndex)

Select the index of the subtree to be appended. For now, this property is internal.

0

Merge Points (MergePoints)

1

Accepts boolean values (0 or 1).

Mesh Quality

This filter creates a new cell array containing a geometric measure of each cell's fitness. Different quality measures can be chosen for different cell shapes. Supported shapes include triangles, quadrilaterals, tetrahedra, and hexahedra. For other shapes, a value of 0 is assigned.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Mesh Quality filter.

Accepts input of following types:

  • vtkDataSet
TriangleQualityMeasure (TriangleQualityMeasure)

This property indicates which quality measure will be used to evaluate triangle quality. The radius ratio is the size of a circle circumscribed by a triangle's 3 vertices divided by the size of a circle tangent to a triangle's 3 edges. The edge ratio is the ratio of the longest edge length to the shortest edge length.

2

The value(s) is an enumeration of the following:

  • Area (28)
  • Aspect Ratio (1)
  • Aspect Frobenius (3)
  • Condition (9)
  • Distortion (15)
  • Edge Ratio (0)
  • Maximum Angle (8)
  • Minimum Angle (6)
  • Scaled Jacobian (10)
  • Radius Ratio (2)
  • Relative Size Squared (12)
  • Shape (13)
  • Shape and Size (14)
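The edge ratio and radius ratio described above can be illustrated with plain Python. This sketch assumes the common normalization R/(2r) (circumradius over twice the inradius), under which an equilateral triangle scores exactly 1; the exact convention used by ParaView's quality library may differ.

```python
import math

def edge_lengths(a, b, c):
    """The three edge lengths of triangle (a, b, c)."""
    return math.dist(a, b), math.dist(b, c), math.dist(c, a)

def edge_ratio(a, b, c):
    """Longest edge length divided by shortest edge length."""
    e = edge_lengths(a, b, c)
    return max(e) / min(e)

def radius_ratio(a, b, c):
    """Circumradius over inradius, scaled so an equilateral triangle scores 1."""
    e1, e2, e3 = edge_lengths(a, b, c)
    s = (e1 + e2 + e3) / 2                              # semi-perimeter
    area = math.sqrt(s * (s - e1) * (s - e2) * (s - e3))  # Heron's formula
    r = area / s                                        # inradius
    R = e1 * e2 * e3 / (4 * area)                       # circumradius
    return R / (2 * r)

equilateral = ((0, 0), (1, 0), (0.5, math.sqrt(3) / 2))
print(edge_ratio(*equilateral), radius_ratio(*equilateral))  # both ~1.0
```

Degenerate (sliver) triangles drive both measures toward infinity, which is why they are useful for flagging poorly shaped cells.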
QuadQualityMeasure (QuadQualityMeasure)

This property indicates which quality measure will be used to evaluate quadrilateral quality.

0

The value(s) is an enumeration of the following:

  • Area (28)
  • Aspect Ratio (1)
  • Condition (9)
  • Distortion (15)
  • Edge Ratio (0)
  • Jacobian (25)
  • Maximum Aspect Frobenius (5)
  • Maximum Edge Ratio (16)
  • Mean Aspect Frobenius (4)
  • Minimum Angle (6)
  • Oddy (23)
  • Radius Ratio (2)
  • Relative Size Squared (12)
  • Scaled Jacobian (10)
  • Shape (13)
  • Shape and Size (14)
  • Shear (11)
  • Shear and Size (24)
  • Skew (17)
  • Stretch (20)
  • Taper (18)
  • Warpage (26)
TetQualityMeasure (TetQualityMeasure)

This property indicates which quality measure will be used to evaluate tetrahedral quality. The radius ratio is the size of a sphere circumscribed by a tetrahedron's 4 vertices divided by the size of a sphere tangent to a tetrahedron's 4 faces. The edge ratio is the ratio of the longest edge length to the shortest edge length. The collapse ratio is the minimum ratio of height of a vertex above the triangle opposite it divided by the longest edge of the opposing triangle across all vertex/triangle pairs.

2

The value(s) is an enumeration of the following:

  • Edge Ratio (0)
  • Aspect Beta (29)
  • Aspect Gamma (27)
  • Aspect Frobenius (3)
  • Aspect Ratio (1)
  • Collapse Ratio (7)
  • Condition (9)
  • Distortion (15)
  • Jacobian (25)
  • Minimum Dihedral Angle (6)
  • Radius Ratio (2)
  • Relative Size Squared (12)
  • Scaled Jacobian (10)
  • Shape (13)
  • Shape and Size (14)
  • Volume (19)
HexQualityMeasure (HexQualityMeasure)

This property indicates which quality measure will be used to evaluate hexahedral quality.

5

The value(s) is an enumeration of the following:

  • Diagonal (21)
  • Dimension (22)
  • Distortion (15)
  • Edge Ratio (0)
  • Jacobian (25)
  • Maximum Edge Ratio (16)
  • Maximum Aspect Frobenius (5)
  • Mean Aspect Frobenius (4)
  • Oddy (23)
  • Relative Size Squared (12)
  • Scaled Jacobian (10)
  • Shape (13)
  • Shape and Size (14)
  • Shear (11)
  • Shear and Size (24)
  • Skew (17)
  • Stretch (20)
  • Taper (18)
  • Volume (19)

MinMax

Property Description Default Value(s) Restrictions
Input (Input)

Set the input to the Min Max filter.

Accepts input of following types:

  • vtkDataSet
Operation (Operation)

Select whether to perform a min, max, or sum operation on the data.

MIN

The value(s) can be one of the following:

  • MIN
  • MAX
  • SUM

Multicorrelative Statistics

Compute a statistical model of a dataset and/or assess the dataset with a statistical model. This filter either computes a statistical model of a dataset or takes such a model as its second input. Then, the model (however it is obtained) may optionally be used to assess the input dataset.

This filter computes the covariance matrix for all the arrays you select plus the mean of each array. The model is thus a multivariate Gaussian distribution with the mean vector and variances provided. Data is assessed using this model by computing the Mahalanobis distance for each input point. This distance will always be positive.

The learned model output format is rather dense and can be confusing, so it is discussed here. The first filter output is a multiblock dataset consisting of 2 tables:

  1. Raw covariance data.
  2. Covariance matrix and its Cholesky decomposition.

The raw covariance table has 3 meaningful columns: 2 titled "Column1" and "Column2" whose entries generally refer to the N arrays you selected when preparing the filter, and 1 column titled "Entries" that contains numeric values. The first row will always contain the number of observations in the statistical analysis. The next N rows contain the mean for each of the N arrays you selected. The remaining rows contain covariances of pairs of arrays.

The second table (covariance matrix and Cholesky decomposition) contains information derived from the raw covariance data of the first table. The first N rows of the first column contain the name of one array you selected for analysis. These rows are followed by a single entry labeled "Cholesky" for a total of N+1 rows. The second column, Mean, contains the mean of each variable in the first N entries and the number of observations processed in the final (N+1) row.

The remaining columns (there are N, one for each array) contain 2 matrices in triangular format. The upper right triangle contains the covariance matrix (which is symmetric, so its lower triangle may be inferred). The lower left triangle contains the Cholesky decomposition of the covariance matrix (which is triangular, so its upper triangle is zero). Because the diagonal must be stored for both matrices, an additional row is required, hence the N+1 rows and the final entry of the column named "Column".
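The assessment step described above (a Mahalanobis distance computed through the Cholesky factor of the covariance matrix) can be sketched in plain Python for two variables. The data values here are made up; a real run would use the arrays you selected in the filter.

```python
import math

def mean_and_cov(data):
    """Sample mean vector and 2x2 covariance matrix for 2-D observations."""
    n = len(data)
    mx = sum(p[0] for p in data) / n
    my = sum(p[1] for p in data) / n
    cxx = sum((p[0] - mx) ** 2 for p in data) / (n - 1)
    cyy = sum((p[1] - my) ** 2 for p in data) / (n - 1)
    cxy = sum((p[0] - mx) * (p[1] - my) for p in data) / (n - 1)
    return (mx, my), [[cxx, cxy], [cxy, cyy]]

def mahalanobis(point, mean, cov):
    """Distance of `point` from the Gaussian model via the Cholesky factor L."""
    # Cholesky decomposition cov = L L^T for a 2x2 symmetric matrix.
    l11 = math.sqrt(cov[0][0])
    l21 = cov[1][0] / l11
    l22 = math.sqrt(cov[1][1] - l21 ** 2)
    # Solve L z = (point - mean) by forward substitution; distance = |z|.
    d0, d1 = point[0] - mean[0], point[1] - mean[1]
    z0 = d0 / l11
    z1 = (d1 - l21 * z0) / l22
    return math.hypot(z0, z1)

# Hypothetical observations of two correlated variables.
data = [(1.0, 2.0), (2.0, 3.1), (3.0, 3.9), (4.0, 5.2), (5.0, 6.0)]
mean, cov = mean_and_cov(data)
print(mahalanobis(mean, mean, cov))  # 0.0 at the mean; grows with distance
```

The distance is zero only at the mean vector, which matches the statement above that the assessed distance is always positive.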

Property Description Default Value(s) Restrictions
Input (Input)

The input to the filter. Arrays from this dataset will be used for computing statistics and/or assessed by a statistical model.

Accepts input of following types:

  • vtkImageData
  • vtkStructuredGrid
  • vtkPolyData
  • vtkUnstructuredGrid
  • vtkTable
  • vtkGraph

The dataset must contain a field array ()

ModelInput (ModelInput)

A previously-calculated model with which to assess a separate dataset. This input is optional.

Accepts input of following types:

  • vtkTable
  • vtkMultiBlockDataSet
AttributeMode (AttributeMode)

Specify which type of field data the arrays will be drawn from.

0

The value must be a field array name.

Variables of Interest (SelectArrays)

Choose arrays whose entries will be used to form observations for statistical analysis.

Task (Task)

Specify the task to be performed: modeling and/or assessment.

  1. "Detailed model of input data" creates a set of output tables containing a calculated statistical model of the entire input dataset.
  2. "Model a subset of the data" creates an output table (or tables) summarizing a randomly-chosen subset of the input dataset.
  3. "Assess the data with a model" adds attributes to the first input dataset using a model provided on the second input port.
  4. "Model and assess the same data" is really just operations 2 and 3 above applied to the same input dataset. The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.

When the task includes creating a model (i.e., tasks 2 and 4), you may adjust the fraction of the input dataset used for training. You should avoid using a large fraction of the input data for training as you will then not be able to detect overfitting. The Training fraction setting will be ignored for tasks 1 and 3.

3

The value(s) is an enumeration of the following:

  • Detailed model of input data (0)
  • Model a subset of the data (1)
  • Assess the data with a model (2)
  • Model and assess the same data (3)
TrainingFraction (TrainingFraction)

Specify the fraction of values from the input dataset to be used for model fitting. The exact set of values is chosen at random from the dataset.

0.1
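A sketch of what the training fraction means in practice, assuming a simple uniform random sample (the filter's exact sampling strategy is not specified here):

```python
import random

def split_training(values, fraction, seed=0):
    """Pick a random `fraction` of the values for model fitting."""
    rng = random.Random(seed)  # fixed seed only so the sketch is repeatable
    k = max(1, round(len(values) * fraction))
    return rng.sample(values, k)

data = list(range(100))
train = split_training(data, 0.1)
print(len(train))  # 10 of 100 observations used to fit the model
```

With task 4 above, the model would be fit on this 10% and then every one of the 100 observations would be assessed against it, so a poor fit on the held-out 90% exposes overfitting.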


Normal Glyphs

Filter computing surface normals.

Property Description Default Value(s) Restrictions


Octree Depth Limit

This filter takes in an octree and produces a new octree which is no deeper than the maximum specified depth level. The Octree Depth Limit filter takes in an octree and produces a new octree that is nowhere deeper than the maximum specified depth level. The attribute data of pruned leaf cells are integrated into their ancestors at the cut level.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Octree Depth Limit filter.

Accepts input of following types:

  • vtkHyperOctree
MaximumLevel (MaximumLevel)

The value of this property specifies the maximum depth of the output octree.

4


Octree Depth Scalars

This filter adds a scalar to each leaf of the octree that represents the leaf's depth within the tree. The vtkHyperOctreeDepth filter adds a scalar to each leaf of the octree that represents the leaf's depth within the tree.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Octree Depth Scalars filter.

Accepts input of following types:

  • vtkHyperOctree

OrderedCompositeDistributor

Property Description Default Value(s) Restrictions
Input (Input)

Set the input to the Ordered Composite Distributor filter.

PassThrough (PassThrough)

Toggle whether to pass the data through without compositing.

0

Accepts boolean values (0 or 1).

PKdTree (PKdTree)

Set the vtkPKdTree to distribute with.

OutputType (OutputType)

When not empty, the output will be converted to the given type.


Outline

This filter generates a bounding box representation of the input. The Outline filter generates an axis-aligned bounding box for the input dataset. This filter operates on any type of dataset and produces polygonal output.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Outline filter.

Accepts input of following types:

  • vtkDataSet

Outline Corners

This filter generates a bounding box representation of the input. It only displays the corners of the bounding box. The Outline Corners filter generates the corners of a bounding box for the input dataset. This filter operates on any type of dataset and produces polygonal output.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Outline Corners filter.

Accepts input of following types:

  • vtkDataSet
CornerFactor (CornerFactor)

The value of this property sets the size of the corners as a percentage of the length of the corresponding bounding box edge.

0.2


Outline Curvilinear DataSet

This filter generates an outline representation of the input. The Outline filter generates an outline of the outside edges of the input dataset, rather than the dataset's bounding box. This filter operates on structured grid datasets and produces polygonal output.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the outline (curvilinear) filter.

Accepts input of following types:

  • vtkStructuredGrid

Outline Generic DataSet

This filter generates a bounding box representation of the input. The Generic Outline filter generates an axis-aligned bounding box for the input data set. The Input menu specifies the data set for which to create a bounding box. This filter operates on generic data sets and produces polygonal output.

Property Description Default Value(s) Restrictions
Input (Input)

Set the input to the Generic Outline filter.

Accepts input of following types:

  • vtkGenericDataSet

ParticlePath

Trace Particle Paths through time in a vector field. The Particle Trace filter generates pathlines in a vector field from a collection of seed points. The vector field used is selected from the Vectors menu, so the input data set is required to have point-centered vectors. The Seed portion of the interface allows you to select whether the seed points for this integration lie in a point cloud or along a line. Depending on which is selected, the appropriate 3D widget (point or line widget) is displayed along with traditional user interface controls for positioning the point cloud or line within the data set. Instructions for using the 3D widgets and the corresponding manual controls can be found in section 7.4. This filter operates on any type of data set, provided it has point-centered vectors. The output is polygonal data containing polylines. This filter is available on the Toolbar.

Property Description Default Value(s) Restrictions
Input (Input)

Specify which is the Input of the StreamTracer filter.

Accepts input of following types:

  • vtkDataObject

The dataset must contain a field array (point)

with 3 component(s).

Seed Source (Source)

Specify the seed dataset, typically the location from which the vector field integration should begin. Usually a point/radius or a line with a given resolution.

Accepts input of following types:

  • vtkDataSet
TerminationTime (TerminationTime)

Setting TerminationTime to a positive value will cause particles to terminate when the time is reached. The units of time should be consistent with the primary time variable.

0.0

TimestepValues (TimestepValues)
ForceReinjectionEveryNSteps (ForceReinjectionEveryNSteps)

When animating particles, it is nice to inject new ones every Nth step to produce a continuous flow. Setting ForceReinjectionEveryNSteps to a non-zero value will cause the particle source to reinject particles every Nth step even if it is otherwise unchanged. Note that if the particle source is also animated, this flag will be redundant, as the particles will be reinjected whenever the source changes anyway.

0

StaticSeeds (StaticSeeds)

If the input seeds are not changing, then this can be set to 1 to avoid having to do a repeated grid search that would return the exact same result.

0

Accepts boolean values (0 or 1).

StaticMesh (StaticMesh)

If the input grid is not changing, then this can be set to 1 to avoid having to create cell locators for each update.

0

Accepts boolean values (0 or 1).

SelectInputVectors (SelectInputVectors)

Specify which vector array should be used for the integration performed by this filter.

An array of vectors is required.

ComputeVorticity (ComputeVorticity)

Compute vorticity and angular rotation of particles as they progress.

1

Accepts boolean values (0 or 1).

Terminal Speed (TerminalSpeed)

This property specifies the terminal speed, below which particle advection/integration is terminated.

0.000000000001


ParticleTracer

Trace Particles through time in a vector field. The Particle Trace filter generates pathlines in a vector field from a collection of seed points. The vector field used is selected from the Vectors menu, so the input data set is required to have point-centered vectors. The Seed portion of the interface allows you to select whether the seed points for this integration lie in a point cloud or along a line. Depending on which is selected, the appropriate 3D widget (point or line widget) is displayed along with traditional user interface controls for positioning the point cloud or line within the data set. Instructions for using the 3D widgets and the corresponding manual controls can be found in section 7.4. This filter operates on any type of data set, provided it has point-centered vectors. The output is polygonal data containing polylines. This filter is available on the Toolbar.

Property Description Default Value(s) Restrictions
Input (Input)

Specify which is the Input of the StreamTracer filter.

Accepts input of following types:

  • vtkDataObject

The dataset must contain a field array (point)

with 3 component(s).

Seed Source (Source)

Specify the seed dataset, typically the location from which the vector field integration should begin. Usually a point/radius or a line with a given resolution.

Accepts input of following types:

  • vtkDataSet
StaticSeeds (StaticSeeds)

If the input seeds are not changing, then this can be set to 1 to avoid having to do a repeated grid search that would return the exact same result.

0

Accepts boolean values (0 or 1).

StaticMesh (StaticMesh)

If the input grid is not changing, then this can be set to 1 to avoid having to create cell locators for each update.

0

Accepts boolean values (0 or 1).

TimestepValues (TimestepValues)
ForceReinjectionEveryNSteps (ForceReinjectionEveryNSteps)

When animating particles, it is nice to inject new ones every Nth step to produce a continuous flow. Setting ForceReinjectionEveryNSteps to a non-zero value will cause the particle source to reinject particles every Nth step even if it is otherwise unchanged. Note that if the particle source is also animated, this flag will be redundant, as the particles will be reinjected whenever the source changes anyway.

0

SelectInputVectors (SelectInputVectors)

Specify which vector array should be used for the integration performed by this filter.

An array of vectors is required.

ComputeVorticity (ComputeVorticity)

Compute vorticity and angular rotation of particles as they progress.

1

Accepts boolean values (0 or 1).

Pass Arrays

Pass specified point and cell data arrays. The Pass Arrays filter makes a shallow copy of the output data object from the input data object except for passing only the arrays specified to the output from the input.
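The behavior can be illustrated with a toy dict-based "dataset" (a hypothetical structure, not ParaView's data model): the geometry is passed through unchanged while only the named arrays survive into the output.

```python
def pass_arrays(dataset, point_arrays=(), cell_arrays=()):
    """Shallow-copy a dataset dict, keeping only the named point/cell arrays."""
    return {
        # Geometry is shared, not copied, mirroring the shallow-copy behavior.
        "points": dataset["points"],
        "point_data": {k: v for k, v in dataset["point_data"].items()
                       if k in point_arrays},
        "cell_data": {k: v for k, v in dataset["cell_data"].items()
                      if k in cell_arrays},
    }

ds = {"points": [(0, 0), (1, 0)],
      "point_data": {"Temp": [300, 310], "Pressure": [1.0, 1.1]},
      "cell_data": {"Volume": [0.5]}}
out = pass_arrays(ds, point_arrays=("Temp",))
print(sorted(out["point_data"]))  # ['Temp']
```

Dropping unused arrays this way is a cheap operation because only array references, not the geometry or the array contents, are touched.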

Property Description Default Value(s) Restrictions
Input (Input)

Accepts input of following types:

  • vtkDataObject

The dataset must contain a field array (cell)

The dataset must contain a field array (point)

The dataset must contain a field array (field)

PointDataArrays (PointDataArrays)

Add a point array by name to be passed.

CellDataArrays (CellDataArrays)

Add a cell array by name to be passed.

FieldDataArrays (FieldDataArrays)

Add a field array by name to be passed.

UseFieldTypes (UseFieldTypes)

This hidden property must always be set to 1 for this proxy to work.

1

Accepts boolean values (0 or 1).

SelectedFieldTypes (SelectedFieldTypes)

This hidden property must always be set to 0 for this proxy to work.

0 1 2

Accepts boolean values (0 or 1).

Pass Through

A simple pass-through filter that doesn't transform data in any way.


Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the filter.

Accepts input of following types:

  • vtkDataSet

Plot Data

Plot data arrays from the input. This filter prepares arbitrary data to be plotted in any of the plots. By default the data is shown in an XY line plot.

Property Description Default Value(s) Restrictions
Input (Input)

The input.

Accepts input of following types:

  • vtkDataObject

Plot Global Variables Over Time

Extracts and plots data in field data over time. This filter extracts the variables that reside in a dataset's field data and are defined over time. The output is a 1D rectilinear grid where the x coordinates correspond to time (the same array is also copied to a point array named Time or TimeData (if Time exists in the input)).

Property Description Default Value(s) Restrictions
Input (Input)

The input from which the selection is extracted.

Accepts input of following types:

  • vtkDataSet

Plot On Intersection Curves

Extracts the edges in a 2D plane and plots them. Extracts the surface, intersects it with a 2D plane, and plots the resulting polylines.

Property Description Default Value(s) Restrictions


Plot On Sorted Lines

The Plot on Sorted Lines filter sorts and orders polylines for graph visualization. The Plot on Sorted Lines filter sorts and orders polylines for graph visualization. See http://www.paraview.org/ParaView3/index.php/Plotting_Over_Curves for more information.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Plot Edges filter.

Accepts input of following types:

  • vtkPolyData

Plot Over Line

Sample data attributes at the points along a line. Probed lines will be displayed in a graph of the attributes. The Plot Over Line filter samples the data set attributes of the current data set at the points along a line. The values of the point-centered variables along that line will be displayed in an XY Plot. This filter uses interpolation to determine the values at the selected point, whether or not it lies at an input point. The Probe filter operates on any type of data and produces polygonal output (a line).
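A minimal sketch of sampling along a line at a fixed resolution. The analytic field below stands in for interpolated point-centered data; this is not ParaView's probe implementation.

```python
def sample_over_line(p0, p1, resolution, field):
    """Sample a scalar field(x, y) at evenly spaced points along segment p0-p1."""
    samples = []
    for i in range(resolution + 1):
        t = i / resolution                      # parametric position in [0, 1]
        x = p0[0] + t * (p1[0] - p0[0])
        y = p0[1] + t * (p1[1] - p0[1])
        samples.append((x, y, field(x, y)))
    return samples

# Hypothetical linear field standing in for interpolated point data.
field = lambda x, y: x + 2 * y
pts = sample_over_line((0.0, 0.0), (1.0, 1.0), 4, field)
print(pts[2])  # (0.5, 0.5, 1.5) at the midpoint
```

The (x, y, value) triples are what an XY plot would then display: distance along the line on one axis, the sampled attribute on the other.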

Property Description Default Value(s) Restrictions


Plot Selection Over Time

Extracts the selection over time and then plots it. This filter extracts the selection over time, i.e., cell and/or point variables at the selected cells/points are extracted over time. The output multi-block consists of 1D rectilinear grids where the x coordinate corresponds to time (the same array is also copied to a point array named Time or TimeData (if Time exists in the input)). If the selection input is a location-based selection, then the point values are interpolated from the nearby cells, i.e., those of the cell the location lies in.

Property Description Default Value(s) Restrictions
Input (Input)

The input from which the selection is extracted.

Accepts input of following types:

  • vtkDataSet
  • vtkTable
  • vtkCompositeDataSet
Selection (Selection)

The input that provides the selection object.

Accepts input of following types:

  • vtkSelection
Only Report Selection Statistics (Only Report Selection Statistics)

If this property is set to 1, the min, max, inter-quartile ranges, and (for numeric arrays) mean and standard deviation of all the selected points or cells within each time step are reported -- instead of breaking each selected point's or cell's attributes out into separate time history tables.

1

Accepts boolean values (0 or 1).

Point Data to Cell Data

Create cell attributes by averaging point attributes. The Point Data to Cell Data filter averages the values of the point attributes of the points of a cell to compute cell attributes. This filter operates on any type of dataset, and the output dataset is the same type as the input.
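The averaging rule is simple enough to sketch directly (toy lists, not ParaView's data model): each cell's value is the mean of the values at its points.

```python
def point_to_cell_data(point_values, cells):
    """Average the point values of each cell's vertices to make cell values."""
    return [sum(point_values[i] for i in cell) / len(cell) for cell in cells]

temps = [300.0, 310.0, 320.0, 330.0]     # one value per point
quads = [(0, 1, 2, 3)]                   # one quad cell using all four points
print(point_to_cell_data(temps, quads))  # [315.0]
```

Note the smoothing effect: the cell value is a plain average of its corner values, so sharp point-wise variation is attenuated, and the reverse filter (Cell Data to Point Data) does not exactly invert it.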

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Point Data to Cell Data filter.

Accepts input of following types:

  • vtkDataSet

Once set, the input dataset cannot be changed.

The dataset must contain a field array (point)

PassPointData (PassPointData)

The value of this property controls whether the input point data will be passed to the output. If set to 1, then the input point data is passed through to the output; otherwise, only generated cell data is placed into the output.

0

Accepts boolean values (0 or 1).

PolyLine To Rectilinear Grid

Property Description Default Value(s) Restrictions
Input (Input)

Set the input to the Polyline to Rectilinear Grid filter.

Accepts input of following types:

  • vtkPolyData

Principal Component Analysis

Compute a statistical model of a dataset and/or assess the dataset with a statistical model. This filter either computes a statistical model of a dataset or takes such a model as its second input. Then, the model (however it is obtained) may optionally be used to assess the input dataset.

This filter performs additional analysis above and beyond the multicorrelative filter. It computes the eigenvalues and eigenvectors of the covariance matrix from the multicorrelative filter. Data is then assessed by projecting the original tuples into a possibly lower-dimensional space.

Since the PCA filter uses the multicorrelative filter's analysis, it shares the same raw covariance table specified in the multicorrelative documentation. The second table in the multiblock dataset comprising the model output is an expanded version of the multicorrelative version.

As with the multicorrelative filter, the second model table contains the mean values, the upper-triangular portion of the symmetric covariance matrix, and the non-zero lower-triangular portion of the Cholesky decomposition of the covariance matrix. Below these entries are the eigenvalues of the covariance matrix (in the column labeled "Mean") and the eigenvectors (as row vectors) in an additional NxN matrix.

Property Description Default Value(s) Restrictions
Input (Input)

The input to the filter. Arrays from this dataset will be used for computing statistics and/or assessed by a statistical model.

Accepts input of following types:

  • vtkImageData
  • vtkStructuredGrid
  • vtkPolyData
  • vtkUnstructuredGrid
  • vtkTable
  • vtkGraph

The dataset must contain a field array ()

ModelInput (ModelInput)

A previously-calculated model with which to assess a separate dataset. This input is optional.

Accepts input of following types:

  • vtkTable
  • vtkMultiBlockDataSet
AttributeMode (AttributeMode)

Specify which type of field data the arrays will be drawn from.

0

The value must be a field array name.

Variables of Interest (SelectArrays)

Choose arrays whose entries will be used to form observations for statistical analysis.

Task (Task)

Specify the task to be performed: modeling and/or assessment.

  1. "Detailed model of input data" creates a set of output tables containing a calculated statistical model of the entire input dataset.
  2. "Model a subset of the data" creates an output table (or tables) summarizing a randomly-chosen subset of the input dataset.
  3. "Assess the data with a model" adds attributes to the first input dataset using a model provided on the second input port.
  4. "Model and assess the same data" is really just operations 2 and 3 above applied to the same input dataset. The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.

When the task includes creating a model (i.e., tasks 2 and 4), you may adjust the fraction of the input dataset used for training. You should avoid using a large fraction of the input data for training as you will then not be able to detect overfitting. The Training fraction setting will be ignored for tasks 1 and 3.

3

The value(s) is an enumeration of the following:

  • Detailed model of input data (0)
  • Model a subset of the data (1)
  • Assess the data with a model (2)
  • Model and assess the same data (3)
TrainingFraction (TrainingFraction)

Specify the fraction of values from the input dataset to be used for model fitting. The exact set of values is chosen at random from the dataset.

0.1

Normalization Scheme (NormalizationScheme)

Before the eigenvector decomposition of the covariance matrix takes place, you may normalize each (i,j) entry by sqrt( cov(i,i) * cov(j,j) ). This implies that the variance of each variable of interest should be of equal importance.

2

The value(s) is an enumeration of the following:

  • No normalization (0)
  • Normalize using covariances (3)
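The effect of the "Normalize using covariances" option can be sketched in plain Python. This is a deliberately simplified illustration of the normalization described above, not ParaView's implementation:

```python
# Sketch of the "Normalize using covariances" scheme: each covariance
# entry cov(i,j) is divided by sqrt(cov(i,i) * cov(j,j)), yielding the
# correlation matrix, so every variable of interest carries equal weight.
import math

def normalize_covariance(cov):
    n = len(cov)
    return [[cov[i][j] / math.sqrt(cov[i][i] * cov[j][j])
             for j in range(n)] for i in range(n)]

cov = [[4.0, 2.0],
       [2.0, 9.0]]
corr = normalize_covariance(cov)
# Diagonal entries become 1; the off-diagonal is 2 / sqrt(4 * 9) = 1/3.
```

Note that after this normalization the eigenvector decomposition operates on the correlation matrix rather than the raw covariances.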
Basis Scheme (BasisScheme)

When reporting assessments, should the full eigenvector decomposition be used to project the original vector into the new space (Full basis), or should a fixed subset of the decomposition be used (Fixed-size basis), or should the projection be clipped to preserve at least some fixed "energy" (Fixed-energy basis)?

As an example, suppose the variables of interest were {A,B,C,D,E} and that the eigenvalues of the covariance matrix for these were {5,2,1.5,1,.5}. If the "Full basis" scheme is used, then all 5 components of the eigenvectors will be used to project each {A,B,C,D,E}-tuple in the original data into a new 5-component space.

If the "Fixed-size" scheme is used and the "Basis Size" property is set to 4, then only the first 4 eigenvector components will be used to project each {A,B,C,D,E}-tuple into the new space, and that space will be of dimension 4, not 5.

If the "Fixed-energy basis" scheme is used and the "Basis Energy" property is set to 0.8, then only the first 3 eigenvector components will be used to project each {A,B,C,D,E}-tuple into the new space, which will be of dimension 3. The number 3 is chosen because 3 is the lowest N for which the sum of the first N eigenvalues divided by the sum of all eigenvalues is larger than the specified "Basis Energy" (i.e., (5+2+1.5)/10 = 0.85 > 0.8).

0

The value(s) is an enumeration of the following:

  • Full basis (0)
  • Fixed-size basis (1)
  • Fixed-energy basis (2)
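The eigenvalue bookkeeping in the fixed-energy example above can be sketched as a minimal illustration (not ParaView's code):

```python
# Sketch of the "Fixed-energy basis" selection: pick the smallest N for
# which the first N eigenvalues capture more than the requested fraction
# ("energy") of the total variance.
def fixed_energy_basis_size(eigenvalues, energy):
    total = sum(eigenvalues)
    running = 0.0
    for n, value in enumerate(eigenvalues, start=1):
        running += value
        if running / total > energy:
            return n
    return len(eigenvalues)

# The example from the description: eigenvalues {5, 2, 1.5, 1, .5} with
# Basis Energy 0.8 select a 3-dimensional basis, since (5+2+1.5)/10 = 0.85.
size = fixed_energy_basis_size([5, 2, 1.5, 1, 0.5], 0.8)  # -> 3
```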
Basis Size (BasisSize)

The maximum number of eigenvector components to use when projecting into the new space.

2

Basis Energy (BasisEnergy)

The minimum energy to use when determining the dimensionality of the new space into which the assessment will project tuples.

0.1

RobustPCA (RobustPCA)

Compute robust PCA with medians instead of means.

0

Accepts boolean values (0 or 1).

Probe Location

Sample data attributes at the points in a point cloud. The Probe filter samples the data set attributes of the current data set at the points in a point cloud. The Probe filter uses interpolation to determine the values at the selected point, whether or not it lies at an input point. The Probe filter operates on any type of data and produces polygonal output (a point cloud).

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the dataset from which to obtain probe values.

Accepts input of following types:

  • vtkDataSet
  • vtkCompositeDataSet

The dataset must contain a field array ()

Probe Type (Source)

This property specifies the dataset whose geometry will be used in determining positions to probe.

The value can be one of the following:

  • FixedRadiusPointSource (extended_sources)
PassFieldArrays (PassFieldArrays)

Set whether to pass the field-data arrays from the Input (i.e., the input providing the geometry) to the output. On by default.

1

Accepts boolean values (0 or 1).

ComputeTolerance (ComputeTolerance)

Set whether to compute the tolerance or to use a user-provided value. On by default.

1

Accepts boolean values (0 or 1).

Tolerance (Tolerance)

Set the tolerance to use for vtkDataSet::FindCell.

2.2204460492503131e-16
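In one dimension, the probing described above amounts to locating the cell that contains the probe point and interpolating the point data within it. This is a much-simplified sketch; the real filter uses vtkDataSet::FindCell over arbitrary cell types:

```python
# Simplified 1-D sketch of probing: find the cell containing the probe
# point (within a small tolerance), then linearly interpolate the
# point-data values at its endpoints.
def probe_1d(xs, values, x, tolerance=2.2204460492503131e-16):
    for i in range(len(xs) - 1):
        if xs[i] - tolerance <= x <= xs[i + 1] + tolerance:
            t = (x - xs[i]) / (xs[i + 1] - xs[i])
            return (1 - t) * values[i] + t * values[i + 1]
    return None  # the probe point lies outside the dataset

value = probe_1d([0.0, 1.0, 2.0], [10.0, 20.0, 40.0], 1.5)  # -> 30.0
```

The default tolerance above is machine epsilon for doubles, matching this property's default value.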


Process Id Scalars

This filter uses colors to show how data is partitioned across processes. The Process Id Scalars filter assigns a unique scalar value to each piece of the input according to which processor it resides on. This filter operates on any type of data when ParaView is run in parallel. It is useful for determining whether your data is load-balanced across the processors being used. The output data set type is the same as that of the input.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Process Id Scalars filter.

Accepts input of following types:

  • vtkDataSet
RandomMode (RandomMode)

The value of this property determines whether to use random id values for the various pieces. If set to 1, the unique value per piece will be chosen at random; otherwise the unique value will match the id of the process.

0

Accepts boolean values (0 or 1).

Programmable Filter

Executes a user-supplied Python script on its input dataset to produce an output dataset. This filter will execute a Python script to produce an output dataset. The filter keeps a copy of the Python script in Script, and creates Interpretor, a Python interpreter, to run the script upon the first execution.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input(s) to the programmable filter.

Accepts input of following types:

  • vtkDataObject
OutputDataSetType (OutputDataSetType)

The value of this property determines the dataset type for the output of the programmable filter.

8

The value(s) is an enumeration of the following:

  • Same as Input (8)
  • vtkPolyData (0)
  • vtkStructuredGrid (2)
  • vtkRectilinearGrid (3)
  • vtkUnstructuredGrid (4)
  • vtkImageData (6)
  • vtkUniformGrid (10)
  • vtkMultiblockDataSet (13)
  • vtkHierarchicalBoxDataSet (15)
  • vtkTable (19)
Script (Script)

This property contains the text of a Python program that the programmable filter runs.

RequestInformation Script (InformationScript)

This property is a Python script that is executed during the RequestInformation pipeline pass. Use this to provide information such as WHOLE_EXTENT to the pipeline downstream.

RequestUpdateExtent Script (UpdateExtentScript)

This property is a Python script that is executed during the RequestUpdateExtent pipeline pass. Use this to modify the update extent that your filter asks upstream for.

CopyArrays (CopyArrays)

If this property is set to true, all the cell and point arrays from the first input are copied to the output.

0

Accepts boolean values (0 or 1).

Parameters (Parameters)
PythonPath (PythonPath)

A semi-colon (;) separated list of directories to add to the python library search path.

TimestepValues (TimestepValues)

Available timestep values.


Python Annotation

This filter evaluates a Python expression for a text annotation. This filter uses Python to calculate an expression. It depends heavily on the numpy and paraview.vtk modules. To use the parallel functions, mpi4py is also necessary. The expression is evaluated and the resulting scalar value or numpy array is added to the output as an array. See the numpy and paraview.vtk documentation for the list of available functions. This filter tries to make it easy for the user to write expressions by defining certain variables. The filter tries to assign each array to a variable of the same name. If the name of the array is not a valid Python variable, it has to be accessed through a dictionary called arrays (i.e. arrays['array_name']).

Property Description Default Value(s) Restrictions
Input (Input)

Set the input of the filter.

Accepts input of following types:

  • vtkDataObject
ArrayAssociation (ArrayAssociation)

Select the attribute to populate array names from.

2

The value(s) is an enumeration of the following:

  • Point Data (0)
  • Cell Data (1)
  • Field Data (2)
  • Row Data (6)
Expression (Expression)

The Python expression evaluated during execution. FieldData arrays are directly available through their names. The set of provided variables is [input, t_value, t_steps, t_range, t_index, FieldData, PointData, CellData] (e.g.: "Momentum: (%f, %f, %f)" % (XMOM[t_index,0], YMOM[t_index,0], ZMOM[t_index,0])).

AnnotationValue (AnnotationValue)

Text that is used as annotation


Python Calculator

This filter evaluates a Python expression. This filter uses Python to calculate an expression. It depends heavily on the numpy and paraview.vtk modules. To use the parallel functions, mpi4py is also necessary. The expression is evaluated and the resulting scalar value or numpy array is added to the output as an array. See the numpy and paraview.vtk documentation for the list of available functions. This filter tries to make it easy for the user to write expressions by defining certain variables. The filter tries to assign each array to a variable of the same name. If the name of the array is not a valid Python variable, it has to be accessed through a dictionary called arrays (i.e. arrays['array_name']). The points can be accessed using the points variable.

Property Description Default Value(s) Restrictions
Input (Input)

Set the input of the filter.

Accepts input of following types:

  • vtkDataSet
Expression (Expression)

The Python expression evaluated during execution.

ArrayAssociation (ArrayAssociation)

This property controls the association of the output array as well as which arrays are defined as variables.

0

The value(s) is an enumeration of the following:

  • Point Data (0)
  • Cell Data (1)
ArrayName (ArrayName)

The name of the output array.

result

CopyArrays (CopyArrays)

If this property is set to true, all the cell and point arrays from the first input are copied to the output.

1

Accepts boolean values (0 or 1).

Quadric Clustering

This filter is the same filter used to generate level of detail for ParaView. It uses a structured grid of bins and merges all points contained in each bin. The Quadric Clustering filter produces a reduced-resolution polygonal approximation of the input polygonal dataset. This filter is the one used by ParaView for computing LODs. It uses spatial binning to reduce the number of points in the data set; points that lie within the same spatial bin are collapsed into one representative point.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Quadric Clustering filter.

Accepts input of following types:

  • vtkPolyData
Number of Divisions (NumberOfDivisions)

This property specifies the number of bins along the X, Y, and Z axes of the data set.

50 50 50

UseInputPoints (UseInputPoints)

If the value of this property is set to 1, the representative point for each bin is selected from one of the input points that lies in that bin; the input point that produces the least error is chosen. If the value of this property is 0, the location of the representative point is calculated to produce the least error possible for that bin, but the point will most likely not be one of the input points.

1

Accepts boolean values (0 or 1).

UseFeatureEdges (UseFeatureEdges)

If this property is set to 1, feature edge quadrics will be used to maintain the boundary edges along processor divisions.

0

Accepts boolean values (0 or 1).

UseFeaturePoints (UseFeaturePoints)

If this property is set to 1, feature point quadrics will be used to maintain the boundary points along processor divisions.

0

Accepts boolean values (0 or 1).

CopyCellData (CopyCellData)

If this property is set to 1, the cell data from the input will be copied to the output.

1

Accepts boolean values (0 or 1).

UseInternalTriangles (UseInternalTriangles)

If this property is set to 1, triangles completely contained in a spatial bin will be included in the computation of the bin's quadrics. When this property is set to 0, the filter operates faster, but the resulting surface may not be as well-behaved.

0

Accepts boolean values (0 or 1).
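The spatial binning idea behind this filter can be sketched in plain Python. This is a deliberately simplified illustration that averages the points in each bin, whereas the real filter chooses each bin's representative point by minimizing quadric error:

```python
# Much-simplified sketch of spatial binning: points falling in the same
# bin of a structured grid collapse to one representative point (here the
# bin average; Quadric Clustering minimizes a quadric error instead).
def bin_points(points, bounds_min, bounds_max, divisions):
    bins = {}
    for p in points:
        # Map each coordinate to a bin index, clamping the upper bound.
        key = tuple(
            min(int((p[a] - bounds_min[a]) /
                    (bounds_max[a] - bounds_min[a]) * divisions[a]),
                divisions[a] - 1)
            for a in range(3))
        bins.setdefault(key, []).append(p)
    return [tuple(sum(c) / len(pts) for c in zip(*pts))
            for pts in bins.values()]

points = [(0.1, 0.1, 0.1), (0.2, 0.1, 0.1), (0.9, 0.9, 0.9)]
reduced = bin_points(points, (0, 0, 0), (1, 1, 1), (2, 2, 2))
# Two points share the low-corner bin, so three points reduce to two.
```

Increasing the number of divisions shrinks the bins, so fewer points merge and more detail survives.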

Random Attributes

This filter creates a new random attribute array and sets it as the default array. The Random Attributes filter creates random attributes including scalars and vectors. These attributes can be generated as point data or cell data. The generation of each component is normalized between a user-specified minimum and maximum value.

This filter provides the capability to specify the data type of the attributes and the range for each of the components.


Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Random Attributes filter.

Accepts input of following types:

  • vtkDataSet
DataType (DataType)

Specify the type of array to create (all components of this array are of this type). This holds true for all arrays that are created.

3

The value(s) is an enumeration of the following:

  • Bit (1)
  • Char (2)
  • UnsignedChar (3)
  • Short (4)
  • UnsignedShort (5)
  • Int (6)
  • UnsignedInt (7)
  • Long (8)
  • UnsignedLong (9)
  • Float (10)
  • Double (11)
ComponentRange (ComponentRange)

Set the range values (minimum and maximum) for each component. This applies to all data that is generated.

0 255

AttributesConstantPerBlock (AttributesConstantPerBlock)

Indicate that the generated attributes are constant within a block. This can be used to highlight blocks in a composite dataset.

0

Accepts boolean values (0 or 1).

GeneratePointScalars (GeneratePointScalars)

Indicate that point scalars are to be generated.

0

Accepts boolean values (0 or 1).

GeneratePointVectors (GeneratePointVectors)

Indicate that point vectors are to be generated.

0

Accepts boolean values (0 or 1).

GenerateCellScalars (GenerateCellScalars)

Indicate that cell scalars are to be generated.

0

Accepts boolean values (0 or 1).

GenerateCellVectors (GenerateCellVectors)

Indicate that cell vectors are to be generated.

1

Accepts boolean values (0 or 1).

Random Vectors

This filter creates a new 3-component point data array and sets it as the default vector array. It uses a random number generator to create values. The Random Vectors filter generates a point-centered array of random vectors. It uses a random number generator to determine the components of the vectors. This filter operates on any type of data set, and the output data set will be of the same type as the input.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Random Vectors filter.

Accepts input of following types:

  • vtkDataSet
MinimumSpeed (MinimumSpeed)

This property specifies the minimum length of the random point vectors generated.

0

MaximumSpeed (MaximumSpeed)

This property specifies the maximum length of the random point vectors generated.

1
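A minimal sketch of generating one such vector, assuming a uniformly chosen speed between the two bounds (an illustration only, not ParaView's generator):

```python
# Sketch of generating a random 3-component vector whose length lies
# between MinimumSpeed and MaximumSpeed: pick a random direction, then
# scale it to a random speed within the requested range.
import math
import random

def random_vector(minimum_speed, maximum_speed, rng=random):
    direction = [rng.gauss(0.0, 1.0) for _ in range(3)]
    norm = math.sqrt(sum(c * c for c in direction)) or 1.0
    speed = rng.uniform(minimum_speed, maximum_speed)
    return [c / norm * speed for c in direction]

v = random_vector(0.0, 1.0)
length = math.sqrt(sum(c * c for c in v))  # lies in [0, 1]
```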


Rectilinear Data to Point Set

Converts a rectilinear grid to an equivalent structured grid. The Rectilinear Grid to Point Set filter takes a rectilinear grid object and outputs an equivalent structured grid (which is a type of point set). This brings the data to a broader category of data storage but only adds a small amount of overhead. This filter can be helpful in applying filters that expect or manipulate point coordinates.

Property Description Default Value(s) Restrictions
Input (Input)

Accepts input of following types:

  • vtkRectilinearGrid

Rectilinear Grid Connectivity

Parallel fragments extraction and attributes integration on rectilinear grids. Extracts material fragments from multi-block vtkRectilinearGrid datasets based on the selected volume fraction array(s) and a fraction isovalue and integrates the associated attributes.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input of the filter.

Accepts input of following types:

  • vtkRectilinearGrid
  • vtkCompositeDataSet

The dataset must contain a field array (cell)

with 1 component(s).

Double Volume Arrays (AddDoubleVolumeArrayName)

This property specifies the name(s) of the volume fraction array(s) for generating parts.

An array of scalars is required.

Float Volume Arrays (AddFloatVolumeArrayName)

This property specifies the name(s) of the volume fraction array(s) for generating parts.

An array of scalars is required.

Unsigned Character Volume Arrays (AddUnsignedCharVolumeArrayName)

This property specifies the name(s) of the volume fraction array(s) for generating parts.

An array of scalars is required.

Volume Fraction Value (VolumeFractionSurfaceValue)

The value of this property is the volume fraction value for the surface.

0.1


RectilinearGridGeometryFilter

Extracts geometry for a rectilinear grid. Output is a polydata dataset. RectilinearGridGeometryFilter is a filter that extracts geometry from a rectilinear grid. By specifying appropriate i-j-k indices, it is possible to extract a point, a curve, a surface, or a "volume". The volume is actually a (n x m x o) region of points. The extent specification is zero-offset. That is, the first k-plane in a 50x50x50 rectilinear grid is given by (0,49, 0,49, 0,0).

Property Description Default Value(s) Restrictions
Input (Input)

Set the input to the Rectilinear Grid Geometry filter.

Accepts input of following types:

  • vtkDataSet

ReductionFilter

Property Description Default Value(s) Restrictions
Input (Input)

Set the input to the Reduction filter.

PreGatherHelperName (PreGatherHelperName)

Set the algorithm that runs on each node in parallel.

PostGatherHelperName (PostGatherHelperName)

Set the algorithm that takes multiple inputs and produces a single reduced output.

PostGatherHelper (PostGatherHelper)
PreGatherHelper (PreGatherHelper)
PassThrough (PassThrough)

If set to a non-negative value, then produce results using only the node Id specified.

-1

GenerateProcessIds (GenerateProcessIds)

If true, the filter will generate vtkOriginalProcessIds arrays indicating the process id on which the cell/point was generated.

0

Accepts boolean values (0 or 1).

Reflect

This filter takes the union of the input and its reflection over an axis-aligned plane. The Reflect filter reflects the input dataset across the specified plane. This filter operates on any type of data set and produces an unstructured grid output.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Reflect filter.

Accepts input of following types:

  • vtkDataSet
Plane (Plane)

The value of this property determines which plane to reflect across. If the value is X, Y, or Z, the value of the Center property determines where the plane is placed along the specified axis. The other six options (X Min, X Max, etc.) place the reflection plane at the specified face of the bounding box of the input dataset.

0

The value(s) is an enumeration of the following:

  • X Min (0)
  • Y Min (1)
  • Z Min (2)
  • X Max (3)
  • Y Max (4)
  • Z Max (5)
  • X (6)
  • Y (7)
  • Z (8)
Center (Center)

If the value of the Plane property is X, Y, or Z, then the value of this property specifies the center of the reflection plane.

0.0

CopyInput (CopyInput)

If this property is set to 1, the output will contain the union of the input dataset and its reflection. Otherwise the output will contain only the reflection of the input data.

1

Accepts boolean values (0 or 1).
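Reflection across an axis-aligned plane reduces to simple coordinate arithmetic. A sketch for the X plane, including the CopyInput union behavior (an illustration, not the filter's implementation):

```python
# Sketch of reflecting points across the X plane at a given center:
# each x coordinate maps to 2*center - x. With copy_input on (the
# CopyInput default), the output is the union of the original points
# and their reflections.
def reflect_x(points, center=0.0, copy_input=True):
    mirrored = [(2.0 * center - x, y, z) for (x, y, z) in points]
    return points + mirrored if copy_input else mirrored

pts = [(1.0, 0.0, 0.0), (2.0, 1.0, 0.0)]
out = reflect_x(pts, center=0.0)  # adds (-1, 0, 0) and (-2, 1, 0)
```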

Resample AMR

Converts AMR data to a uniform grid. This filter allows the user to specify a Region of Interest (ROI) within the AMR dataset and extract it as a uniform grid.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input for this filter.

Accepts input of following types:

  • vtkOverlappingAMR
Demand-Driven Mode (Demand-Driven Mode)

This property specifies whether the resampling filter will operate in demand-driven mode or not.

1

Accepts boolean values (0 or 1).

TransferToNodes (TransferToNodes)

This property specifies whether the solution will be transferred to the nodes of the extracted region or the cells.

1

Accepts boolean values (0 or 1).

NumberOfPartitions (NumberOfPartitions)

Set the number of subdivisions for recursive coordinate bisection.

1

Number of Samples (Number of Samples)

Sets the number of samples in each dimension

10 10 10

Min (Min)

This property sets the minimum 3-D coordinate location by which the particles will be filtered out.

0.0 0.0 0.0

Max (Max)

This property sets the maximum 3-D coordinate location by which the particles will be filtered out.

0.0 0.0 0.0


Resample With Dataset

Sample data attributes at the points of a dataset. Probe is a filter that computes point attributes at specified point positions. The filter has two inputs: the Input and Source. The 'Source' geometric structure is passed through the filter. The point attributes are computed at the 'Source' point positions by interpolating into the 'Input' data. For example, we can compute data values on a plane (plane specified as Source) from a volume (Input). The cell data of the Input is copied to the output based on which Input cell each Source point lies in. If an array of the same name exists in both the Input's point and cell data, only the one from the point data is probed.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the dataset from which to obtain probe values. The data attributes come from this dataset.

Accepts input of following types:

  • vtkDataSet
  • vtkCompositeDataSet

The dataset must contain a field array ()

Source (Source)

This property specifies the dataset whose geometry will be used in determining positions to probe. The mesh comes from this dataset.

Accepts input of following types:

  • vtkDataSet
PassCellArrays (PassCellArrays)

When set, the input's cell data arrays are shallow-copied to the output.

0

Accepts boolean values (0 or 1).

PassPointArrays (PassPointArrays)

When set, the input's point data arrays are shallow-copied to the output.

0

Accepts boolean values (0 or 1).

PassFieldArrays (PassFieldArrays)

Set whether to pass the field-data arrays from the Input (i.e., the input providing the geometry) to the output. On by default.

1

Accepts boolean values (0 or 1).

ComputeTolerance (ComputeTolerance)

Set whether to compute the tolerance or to use a user-provided value. On by default.

1

Accepts boolean values (0 or 1).

Tolerance (Tolerance)

Set the tolerance to use for vtkDataSet::FindCell.

2.2204460492503131e-16


Ribbon

This filter generates a ribbon surface from lines. It is useful for displaying streamlines. The Ribbon filter creates ribbons from the lines in the input data set. This filter is useful for visualizing streamlines. Both the input and output of this filter are polygonal data. The input data set must also have at least one point-centered vector array.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Ribbon filter.

Accepts input of following types:

  • vtkPolyData

The dataset must contain a field array (point)

with 3 component(s).

The dataset must contain a field array (point)

with 1 component(s).

Scalars (SelectInputScalars)

The value of this property indicates the name of the input scalar array used by this filter. The width of the ribbons will be varied based on the values in the specified array if the value of the Width property is 1.

An array of scalars is required.

Vectors (SelectInputVectors)

The value of this property indicates the name of the input vector array used by this filter. If the UseDefaultNormal property is set to 0, the normal vectors for the ribbons come from the specified vector array.

1

An array of vectors is required.

Width (Width)

If the VaryWidth property is set to 1, the value of this property is the minimum ribbon width. If the VaryWidth property is set to 0, the value of this property is half the width of the ribbon.

1

The value must be less than the largest dimension of the dataset multiplied by a scale factor of 0.01.

Angle (Angle)

The value of this property specifies the offset angle (in degrees) of the ribbon from the line normal.

0

UseDefaultNormal (UseDefaultNormal)

If this property is set to 0, and the input contains no vector array, then default ribbon normals will be generated (DefaultNormal property); if a vector array has been set (SelectInputVectors property), the ribbon normals will be set from the specified array. If this property is set to 1, the default normal (DefaultNormal property) will be used, regardless of whether the SelectInputVectors property has been set.

0

Accepts boolean values (0 or 1).

DefaultNormal (DefaultNormal)

The value of this property specifies the normal to use when the UseDefaultNormal property is set to 1 or the input contains no vector array (SelectInputVectors property).

0 0 1

VaryWidth (VaryWidth)

If this property is set to 1, the ribbon width will be scaled according to the scalar array specified in the SelectInputScalars property. Toggle the variation of ribbon width with scalar value.

0

Accepts boolean values (0 or 1).

Rotational Extrusion

This filter generates a swept surface while translating the input along a circular path. The Rotational Extrusion filter forms a surface by rotating the input about the Z axis. This filter is intended to operate on 2D polygonal data. It produces polygonal output.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Rotational Extrusion filter.

Accepts input of following types:

  • vtkPolyData
Resolution (Resolution)

The value of this property controls the number of intermediate node points used in performing the sweep (rotating from 0 degrees to the value specified by the Angle property).

12

Capping (Capping)

If this property is set to 1, the open ends of the swept surface will be capped with a copy of the input dataset. This works properly if the input is a 2D surface composed of filled polygons. If the input dataset is a closed solid (e.g., a sphere), then either two copies of the dataset will be drawn or no surface will be drawn. No surface is drawn if either this property is set to 0 or if the two surfaces would occupy exactly the same 3D space (i.e., the Angle property's value is a multiple of 360, and the values of the Translation and DeltaRadius properties are 0).

1

Accepts boolean values (0 or 1).

Angle (Angle)

This property specifies the angle of rotation in degrees. The surface is swept from 0 to the value of this property.

360

Translation (Translation)

The value of this property specifies the total amount of translation along the Z axis during the sweep process. Specifying a non-zero value for this property allows you to create a corkscrew (value of DeltaRadius > 0) or spring effect.

0

DeltaRadius (DeltaRadius)

The value of this property specifies the change in radius during the sweep process.

0
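The sweep of a single point can be sketched as follows. This is a simplified illustration of how the Resolution, Angle, Translation, and DeltaRadius properties interact; the real filter also builds the connecting surface between swept copies:

```python
# Sketch of the rotational sweep: each input point is copied at
# Resolution steps from 0 to Angle degrees about the Z axis, with
# Translation distributed along Z and DeltaRadius added to the point's
# distance from the axis over the course of the sweep.
import math

def sweep_point(point, resolution=12, angle=360.0,
                translation=0.0, delta_radius=0.0):
    x, y, z = point
    radius = math.hypot(x, y)
    theta0 = math.atan2(y, x)
    result = []
    for i in range(resolution + 1):
        t = i / resolution
        r = radius + t * delta_radius
        theta = theta0 + math.radians(t * angle)
        result.append((r * math.cos(theta), r * math.sin(theta),
                       z + t * translation))
    return result

ring = sweep_point((1.0, 0.0, 0.0))  # closes on itself when angle is 360
```

A non-zero translation turns the ring into a helix (the corkscrew/spring effect mentioned above), and a non-zero delta_radius makes it spiral outward.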


Scatter Plot

Creates a scatter plot from a dataset.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the filter.

Accepts input of following types:

  • vtkDataSet

Shrink

This filter shrinks each input cell so they pull away from their neighbors. The Shrink filter causes the individual cells of a dataset to break apart from each other by moving each cell's points toward the centroid of the cell. (The centroid of a cell is the average position of its points.) This filter operates on any type of dataset and produces unstructured grid output.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Shrink filter.

Accepts input of following types:

  • vtkDataSet
ShrinkFactor (ShrinkFactor)

The value of this property determines how far the points will move. A value of 0 positions the points at the centroid of the cell; a value of 1 leaves them at their original positions.

0.5
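The point motion the ShrinkFactor controls can be sketched for a single cell (an illustration, not the filter's implementation):

```python
# Sketch of shrinking one cell: every point moves toward the cell
# centroid; a shrink factor of 0 collapses the cell to its centroid,
# while 1 leaves the points at their original positions.
def shrink_cell(points, shrink_factor=0.5):
    n = len(points)
    centroid = tuple(sum(p[a] for p in points) / n for a in range(3))
    return [tuple(centroid[a] + shrink_factor * (p[a] - centroid[a])
                  for a in range(3))
            for p in points]

triangle = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0), (0.0, 2.0, 0.0)]
smaller = shrink_cell(triangle, 0.5)  # half-size triangle, same centroid
```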


Slice

This filter slices a data set with a plane. Slicing is similar to a contour. It creates surfaces from volumes and lines from surfaces. This filter extracts the portion of the input dataset that lies along the specified plane. The Slice filter takes any type of dataset as input. The output of this filter is polygonal data.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Slice filter.

Accepts input of following types:

  • vtkDataSet
Slice Type (CutFunction)

This property sets the parameters of the slice function.

The value can be one of the following:

  • Plane (implicit_functions)
  • Box (implicit_functions)
  • Sphere (implicit_functions)
  • Cylinder (implicit_functions)
InputBounds (InputBounds)
Crinkle slice (PreserveInputCells)

This parameter controls whether to extract the entire cells that are sliced by the region or just extract a triangulated surface of that region.

0

Accepts boolean values (0 or 1).

Triangulate the slice (Triangulate the slice)

This parameter controls whether to produce triangles in the output.

1

Accepts boolean values (0 or 1).

Slice Offset Values (ContourValues)

The values in this property specify a list of current offset values. This can be used to create multiple slices with different centers. Each entry represents a new slice with its center shifted by the offset value.

Determine the length of the dataset's diagonal. The value must lie within -diagonal length to +diagonal length.
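The offset bookkeeping can be sketched as follows: each offset value shifts the slice plane's origin along its normal, producing one extra slice per entry (a simplified illustration for the Plane slice type):

```python
# Sketch of Slice Offset Values: each offset shifts the slice plane's
# origin along the (normalized) plane normal, so a list of offsets
# yields a stack of parallel slices.
import math

def offset_plane_origins(origin, normal, offsets):
    norm = math.sqrt(sum(c * c for c in normal))
    unit = [c / norm for c in normal]
    return [tuple(o + d * u for o, u in zip(origin, unit))
            for d in offsets]

origins = offset_plane_origins((0.0, 0.0, 0.0), (0.0, 0.0, 2.0),
                               [-1.0, 0.0, 1.0])
# -> [(0, 0, -1), (0, 0, 0), (0, 0, 1)]
```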


Slice (demand-driven-composite)

This filter slices a data set with a plane. Slicing is similar to a contour. It creates surfaces from volumes and lines from surfaces. This filter extracts the portion of the input dataset that lies along the specified plane. The Slice filter takes any type of dataset as input. The output of this filter is polygonal data.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Slice filter.

Accepts input of following types:

  • vtkDataObject
Slice Type (CutFunction)

This property sets the parameters of the slice function.

The value can be one of the following:

  • Plane (implicit_functions)
  • Box (implicit_functions)
  • Sphere (implicit_functions)
InputBounds (InputBounds)
Slice Offset Values (ContourValues)

The values in this property specify a list of current offset values. This can be used to create multiple slices with different centers. Each entry represents a new slice with its center shifted by the offset value.

Determine the length of the dataset's diagonal. The value must lie within -diagonal length to +diagonal length.


Slice AMR data

Slices AMR data. This filter slices AMR data.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input for this filter.

Accepts input of following types:

  • vtkOverlappingAMR
ForwardUpstream (ForwardUpstream)

This property specifies whether or not requests will be propagated upstream.

0

Accepts boolean values (0 or 1).

EnablePrefetching (EnablePrefetching)

This property specifies whether or not pre-fetching of blocks of the next level will be enabled.

0

Accepts boolean values (0 or 1).

Level (Level)

Set maximum slice resolution.

0

OffSet (OffSet)

Sets the offset from the origin of the dataset.

1.0

Normal (Normal)

This property sets the normal of the slice.

0

The value(s) is an enumeration of the following:

  • X-Normal (1)
  • Y-Normal (2)
  • Z-Normal (3)

Slice Along PolyLine

Slice along the surface defined by sweeping a polyline parallel to the z-axis. The Slice Along PolyLine filter is similar to the Slice Filter except that it slices along a surface that is defined by sweeping the input polyline parallel to the z-axis. Explained another way: take a laser cutter and move it so that it hits every point on the input polyline while keeping it parallel to the z-axis. The surface cut from the input dataset is the result.


Property Description Default Value(s) Restrictions
Dataset (Dataset)

Set the vtkDataObject to slice.

Accepts input of following types:

  • vtkDataSet
PolyLine (PolyLine)

Set the polyline to slice along.

Accepts input of following types:

  • vtkPolyData
Tolerance (Tolerance)

The threshold used internally to determine correspondence between the polyline and the output slice. If the output has sections missing, increasing this value may help.

10


Slice Generic Dataset

This filter cuts a data set with a plane or sphere. Cutting is similar to a contour. It creates surfaces from volumes and lines from surfaces. The Generic Cut filter extracts the portion of the input data set that lies along the specified plane or sphere. From the Cut Function menu, you can select whether cutting will be performed with a plane or a sphere. The appropriate 3D widget (plane widget or sphere widget) will be displayed. The parameters of the cut function can be specified interactively using the 3D widget or manually using the traditional user interface controls. Instructions for using these 3D widgets and their corresponding user interfaces are found in section 7.4. By default, the cut lies on the specified plane or sphere. Using the Cut Offset Values portion of the interface, it is also possible to cut the data set at some offset from the original cut function. The Cut Offset Values are in the spatial units of the data set. To add a single offset, select the value from the New Value slider in the Add value portion of the interface and click the Add button, or press Enter. To instead add several evenly spaced offsets, use the controls in the Generate range of values section. Select the number of offsets to generate using the Number of Values slider. The Range slider controls the interval in which to generate the offsets. Once the number of values and range have been selected, click the Generate button. The new offsets will be added to the Offset Values list. To delete a value from the Cut Offset Values list, select the value and click the Delete button. (If no value is selected, the last value in the list will be removed.) Clicking the Delete All button removes all the values in the list. The Generic Cut filter takes a generic dataset as input. Use the Input menu to choose a data set to cut. The output of this filter is polygonal data.

Property Description Default Value(s) Restrictions
Input (Input)

Set the input to the Generic Cut filter.

Accepts input of following types:

  • vtkGenericDataSet
Cut Type (CutFunction)

Set the parameters to the implicit function used for cutting.

The value can be one of the following:

  • Plane (implicit_functions)
  • Box (implicit_functions)
  • Sphere (implicit_functions)
InputBounds (InputBounds)
Slice Offset Values (ContourValues)

The values in this property specify a list of current offset values. This can be used to create multiple slices with different centers. Each entry represents a new slice with its center shifted by the offset value.

The values must lie within the range from minus to plus the length of the dataset's diagonal.
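
Each offset value shifts the cut away from the base cut function. For a plane, a point lies on the offset slice where the plane's implicit function equals the offset value. The following is a minimal plain-Python sketch of that idea (not ParaView code; the helper names are hypothetical):

```python
# Signed distance of a point from a plane given by an origin and a unit normal.
def plane_function(point, origin, normal):
    return sum((point[i] - origin[i]) * normal[i] for i in range(3))

# A point belongs to the slice at a given offset when the implicit function
# evaluates to that offset (within a tolerance).
def on_offset_slice(point, origin, normal, offset, tol=1e-9):
    return abs(plane_function(point, origin, normal) - offset) <= tol
```

With several offsets in the list, the filter produces one such shifted slice per offset value.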


Smooth

This filter smooths a polygonal surface by iteratively moving points toward their neighbors. The Smooth filter operates on a polygonal data set by iteratively adjusting the position of the points using Laplacian smoothing. (Because this filter only adjusts point positions, the output data set is also polygonal.) This results in better-shaped cells and more evenly distributed points. The Convergence slider limits the maximum motion of any point. It is expressed as a fraction of the length of the diagonal of the bounding box of the data set. If the maximum point motion during a smoothing iteration is less than the Convergence value, the smoothing operation terminates.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Smooth filter.

Accepts input of following types:

  • vtkPolyData
Number of Iterations (NumberOfIterations)

This property sets the maximum number of smoothing iterations to perform. More iterations produce better smoothing.

20

Convergence (Convergence)

The value of this property limits the maximum motion of any point. It is expressed as a fraction of the length of the diagonal of the bounding box of the input dataset. If the maximum point motion during a smoothing iteration is less than the value of this property, the smoothing operation terminates.

0.0
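
The iteration described above can be sketched in plain Python (this is an illustration, not the VTK implementation; the mesh representation as an adjacency list and the 0.5 relaxation factor are assumptions for the example):

```python
# One Laplacian smoothing loop: each point moves toward the average of its
# neighbors, and the loop stops when the largest motion in an iteration falls
# below a fraction (the convergence value) of the bounding-box diagonal.
def laplacian_smooth(points, neighbors, max_iterations=20, convergence=0.0):
    pts = [list(p) for p in points]
    dims = len(pts[0])
    mins = [min(p[d] for p in pts) for d in range(dims)]
    maxs = [max(p[d] for p in pts) for d in range(dims)]
    diagonal = sum((maxs[d] - mins[d]) ** 2 for d in range(dims)) ** 0.5
    threshold = convergence * diagonal
    for _ in range(max_iterations):
        max_motion = 0.0
        new_pts = []
        for i, p in enumerate(pts):
            nbrs = neighbors[i]
            if not nbrs:                 # points without neighbors stay fixed
                new_pts.append(p)
                continue
            avg = [sum(pts[j][d] for j in nbrs) / len(nbrs) for d in range(dims)]
            moved = [p[d] + 0.5 * (avg[d] - p[d]) for d in range(dims)]
            motion = sum((moved[d] - p[d]) ** 2 for d in range(dims)) ** 0.5
            max_motion = max(max_motion, motion)
            new_pts.append(moved)
        pts = new_pts
        if max_motion < threshold:       # convergence termination
            break
    return pts
```

Smoothing a zig-zag polyline with fixed endpoints pulls the interior points toward the line between the endpoints, which is the "better-shaped cells" effect described above.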


StreakLine

Trace Streak lines through time in a vector field. The Particle Trace filter generates pathlines in a vector field from a collection of seed points. The vector field used is selected from the Vectors menu, so the input data set is required to have point-centered vectors. The Seed portion of the interface allows you to select whether the seed points for this integration lie in a point cloud or along a line. Depending on which is selected, the appropriate 3D widget (point or line widget) is displayed along with traditional user interface controls for positioning the point cloud or line within the data set. Instructions for using the 3D widgets and the corresponding manual controls can be found in section 7.4. This filter operates on any type of data set, provided it has point-centered vectors. The output is polygonal data containing polylines. This filter is available on the Toolbar.

Property Description Default Value(s) Restrictions
Input (Input)

Specify the input of the StreamTracer filter.

Accepts input of following types:

  • vtkDataObject

The dataset must contain a field array (point)

with 3 component(s).

StaticSeeds (StaticSeeds)

If the input seeds are not changing, then this can be set to 1 to avoid having to do a repeated grid search that would return the exact same result.

0

Accepts boolean values (0 or 1).

StaticMesh (StaticMesh)

If the input grid is not changing, then this can be set to 1 to avoid having to create cell locators for each update.

0

Accepts boolean values (0 or 1).

Seed Source (Source)

Specify the seed dataset, typically the location from which the vector field integration should begin: usually a point/radius or a line with a given resolution.

Accepts input of following types:

  • vtkDataSet
TerminationTime (TerminationTime)

Setting TerminationTime to a positive value will cause particles to terminate when the time is reached. The units of time should be consistent with the primary time variable.

0.0

TimestepValues (TimestepValues)
ForceReinjectionEveryNSteps (ForceReinjectionEveryNSteps)

When animating particles, it is nice to inject new ones every Nth step to produce a continuous flow. Setting ForceReinjectionEveryNSteps to a non-zero value will cause the particle source to reinject particles every Nth step even if it is otherwise unchanged. Note that if the particle source is also animated, this flag is redundant, as the particles will be reinjected whenever the source changes anyway.

1

SelectInputVectors (SelectInputVectors)

Specify which vector array should be used for the integration.

An array of vectors is required.

ComputeVorticity (ComputeVorticity)

Compute vorticity and angular rotation of particles as they progress.

1

Accepts boolean values (0 or 1).

DisableResetCache (DisableResetCache)

Prevents the cache from being reset so that new computations always start from previous results.

0

Accepts boolean values (0 or 1).

Stream Tracer

Integrate streamlines in a vector field. The Stream Tracer filter generates streamlines in a vector field from a collection of seed points. Production of streamlines terminates if a streamline crosses the exterior boundary of the input dataset. Other reasons for termination are listed for the MaximumNumberOfSteps, TerminalSpeed, and MaximumPropagation properties. This filter operates on any type of dataset, provided it has point-centered vectors. The output is polygonal data containing polylines.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Stream Tracer filter.

Accepts input of following types:

  • vtkDataSet

The dataset must contain a field array (any)

with 3 component(s).

Vectors (SelectInputVectors)

This property contains the name of the vector array from which to generate streamlines.

An array of vectors is required.

InterpolatorType (InterpolatorType)

This property determines which interpolator to use for evaluating the velocity vector field. The first is faster though the second is more robust in locating cells during streamline integration.

0

The value(s) is an enumeration of the following:

  • Interpolator with Point Locator (0)
  • Interpolator with Cell Locator (1)
Surface Streamlines (Surface Streamlines)

Specify whether or not to compute surface streamlines.

0

Accepts boolean values (0 or 1).

IntegrationDirection (IntegrationDirection)

This property determines in which direction(s) a streamline is generated.

2

The value(s) is an enumeration of the following:

  • FORWARD (0)
  • BACKWARD (1)
  • BOTH (2)
IntegratorType (IntegratorType)

This property determines which integrator (with increasing accuracy) to use for creating streamlines.

2

The value(s) is an enumeration of the following:

  • Runge-Kutta 2 (0)
  • Runge-Kutta 4 (1)
  • Runge-Kutta 4-5 (2)
Integration Step Unit (IntegrationStepUnit)

This property specifies the unit for Minimum/Initial/Maximum integration step size. The Length unit refers to the arc length that a particle travels/advects within a single step. The Cell Length unit represents the step size as a number of cells.

2

The value(s) is an enumeration of the following:

  • Length (1)
  • Cell Length (2)
Initial Step Length (InitialIntegrationStep)

This property specifies the initial integration step size. For non-adaptive integrators (Runge-Kutta 2 and Runge-Kutta 4), it is fixed (always equal to this initial value) throughout the integration. For an adaptive integrator (Runge-Kutta 4-5), the actual step size varies such that the numerical error is less than a specified threshold.

0.2

Minimum Step Length (MinimumIntegrationStep)

When using the Runge-Kutta 4-5 integrator, this property specifies the minimum integration step size.

0.01

Maximum Step Length (MaximumIntegrationStep)

When using the Runge-Kutta 4-5 integrator, this property specifies the maximum integration step size.

0.5

Maximum Steps (MaximumNumberOfSteps)

This property specifies the maximum number of steps, beyond which streamline integration is terminated.

2000

Maximum Streamline Length (MaximumPropagation)

This property specifies the maximum streamline length (i.e., physical arc length), beyond which line integration is terminated.

1.0

The value must be less than the largest dimension of the dataset multiplied by a scale factor of 1.0.

Terminal Speed (TerminalSpeed)

This property specifies the terminal speed, below which particle advection/integration is terminated.

0.000000000001

MaximumError (MaximumError)

This property specifies the maximum error (for Runge-Kutta 4-5) tolerated throughout streamline integration. The Runge-Kutta 4-5 integrator tries to adjust the step size such that the estimated error is less than this threshold.

0.000001

ComputeVorticity (ComputeVorticity)

Specify whether or not to compute vorticity.

1

Accepts boolean values (0 or 1).

Seed Type (Source)

The value of this property determines how the seeds for the streamlines will be generated.

The value can be one of the following:

  • PointSource (extended_sources)
  • HighResLineSource (extended_sources)
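
The integration and termination rules above can be sketched in plain Python. This is an illustration, not the vtkStreamTracer implementation: it uses a fixed-step Runge-Kutta 4 integrator and an analytic velocity function, and it shows the MaximumNumberOfSteps, MaximumPropagation (arc length), and TerminalSpeed termination criteria:

```python
# Trace a single streamline from a seed point through a velocity field.
def trace_streamline(velocity, seed, step=0.2, max_steps=2000,
                     max_length=1.0, terminal_speed=1e-12):
    def rk4(p, h):                       # one classical Runge-Kutta 4 step
        k1 = velocity(p)
        k2 = velocity([p[i] + 0.5 * h * k1[i] for i in range(len(p))])
        k3 = velocity([p[i] + 0.5 * h * k2[i] for i in range(len(p))])
        k4 = velocity([p[i] + h * k3[i] for i in range(len(p))])
        return [p[i] + h / 6.0 * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i])
                for i in range(len(p))]
    points, length = [list(seed)], 0.0
    for _ in range(max_steps):           # MaximumNumberOfSteps termination
        p = points[-1]
        speed = sum(v * v for v in velocity(p)) ** 0.5
        if speed < terminal_speed:       # TerminalSpeed termination
            break
        q = rk4(p, step)
        length += sum((q[i] - p[i]) ** 2 for i in range(len(p))) ** 0.5
        points.append(q)
        if length >= max_length:         # MaximumPropagation termination
            break
    return points
```

For the rotational field v = (-y, x), the traced points stay on the unit circle until the accumulated arc length reaches the maximum propagation.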


Stream Tracer For Generic Datasets

Integrate streamlines in a vector field. The Generic Stream Tracer filter generates streamlines in a vector field from a collection of seed points. The vector field used is selected from the Vectors menu, so the input data set is required to have point-centered vectors. The Seed portion of the interface allows you to select whether the seed points for this integration lie in a point cloud or along a line. Depending on which is selected, the appropriate 3D widget (point or line widget) is displayed along with traditional user interface controls for positioning the point cloud or line within the data set. Instructions for using the 3D widgets and the corresponding manual controls can be found in section 7.4. The Max. Propagation entry box allows you to specify the maximum length of the streamlines. From the Max. Propagation menu, you can select the units to be either Time (the time a particle would travel with steady flow) or Length (in the data set's spatial coordinates). The Init. Step Len. menu and entry specify the initial step size for integration. (For non-adaptive integrators, Runge-Kutta 2 and 4, the initial step size is used throughout the integration.) The menu allows you to specify the units. Time and Length have the same meaning as for Max. Propagation. Cell Length specifies the step length as a number of cells. The Integration Direction menu determines in which direction(s) the stream trace will be generated: FORWARD, BACKWARD, or BOTH. The Integrator Type section of the interface determines which calculation to use for integration: Runge-Kutta 2, Runge-Kutta 4, or Runge-Kutta 4-5. If Runge-Kutta 4-5 is selected, controls are displayed for specifying the minimum and maximum step length and the maximum error. The controls for specifying Min. Step Len. and Max. Step Len. are the same as those for Init. Step Len. The Runge-Kutta 4-5 integrator tries to choose the step size so that the estimated error is less than the value of the Maximum Error entry.
If the integration takes more than Max. Steps to complete, if the speed goes below Term. Speed, if Max. Propagation is reached, or if a boundary of the input data set is crossed, integration terminates. This filter operates on any type of data set, provided it has point-centered vectors. The output is polygonal data containing polylines.

Property Description Default Value(s) Restrictions
Input (Input)

Set the input to the Generic Stream Tracer filter.

Accepts input of following types:

  • vtkGenericDataSet

The dataset must contain a field array (point)

with 3 component(s).

Seed Type (Source)

The value of this property determines how the seeds for the streamlines will be generated.

The value can be one of the following:

  • PointSource (extended_sources)
  • HighResLineSource (extended_sources)
Vectors (SelectInputVectors)

This property contains the name of the vector array from which to generate streamlines.

An array of vectors is required.

MaximumPropagation (MaximumPropagation)

Specify the maximum streamline length.

1.0

The value must be less than the largest dimension of the dataset multiplied by a scale factor of 1.0.

InitialIntegrationStep (InitialIntegrationStep)

Specify the initial integration step.

0.5

IntegrationDirection (IntegrationDirection)

This property determines in which direction(s) a streamline is generated.

2

The value(s) is an enumeration of the following:

  • FORWARD (0)
  • BACKWARD (1)
  • BOTH (2)
IntegratorType (IntegratorType)

This property determines which integrator (with increasing accuracy) to use for creating streamlines.

2

The value(s) is an enumeration of the following:

  • Runge-Kutta 2 (0)
  • Runge-Kutta 4 (1)
  • Runge-Kutta 4-5 (2)
MaximumError (MaximumError)

Set the maximum error allowed in the integration. The meaning of this value depends on the integrator chosen.

0.000001

MinimumIntegrationStep (MinimumIntegrationStep)

Specify the minimum integration step.

0.01

IntegrationStepUnit (IntegrationStepUnit)

Choose the unit to use for the integration step.

2

The value(s) is an enumeration of the following:

  • Time (0)
  • Length (1)
  • Cell Length (2)
MaximumIntegrationStep (MaximumIntegrationStep)

Specify the maximum integration step.

0.01

MaximumNumberOfSteps (MaximumNumberOfSteps)

Specify the maximum number of steps used in the integration.

2000

TerminalSpeed (TerminalSpeed)

If at any point the speed is below this value, the integration is terminated.

0.000000000001


Stream Tracer With Custom Source

Integrate streamlines in a vector field. The Stream Tracer filter generates streamlines in a vector field from a collection of seed points. Production of streamlines terminates if a streamline crosses the exterior boundary of the input dataset. Other reasons for termination are listed for the MaximumNumberOfSteps, TerminalSpeed, and MaximumPropagation properties. This filter operates on any type of dataset, provided it has point-centered vectors. The output is polygonal data containing polylines. This filter takes a Source input that provides the seed points.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Stream Tracer filter.

Accepts input of following types:

  • vtkDataSet

The dataset must contain a field array (point)

with 3 component(s).

Vectors (SelectInputVectors)

This property contains the name of the vector array from which to generate streamlines.

An array of vectors is required.

Surface Streamlines (Surface Streamlines)

Specify whether or not to compute surface streamlines.

0

Accepts boolean values (0 or 1).

IntegrationDirection (IntegrationDirection)

This property determines in which direction(s) a streamline is generated.

2

The value(s) is an enumeration of the following:

  • FORWARD (0)
  • BACKWARD (1)
  • BOTH (2)
IntegratorType (IntegratorType)

This property determines which integrator (with increasing accuracy) to use for creating streamlines.

2

The value(s) is an enumeration of the following:

  • Runge-Kutta 2 (0)
  • Runge-Kutta 4 (1)
  • Runge-Kutta 4-5 (2)
Integration Step Unit (IntegrationStepUnit)

This property specifies the unit for Minimum/Initial/Maximum integration step size. The Length unit refers to the arc length that a particle travels/advects within a single step. The Cell Length unit represents the step size as a number of cells.

2

The value(s) is an enumeration of the following:

  • Length (1)
  • Cell Length (2)
Initial Step Length (InitialIntegrationStep)

This property specifies the initial integration step size. For non-adaptive integrators (Runge-Kutta 2 and Runge-Kutta 4), it is fixed (always equal to this initial value) throughout the integration. For an adaptive integrator (Runge-Kutta 4-5), the actual step size varies such that the numerical error is less than a specified threshold.

0.2

Minimum Step Length (MinimumIntegrationStep)

When using the Runge-Kutta 4-5 integrator, this property specifies the minimum integration step size.

0.01

Maximum Step Length (MaximumIntegrationStep)

When using the Runge-Kutta 4-5 integrator, this property specifies the maximum integration step size.

0.5

Maximum Steps (MaximumNumberOfSteps)

This property specifies the maximum number of steps, beyond which streamline integration is terminated.

2000

Maximum Streamline Length (MaximumPropagation)

This property specifies the maximum streamline length (i.e., physical arc length), beyond which line integration is terminated.

1.0

The value must be less than the largest dimension of the dataset multiplied by a scale factor of 1.0.

Terminal Speed (TerminalSpeed)

This property specifies the terminal speed, below which particle advection/integration is terminated.

0.000000000001

MaximumError (MaximumError)

This property specifies the maximum error (for Runge-Kutta 4-5) tolerated throughout streamline integration. The Runge-Kutta 4-5 integrator tries to adjust the step size such that the estimated error is less than this threshold.

0.000001

ComputeVorticity (ComputeVorticity)

Specify whether or not to compute vorticity.

1

Accepts boolean values (0 or 1).

Seed Source (Source)

This property specifies the input used to obtain the seed points.


Subdivide

This filter iteratively divides triangles into four smaller triangles. New points are placed linearly so the output surface matches the input surface. The Subdivide filter iteratively divides each triangle in the input dataset into 4 new triangles. Three new points are added per triangle -- one at the midpoint of each edge. This filter operates only on polygonal data containing triangles, so run your polygonal data through the Triangulate filter first if it is not composed of triangles. The output of this filter is also polygonal.

Property Description Default Value(s) Restrictions
Input (Input)

This parameter specifies the input to the Subdivide filter.

Accepts input of following types:

  • vtkPolyData
Number of Subdivisions (NumberOfSubdivisions)

The value of this property specifies the number of subdivision iterations to perform.

1
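
The subdivision scheme described above can be sketched in a few lines of plain Python (an illustration, not the VTK implementation; triangles are represented as vertex-coordinate triples, and shared midpoints are not deduplicated here):

```python
# Split every triangle into four by inserting a point at each edge midpoint.
# After n iterations, each input triangle becomes 4**n output triangles.
def subdivide(triangles, iterations=1):
    def midpoint(a, b):
        return tuple((a[i] + b[i]) / 2.0 for i in range(3))
    for _ in range(iterations):
        result = []
        for a, b, c in triangles:
            ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
            # Three corner triangles plus the central triangle of midpoints.
            result += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
        triangles = result
    return triangles
```

Because midpoints lie on the original edges, the subdivided surface coincides with the input surface, as the description states.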


Surface Flow

This filter integrates flow through a surface. The flow integration filter integrates the dot product of a point flow vector field and the surface normal. It computes the net flow across the 2D surface. It operates on any type of dataset and produces an unstructured grid output.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Surface Flow filter.

Accepts input of following types:

  • vtkDataSet

The dataset must contain a field array (point)

with 3 component(s).

SelectInputVectors (SelectInputVectors)

The value of this property specifies the name of the input vector array containing the flow vector field.

An array of vectors is required.
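
The integral described above can be approximated over a triangulated surface as the sum of dot(v, n) times the triangle area. A minimal plain-Python sketch (not the VTK implementation; it evaluates the flow field at triangle centroids as a stand-in for the point-centered array):

```python
# Net flow through a triangulated surface: sum of dot(v, n) * area.
def surface_flow(triangles, velocity):
    def sub(a, b):
        return [a[i] - b[i] for i in range(3)]
    def cross(u, v):
        return [u[1] * v[2] - u[2] * v[1],
                u[2] * v[0] - u[0] * v[2],
                u[0] * v[1] - u[1] * v[0]]
    total = 0.0
    for a, b, c in triangles:
        n = cross(sub(b, a), sub(c, a))   # magnitude equals 2 * triangle area
        centroid = [(a[i] + b[i] + c[i]) / 3.0 for i in range(3)]
        v = velocity(centroid)
        total += 0.5 * sum(v[i] * n[i] for i in range(3))
    return total
```

For a unit square in the z = 0 plane and a uniform flow (0, 0, 1), the net flow is 1.0, matching area times normal velocity.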

Surface Vectors

This filter constrains vectors to lie on a surface. The Surface Vectors filter is used for 2D data sets. It constrains vectors to lie in a surface by removing components of the vectors normal to the local surface.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Surface Vectors filter.

Accepts input of following types:

  • vtkDataSet

The dataset must contain a field array (point)

with 3 component(s).

SelectInputVectors (SelectInputVectors)

This property specifies the name of the input vector array to process.

An array of vectors is required.

ConstraintMode (ConstraintMode)

This property specifies whether the vectors will be parallel or perpendicular to the surface. If the value is set to PerpendicularScale (2), then the output will contain a scalar array with the dot product of the surface normal and the vector at each point.

0

The value(s) is an enumeration of the following:

  • Parallel (0)
  • Perpendicular (1)
  • PerpendicularScale (2)
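
The three constraint modes listed above reduce to simple vector projections. A plain-Python sketch for a single vector v and a unit surface normal n (an illustration, not the VTK implementation):

```python
# Parallel (0): remove the normal component so the vector lies in the surface.
# Perpendicular (1): keep only the component along the normal.
# PerpendicularScale (2): return dot(v, n) as a scalar.
def constrain_vector(v, n, mode):
    dot = sum(v[i] * n[i] for i in range(3))
    if mode == 0:
        return [v[i] - dot * n[i] for i in range(3)]
    if mode == 1:
        return [dot * n[i] for i in range(3)]
    return dot
```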

Table FFT

Performs the Fast Fourier Transform on the columns of a table.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input of the filter.

Accepts input of following types:

  • vtkTable

The dataset must contain a field array (row)

with 1 component(s).
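
Conceptually, the filter treats each column of the table as a signal and transforms it. A plain-Python sketch of a direct DFT on one column (an illustration only; vtkTableFFT's exact output layout is not reproduced here):

```python
import cmath

# Direct discrete Fourier transform of one table column.
def dft_column(values):
    n = len(values)
    return [sum(values[k] * cmath.exp(-2j * cmath.pi * j * k / n)
                for k in range(n))
            for j in range(n)]
```

A constant column concentrates all of its energy in the zero-frequency bin.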


Table To Points

Converts a table to a set of points. The TableToPolyData filter converts a vtkTable to a set of points in a vtkPolyData. One must specify the columns in the input table to use as the X, Y and Z coordinates for the points in the output.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input.

Accepts input of following types:

  • vtkTable

The dataset must contain a field array (row)

with 1 component(s).

XColumn (XColumn)

This property specifies which data array is going to be used as the X coordinate in the generated polydata dataset.

YColumn (YColumn)

This property specifies which data array is going to be used as the Y coordinate in the generated polydata dataset.

ZColumn (ZColumn)

This property specifies which data array is going to be used as the Z coordinate in the generated polydata dataset.

2D Points (Create2DPoints)

Specify whether the points of the polydata are 3D or 2D. If this is set to true then the Z Column will be ignored and the z value of each point on the polydata will be set to 0. By default this will be off.

0

Accepts boolean values (0 or 1).

KeepAllDataArrays (KeepAllDataArrays)

Allow user to keep columns specified as X,Y,Z as Data arrays. By default this will be off.

0

Accepts boolean values (0 or 1).
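
The column-to-coordinate mapping described above is straightforward; here is a plain-Python sketch (not the VTK implementation; the table is represented as a dict of column-name to value-list, which is an assumption for the example):

```python
# Build (x, y, z) points from three named table columns. When create_2d is
# true, the Z column is ignored and z is set to 0, mirroring the 2D Points
# property above.
def table_to_points(table, x_column, y_column, z_column, create_2d=False):
    n = len(table[x_column])
    return [(table[x_column][i],
             table[y_column][i],
             0.0 if create_2d else table[z_column][i])
            for i in range(n)]
```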

Table To Structured Grid

Converts a table to a structured grid. The TableToStructuredGrid filter converts a vtkTable to a vtkStructuredGrid. One must specify the columns in the input table to use as the X, Y and Z coordinates for the points in the output, as well as the whole extent.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input.

Accepts input of following types:

  • vtkTable

The dataset must contain a field array (row)

with 1 component(s).

WholeExtent (WholeExtent)

This property specifies the whole extent of the output structured grid.

0 0 0 0 0 0

XColumn (XColumn)

This property specifies which data array is going to be used as the X coordinate in the generated structured grid.

YColumn (YColumn)

This property specifies which data array is going to be used as the Y coordinate in the generated structured grid.

ZColumn (ZColumn)

This property specifies which data array is going to be used as the Z coordinate in the generated structured grid.


Temporal Cache

Saves a copy of the data set for a fixed number of time steps. The Temporal Cache can be used to save multiple copies of a data set at different time steps to prevent thrashing in the pipeline caused by downstream filters that adjust the requested time step. For example, assume that there is a downstream Temporal Interpolator filter. This filter will (usually) request two time steps from the upstream filters, which in turn (usually) causes the upstream filters to run twice, once for each time step. The next time the interpolator requests the same two time steps, they might force the upstream filters to re-evaluate the same two time steps. The Temporal Cache can keep copies of both of these time steps and provide the requested data without having to run upstream filters.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input of the Temporal Cache filter.

Accepts input of following types:

  • vtkDataObject
CacheSize (CacheSize)

The cache size determines the number of time steps that can be cached at one time. The maximum number is 10. The minimum is 2 (since it makes little sense to cache less than that).

2

TimestepValues (TimestepValues)
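
The caching behaviour described above can be sketched as a small bounded cache in plain Python (an illustration only; the class name and the least-recently-used eviction policy are assumptions for the example, not a description of vtkTemporalDataSetCache internals):

```python
from collections import OrderedDict

# Keep at most cache_size time steps; a cache hit avoids re-running the
# upstream pipeline, a miss computes the data and may evict the
# least-recently-used entry.
class TemporalCache:
    def __init__(self, cache_size=2, compute=None):
        self.cache_size = max(2, min(10, cache_size))  # clamp to [2, 10]
        self.compute = compute          # upstream evaluation, run only on a miss
        self.store = OrderedDict()
        self.misses = 0

    def request(self, t):
        if t in self.store:
            self.store.move_to_end(t)   # hit: mark as most recently used
            return self.store[t]
        self.misses += 1
        data = self.compute(t)
        self.store[t] = data
        if len(self.store) > self.cache_size:
            self.store.popitem(last=False)   # evict the oldest entry
        return data
```

With a cache size of 2, alternating requests for the same two time steps run the upstream computation only twice, which is exactly the interpolator scenario described above.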


Temporal Interpolator

Interpolate between time steps. The Temporal Interpolator converts data that is defined at discrete time steps to one that is defined over a continuum of time by linearly interpolating the data's field data between two adjacent time steps. The interpolated values are a simple approximation and should not be interpreted as anything more. The Temporal Interpolator assumes that the topology between adjacent time steps does not change.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input of the Temporal Interpolator.

Accepts input of following types:

  • vtkDataObject
DiscreteTimeStepInterval (DiscreteTimeStepInterval)

If Discrete Time Step Interval is set to 0, then the Temporal Interpolator will provide a continuous region of time on its output. If set to anything else, then the output will define a finite set of time points on its output, each spaced by the Discrete Time Step Interval. The output will have (time range)/(discrete time step interval) time steps. (Note that the time range is defined by the time range of the data of the input filter, which may be different from other pipeline objects or the range defined in the animation inspector.) This option is useful if you have a dataset with one missing time step and wish to 'fill in' the missing data with an interpolated value from the steps on either side.

0.0

TimestepValues (TimestepValues)
TimeRange (TimeRange)
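
The linear interpolation described above amounts to blending the field arrays of the two time steps that bracket the requested time. A plain-Python sketch (an illustration, not the VTK implementation; it assumes matching topology, as the filter itself does):

```python
# Linearly interpolate a field array between two bracketing time steps.
def interpolate_field(t, t0, field0, t1, field1):
    if not t0 <= t <= t1:
        raise ValueError("requested time outside the bracketing steps")
    w = (t - t0) / (t1 - t0)
    return [(1.0 - w) * a + w * b for a, b in zip(field0, field1)]
```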


Temporal Particles To Pathlines

Creates polylines representing pathlines of animating particles. Particle Pathlines takes any dataset as input and extracts the point locations of all cells over time to build up a polyline trail. The point number (index) is used as the 'key'; if the points are randomly changing their respective order in the points list, then you should specify a scalar that represents the unique ID. This is intended to handle the output of a filter such as the TemporalStreamTracer.


Property Description Default Value(s) Restrictions
Input (Input)

The input cells to create pathlines for.

Accepts input of following types:

  • vtkPointSet

The dataset must contain a field array (point)

Selection (Selection)

Set a second input, which is a selection. Particles with the same Id in the selection as in the primary input will be chosen for pathlines. Note that you must have the same IdChannelArray in the selection as in the input.

Accepts input of following types:

  • vtkDataSet
MaskPoints (MaskPoints)

Set the number of particles to track as a ratio of the input. Example: setting MaskPoints to 10 will track every 10th point.

100

MaxTrackLength (MaxTrackLength)

If the particles being traced animate for a long time, the trails or traces will become long and stringy. Setting the MaxTrackLength will limit how much of the trace is displayed. Tracks longer than the Max will disappear and the trace will appear like a snake of fixed length which progresses as the particle moves. This length is given with respect to timesteps.

25

MaxStepDistance (MaxStepDistance)

If a particle disappears from one end of a simulation and reappears on the other side, the track left will be unrepresentative. Set a MaxStepDistance{x,y,z} which acts as a threshold above which if a step occurs larger than the value (for the dimension), the track will be dropped and restarted after the step (i.e., the part before the wrap-around will be dropped and the newer part kept).

1.0 1.0 1.0

IdChannelArray (IdChannelArray)

Specify the name of a scalar array which will be used to fetch the index of each point. This is necessary only if the particles change position (Id order) on each time step. The Id can be used to identify particles at each step and hence track them properly. If this array is set to "Global or Local IDs", the global point ids are used if they exist, or the point index otherwise.

Global or Local IDs

An array of scalars is required.

Temporal Shift Scale

Shift and scale time values. The Temporal Shift Scale filter linearly transforms the time values of a pipeline object by applying a scale and then a shift. Given data at time t on the input, it will be transformed to time t*Scale + Shift on the output. Inversely, if this filter receives a request for time t, it will request time (t-Shift)/Scale on its input.

Property Description Default Value(s) Restrictions
Input (Input)

The input to the Temporal Shift Scale filter.

Accepts input of following types:

  • vtkDataObject
PreShift (PreShift)

Apply a translation to the data before scaling. To convert T{5,100} to T{0,1}, use PreShift=-5, Scale=1/95, PostShift=0. To convert T{5,105} to T{5,10}, use PreShift=-5, Scale=5/100, PostShift=5.

0.0

PostShift (PostShift)

The amount of time the input is shifted.

0.0

Scale (Scale)

The factor by which the input time is scaled.

1.0

Periodic (Periodic)

If Periodic is true, requests for time will be wrapped around so that the source appears to be a periodic time source. If data exists for times {0,N-1}, setting Periodic to true will cause time 0 to be produced when time N, 2N, 3N, etc. is requested. This effectively gives the source the ability to generate time data indefinitely in a loop. When combined with Shift/Scale, the time becomes periodic in the shifted and scaled time frame of reference. Note: since the input time may not start at zero, the wrapping of time from the end of one period to the start of the next will subtract the initial time - a source with T{5..6} repeated periodically will have output time {5..6..7..8} etc.

0

Accepts boolean values (0 or 1).

PeriodicEndCorrection (PeriodicEndCorrection)

If Periodic time is enabled, this flag determines if the last time step is the same as the first. If PeriodicEndCorrection is true, then it is assumed that the input data goes from 0-1 (or whatever the scaled/shifted actual time is) and time 1 is the same as time 0, so that steps will be 0,1,2,3...N,1,2,3...N,1,2,3, where step N is the same as 0 and step 0 is not repeated. When this flag is false, the data is assumed to be literal and the output is of the form 0,1,2,3...N,0,1,2,3... By default this flag is on.

1

Accepts boolean values (0 or 1).
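
The step sequences described for PeriodicEndCorrection can be illustrated with a small Python sketch (step indices only; not ParaView code):

```python
def periodic_steps(steps, n_periods, end_correction=True):
    """Report the time-step sequence over several periods.

    With end correction, the last step of one period coincides with the
    first step of the next, so it is not repeated: 0,1,...,N,1,...,N,...
    Without it, every period repeats the full list: 0,1,...,N,0,1,...,N.
    """
    if end_correction:
        return steps + steps[1:] * (n_periods - 1)
    return steps * n_periods
```

For steps [0,1,2,3] over two periods this yields [0,1,2,3,1,2,3] with end correction and [0,1,2,3,0,1,2,3] without, matching the description above.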

MaximumNumberOfPeriods (MaximumNumberOfPeriods)

If Periodic time is enabled, this controls how many periods of time are reported. A filter cannot output an infinite number of time steps, so a finite number of periods is generated when reporting time.

1.0

TimestepValues (TimestepValues)


Temporal Snap-to-Time-Step

Modifies the time range/steps of temporal data. This filter modifies the time range or time steps of the data without changing the data itself. The data is not resampled by this filter; only the information accompanying the data is modified.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input of the filter.

Accepts input of following types:

  • vtkDataObject
SnapMode (SnapMode)

Determine which time step to snap to.

0

The value(s) is an enumeration of the following:

  • Nearest (0)
  • NextBelowOrEqual (1)
  • NextAboveOrEqual (2)
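
The three snap modes can be sketched in plain Python (an illustration of the selection logic, not the filter's implementation):

```python
import bisect

def snap_to_step(t, steps, mode):
    """Snap a requested time onto one of the input's time steps.

    mode: 0 = Nearest, 1 = NextBelowOrEqual, 2 = NextAboveOrEqual.
    Assumes `steps` is sorted and t lies within [steps[0], steps[-1]].
    """
    i = bisect.bisect_right(steps, t)        # first step strictly above t
    below = steps[max(i - 1, 0)]
    above = steps[min(i, len(steps) - 1)]
    if mode == 1:
        return below
    if mode == 2:
        return above if t > below else below
    return below if t - below <= above - t else above
```

For steps [0, 1, 2] and a request at t=1.4, Nearest and NextBelowOrEqual return 1 while NextAboveOrEqual returns 2.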
TimestepValues (TimestepValues)


Temporal Statistics

Loads in all time steps of a data set and computes some statistics about how each point and cell variable changes over time. Given an input that changes over time, vtkTemporalStatistics looks at the data for each time step and computes statistical information about how each point or cell variable changes over time. For example, vtkTemporalStatistics can compute the average value of "pressure" over time at each point. Note that this filter requires the upstream pipeline to be run on every time step it reports it can compute, which may be a time-consuming operation. vtkTemporalStatistics ignores the temporal spacing: each timestep is weighted the same regardless of how long the interval to the next timestep is. Thus, the average statistic may be quite different from an integration of the variable if the time spacing varies.

Property Description Default Value(s) Restrictions
Input (Input)

Set the input to the Temporal Statistics filter.

Accepts input of following types:

  • vtkDataSet
ComputeAverage (ComputeAverage)

Compute the average of each point and cell variable over time.

1

Accepts boolean values (0 or 1).

ComputeMinimum (ComputeMinimum)

Compute the minimum of each point and cell variable over time.

1

Accepts boolean values (0 or 1).

ComputeMaximum (ComputeMaximum)

Compute the maximum of each point and cell variable over time.

1

Accepts boolean values (0 or 1).

ComputeStandardDeviation (ComputeStandardDeviation)

Compute the standard deviation of each point and cell variable over time.

1

Accepts boolean values (0 or 1).
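
The statistics above can be sketched as a streaming computation in plain Python: each timestep is visited once and weighted equally, mirroring the note on temporal spacing (an illustration only, not vtkTemporalStatistics itself):

```python
import math

def temporal_statistics(timesteps):
    """Accumulate per-point average, minimum, maximum, and standard
    deviation across timesteps. Each timestep is a list of point
    values; every timestep is weighted equally."""
    n = 0
    total = sq_total = lo = hi = None
    for values in timesteps:
        if n == 0:
            total = [0.0] * len(values)
            sq_total = [0.0] * len(values)
            lo = list(values)
            hi = list(values)
        for i, v in enumerate(values):
            total[i] += v
            sq_total[i] += v * v
            lo[i] = min(lo[i], v)
            hi[i] = max(hi[i], v)
        n += 1
    avg = [t / n for t in total]
    std = [math.sqrt(max(q / n - a * a, 0.0)) for q, a in zip(sq_total, avg)]
    return avg, lo, hi, std
```

For two timesteps of a one-component point array, e.g. [[1.0, 4.0], [3.0, 4.0]], the first point averages 2.0 with standard deviation 1.0 while the second point is constant.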

Tensor Glyph

This filter generates an ellipsoid, cuboid, cylinder, or superquadric glyph at each point of the input data set. The glyphs are oriented and scaled according to the eigenvalues and eigenvectors of the tensor point data of the input data set. The Tensor Glyph filter operates on any type of data set; its output is polygonal.


Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Glyph filter.

Accepts input of following types:

  • vtkDataSet

The dataset must contain a field array (point)

with 9 component(s).

The dataset must contain a field array (point)

with 1 component(s).

Tensors (SelectInputTensors)

This property indicates the name of the tensor array on which to operate. The indicated array's eigenvalues and eigenvectors are used for scaling and orienting the glyphs.

Glyph Type (Source)

This property determines which type of glyph will be placed at the points in the input dataset.

Accepts input of following types:

  • vtkPolyDataThe value can be one of the following:
  • SphereSource (sources)
  • CylinderSource (sources)
  • CubeSource (sources)
  • SuperquadricSource (sources)
ExtractEigenvalues (ExtractEigenvalues)

Toggle whether to extract eigenvalues from the tensor. If false, eigenvalues/eigenvectors are not extracted, and the columns of the tensor are taken as the eigenvectors (the norm of each column, always positive, is the eigenvalue). If true, the glyph is scaled and oriented according to the eigenvalues and eigenvectors; additionally, the eigenvalues are provided as a new data array.

1

Accepts boolean values (0 or 1).
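
The "ExtractEigenvalues off" branch described above — columns taken as eigenvectors, column norms as eigenvalues — can be illustrated in plain Python:

```python
import math

def column_scales(tensor):
    """With ExtractEigenvalues off, the columns of the 3x3 tensor are
    taken as the eigenvectors, and the (always positive) norm of each
    column serves as the corresponding eigenvalue.
    `tensor` is given row-major as a list of three rows."""
    cols = zip(*tensor)  # transpose: rows -> columns
    return [math.sqrt(sum(c * c for c in col)) for col in cols]
```

For a diagonal tensor diag(2, 3, 4), the glyph scale factors come out as [2, 3, 4].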

ColorGlyphs (ColorGlyphs)

This property determines whether or not to color the glyphs.

1

Accepts boolean values (0 or 1).

Scalars (SelectInputScalars)

This property indicates the name of the scalar array to use for coloring.

1

An array of scalars is required.

Color by (ColorMode)

This property determines whether the input scalars or the computed eigenvalues at each point should be used to color the glyphs. If ThreeGlyphs is set and eigenvalues are chosen for coloring, each glyph is colored by the corresponding eigenvalue; if ThreeGlyphs is not set, the color corresponding to the largest eigenvalue is used.

0

The value(s) is an enumeration of the following:

  • input scalars (0)
  • eigenvalues (1)
ScaleFactor (ScaleFactor)

This property specifies the scale factor to scale every glyph by.

1

LimitScalingByEigenvalues (LimitScalingByEigenvalues)

This property determines whether scaling of glyphs by ScaleFactor times eigenvalue should be limited. This is useful to prevent uncontrolled scaling near singularities.

0

Accepts boolean values (0 or 1).

MaxScaleFactor (MaxScaleFactor)

If scaling by eigenvalues should be limited, this value sets an upper limit for scale factor times eigenvalue.

10

Symmetric (Symmetric)

This property determines whether or not to draw a mirror of each glyph.

0

Accepts boolean values (0 or 1).

ThreeGlyphs (ThreeGlyphs)

Toggle whether to produce three glyphs, each of which is oriented along an eigenvector and scaled according to the corresponding eigenvalue.

0

Accepts boolean values (0 or 1).

Tessellate

Tessellate nonlinear curves, surfaces, and volumes with lines, triangles, and tetrahedra. The Tessellate filter tessellates cells with nonlinear geometry and/or scalar fields into a simplicial complex with linearly interpolated field values that more closely approximate the original field. This is useful for datasets containing quadratic cells.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Tessellate filter.

Accepts input of following types:

  • vtkPolyData
  • vtkDataSet
  • vtkUnstructuredGrid
OutputDimension (OutputDimension)

The value of this property sets the maximum dimensionality of the output tessellation. When the value of this property is 3, 3D cells produce tetrahedra, 2D cells produce triangles, and 1D cells produce line segments. When the value is 2, 3D cells will have their boundaries tessellated with triangles. When the value is 1, all cells except points produce line segments.

3

ChordError (ChordError)

This property controls the maximum chord error allowed at any edge midpoint in the output tessellation. The chord error is measured as the distance between the midpoint of any output edge and the original nonlinear geometry.

1e-3

Field Error (FieldError2)

This property controls the maximum field error allowed at any edge midpoint in the output tessellation. The field error is measured as the difference between the field value at the midpoint of an output edge and the value of the corresponding field in the original nonlinear geometry.

Maximum Number of Subdivisions (MaximumNumberOfSubdivisions)

This property specifies the maximum number of times an edge may be subdivided. Increasing this number allows further refinement but can drastically increase the computational and storage requirements, especially when the value of the OutputDimension property is 3.

3

MergePoints (MergePoints)

If the value of this property is set to 1, coincident vertices will be merged after tessellation has occurred. Only geometry is considered during the merge and the first vertex encountered is the one whose point attributes will be used. Any discontinuities in point fields will be lost. On the other hand, many operations, such as streamline generation, require coincident vertices to be merged. Toggle whether to merge coincident vertices.

1

Accepts boolean values (0 or 1).

Tessellate Generic Dataset

Tessellate a higher-order dataset.

Property Description Default Value(s) Restrictions
Input (Input)

Set the input to the Generic Tessellator filter.

Accepts input of following types:

  • vtkGenericDataSet

Tetrahedralize

This filter converts 3D cells to tetrahedra and polygons to triangles; the output is always of type unstructured grid. The Tetrahedralize filter converts the 3D cells of any type of dataset to tetrahedra and the 2D ones to triangles. This filter always produces unstructured grid output.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Tetrahedralize filter.

Accepts input of following types:

  • vtkDataSet

Texture Map to Cylinder

Generate texture coordinates by mapping points to a cylinder. This is a filter that generates 2D texture coordinates by mapping input dataset points onto a cylinder. The cylinder is generated automatically by computing its axis. Note that the generated s-coordinate ranges from 0 to 1 (corresponding to an angle of 0->360 around the axis), while the mapping of the t-coordinate is controlled by the projection of points along the axis.

Property Description Default Value(s) Restrictions
Input (Input)

Set the input to the Texture Map to Cylinder filter.

Accepts input of following types:

  • vtkDataSet
PreventSeam (PreventSeam)

Control how the texture coordinates are generated. If Prevent Seam is set, the s-coordinate ranges from 0->1 and 1->0 corresponding to the theta angle variation between 0->180 and 180->0 degrees. Otherwise, the s-coordinate ranges from 0->1 between 0->360 degrees.

1

Accepts boolean values (0 or 1).
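
The PreventSeam behavior can be illustrated in plain Python (the angle-to-s mapping only; the filter also computes the cylinder axis and the t-coordinate):

```python
def s_coordinate(theta_degrees, prevent_seam=True):
    """Compute the s texture coordinate from the angle around the
    cylinder axis. With PreventSeam, s runs 0->1 and back to 0 as the
    angle goes 0->180->360; otherwise s runs 0->1 over 0->360."""
    theta = theta_degrees % 360.0
    if prevent_seam:
        return theta / 180.0 if theta <= 180.0 else (360.0 - theta) / 180.0
    return theta / 360.0
```

With PreventSeam, the angles 90 and 270 both map to s = 0.5, so the texture mirrors instead of wrapping through a seam at 360 degrees.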

Texture Map to Plane

Generate texture coordinates by mapping points to a plane. TextureMapToPlane is a filter that generates 2D texture coordinates by mapping input dataset points onto a plane. The plane is generated automatically using a least squares method.

Property Description Default Value(s) Restrictions
Input (Input)

Set the input to the Texture Map to Plane filter.

Accepts input of following types:

  • vtkDataSet

Texture Map to Sphere

Generate texture coordinates by mapping points to a sphere. This is a filter that generates 2D texture coordinates by mapping input dataset points onto a sphere. The sphere is generated automatically by computing its center, i.e., the average of the point coordinates. Note that the generated texture coordinates range between (0,1). The s-coordinate lies in the angular direction around the z-axis, measured counter-clockwise from the x-axis. The t-coordinate lies in the angular direction measured down from the north pole towards the south pole.

Property Description Default Value(s) Restrictions
Input (Input)

Set the input to the Texture Map to Sphere filter.

Accepts input of following types:

  • vtkDataSet
PreventSeam (PreventSeam)

Control how the texture coordinates are generated. If Prevent Seam is set, the s-coordinate ranges from 0->1 and 1->0 corresponding to the theta angle variation between 0->180 and 180->0 degrees. Otherwise, the s-coordinate ranges from 0->1 between 0->360 degrees.

1

Accepts boolean values (0 or 1).

Threshold

This filter extracts cells that have point or cell scalars in the specified range. The Threshold filter extracts the portions of the input dataset whose scalars lie within the specified range. This filter operates on either point-centered or cell-centered data; to select between these two options, select either Point Data or Cell Data from the Attribute Mode menu. It operates on any type of dataset and produces unstructured grid output. Once the Attribute Mode has been selected, choose the scalar array from which to threshold the data from the Scalars menu. The Lower Threshold and Upper Threshold sliders determine the range of the scalars to retain in the output. The All Scalars check box only takes effect when the Attribute Mode is set to Point Data. If the All Scalars option is checked, a cell will only be passed to the output if the scalar values of all of its points lie within the range indicated by the Lower Threshold and Upper Threshold sliders. If unchecked, a cell will be added to the output if the scalar value at any of its points is within the chosen range.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Threshold filter.

Accepts input of following types:

  • vtkDataSet

The dataset must contain a field array ()

with 1 component(s).

Scalars (SelectInputScalars)

The value of this property contains the name of the scalar array from which to perform thresholding.

An array of scalars is required. The value must be a field array name.

Threshold Range (ThresholdBetween)

The values of this property specify the upper and lower bounds of the thresholding operation.

0 0

The value must lie within the range of the selected data array.

AllScalars (AllScalars)

If the value of this property is 1, then a cell is only included in the output if the value of the selected array for all its points is within the threshold. This is only relevant when thresholding by a point-centered array.

1

Accepts boolean values (0 or 1).
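
The AllScalars rule for point-centered arrays can be sketched in plain Python (illustration only):

```python
def keep_cell(point_values, lower, upper, all_scalars=True):
    """Decide whether a cell passes the Threshold filter when
    thresholding by a point-centered array. With AllScalars, every
    point of the cell must fall in [lower, upper]; otherwise a single
    in-range point is enough."""
    in_range = [lower <= v <= upper for v in point_values]
    return all(in_range) if all_scalars else any(in_range)
```

A cell whose point values are [0.5, 1.5] is rejected by the range [0, 1] when AllScalars is on, but accepted when it is off.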

UseContinuousCellRange (UseContinuousCellRange)

If off, the vertex scalars are treated as a discrete set. If on, they are treated as a continuous interval over the minimum and maximum. One important "on" use case: When setting lower and upper threshold equal to some value and turning AllScalars off, the results are cells containing the iso-surface for that value. WARNING: Whether on or off, for higher order input, the filter will not give accurate results.

0

Accepts boolean values (0 or 1).

Transform

This filter applies a transformation to the input dataset. The Transform filter allows you to specify the position, size, and orientation of polygonal, unstructured grid, and curvilinear data sets.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Transform filter.

Accepts input of following types:

  • vtkPointSet
  • vtkImageData
  • vtkRectilinearGrid
Transform (Transform)

The values in this property allow you to specify the transform (translation, rotation, and scaling) to apply to the input dataset.

The value can be one of the following:

  • Transform3 (extended_sources)


Transpose Table

Transpose a table.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the filter.

Accepts input of following types:

  • vtkTable

The dataset must contain a field array (row)

with 1 component(s).

Variables of Interest (SelectArrays)

Choose arrays whose entries will be used to form observations for statistical analysis.

Add a column with original columns name (AddIdColumn)

This flag indicates if a column must be inserted at index 0 with the names (ids) of the input columns.

1

Accepts boolean values (0 or 1).

Use the column with original columns name (UseIdColumn)

This flag indicates if the output columns must be named using the names listed in the index 0 column.

0

Accepts boolean values (0 or 1).

Only extract selected columns (DoNotTranspose)

This flag indicates whether the sub-table is actually transposed or only extracted.

0

Accepts boolean values (0 or 1).
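
The transposition plus AddIdColumn behavior can be sketched in plain Python (an illustration with plain lists, not the vtkTable API):

```python
def transpose_table(column_names, columns, add_id_column=True):
    """Transpose a table given as named columns: each input column
    becomes a row of the output. With AddIdColumn, a column holding
    the original column names is inserted at index 0."""
    if add_id_column:
        return [[name] + list(col) for name, col in zip(column_names, columns)]
    return [list(col) for col in columns]
```

A table with columns a=[1,2] and b=[3,4] transposes to the rows ['a', 1, 2] and ['b', 3, 4].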

Triangle Strips

This filter uses a greedy algorithm to convert triangles into triangle strips. The Triangle Strips filter converts triangles into triangle strips and lines into polylines. This filter operates on polygonal data sets and produces polygonal output.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Triangle Strips filter.

Accepts input of following types:

  • vtkPolyData
MaximumLength (MaximumLength)

This property specifies the maximum number of triangles/lines to include in a triangle strip or polyline.

1000


Triangulate

This filter converts polygons and triangle strips to basic triangles. The Triangulate filter decomposes polygonal data into only triangles, points, and lines. It separates triangle strips and polylines into individual triangles and lines, respectively. The output is polygonal data. Some filters that take polygonal data as input require that the data be composed of triangles rather than other polygons, so passing your data through this filter first is useful in such situations. You should use this filter in these cases rather than the Tetrahedralize filter because they produce different output dataset types: the filters referenced require polygonal input, while the Tetrahedralize filter produces unstructured grid output.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Triangulate filter.

Accepts input of following types:

  • vtkPolyData
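
The two decompositions can be illustrated in plain Python. Fan triangulation assumes a convex polygon and is not necessarily the filter's exact algorithm:

```python
def fan_triangulate(polygon):
    """Decompose a convex polygon (a list of point indices) into
    triangles by fanning out from the first vertex."""
    return [(polygon[0], polygon[i], polygon[i + 1])
            for i in range(1, len(polygon) - 1)]

def strip_to_triangles(strip):
    """Split a triangle strip into individual triangles, flipping
    every other one to keep a consistent winding."""
    tris = []
    for i in range(len(strip) - 2):
        a, b, c = strip[i], strip[i + 1], strip[i + 2]
        tris.append((a, b, c) if i % 2 == 0 else (b, a, c))
    return tris
```

A convex pentagon yields three triangles, and a four-point strip yields two.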

Tube

Convert lines into tubes. Normals are used to avoid cracks between tube segments. The Tube filter creates tubes around the lines in the input polygonal dataset. The output is also polygonal.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Tube filter.

Accepts input of following types:

  • vtkPolyData

The dataset must contain a field array (point)

with 1 component(s).

The dataset must contain a field array (point)

with 3 component(s).

Scalars (SelectInputScalars)

This property indicates the name of the scalar array on which to operate. The indicated array may be used for scaling the tubes. (See the VaryRadius property.)

An array of scalars is required.

Vectors (SelectInputVectors)

This property indicates the name of the vector array on which to operate. The indicated array may be used for scaling and/or orienting the tubes. (See the VaryRadius property.)

1

An array of vectors is required.

Number of Sides (NumberOfSides)

The value of this property indicates the number of faces around the circumference of the tube.

6

Capping (Capping)

If this property is set to 1, endcaps will be drawn on the tube. Otherwise the ends of the tube will be open.

1

Accepts boolean values (0 or 1).

Radius (Radius)

The value of this property sets the radius of the tube. If the radius is varying (VaryRadius property), then this value is the minimum radius.

1.0

The value must be less than the largest dimension of the dataset multiplied by a scale factor of 0.01.

VaryRadius (VaryRadius)

This property determines whether/how to vary the radius of the tube. If varying by scalar (1), the tube radius is based on the point-centered scalar values in the dataset. If varying by vector, the vector magnitude is used to vary the radius.

0

The value(s) is an enumeration of the following:

  • Off (0)
  • By Scalar (1)
  • By Vector (2)
  • By Absolute Scalar (3)
RadiusFactor (RadiusFactor)

If varying the radius (VaryRadius property), the property sets the maximum tube radius in terms of a multiple of the minimum radius. If not varying the radius, this value has no effect.

10
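
One plausible "By Scalar" radius mapping — the minimum radius at the smallest data value, growing linearly to RadiusFactor times that at the largest — can be sketched in plain Python (an assumption for illustration; vtkTubeFilter's exact formula may differ):

```python
def tube_radius(value, v_min, v_max, radius=1.0, radius_factor=10.0):
    """Sketch of a 'By Scalar' radius: Radius at the minimum data
    value, Radius * RadiusFactor at the maximum, linear in between."""
    t = (value - v_min) / (v_max - v_min) if v_max > v_min else 0.0
    return radius * (1.0 + t * (radius_factor - 1.0))
```

With the defaults, a point at the data minimum gets radius 1.0 and a point at the maximum gets radius 10.0.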

UseDefaultNormal (UseDefaultNormal)

If this property is set to 0, and the input contains no vector array, then default ribbon normals will be generated (DefaultNormal property); if a vector array has been set (SelectInputVectors property), the ribbon normals will be set from the specified array. If this property is set to 1, the default normal (DefaultNormal property) will be used, regardless of whether the SelectInputVectors property has been set.

0

Accepts boolean values (0 or 1).

DefaultNormal (DefaultNormal)

The value of this property specifies the normal to use when the UseDefaultNormal property is set to 1 or the input contains no vector array (SelectInputVectors property).

0 0 1


UpdateSuppressor2

Property Description Default Value(s) Restrictions
Input (Input)

Set the input to the Update Suppressor filter.

Enabled (Enabled)

Toggle whether the update suppressor is enabled.

1

Accepts boolean values (0 or 1).

UpdateTime (UpdateTime)

none


Warp By Scalar

This filter moves point coordinates along a vector scaled by a point attribute. It can be used to produce carpet plots. The Warp (scalar) filter translates the points of the input data set along a vector by a distance determined by the specified scalars. This filter operates on polygonal, curvilinear, and unstructured grid data sets containing single-component scalar arrays. Because it only changes the positions of the points, the output data set type is the same as that of the input. Any scalars in the input dataset are copied to the output, so the data can be colored by them.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Warp (scalar) filter.

Accepts input of following types:

  • vtkPointSet
  • vtkImageData
  • vtkRectilinearGrid

The dataset must contain a field array (point)

with 1 component(s).

Scalars (SelectInputScalars)

This property contains the name of the scalar array by which to warp the dataset.

An array of scalars is required.

ScaleFactor (ScaleFactor)

The scalar value at a given point is multiplied by the value of this property to determine the magnitude of the change vector for that point.

1.0

Normal (Normal)

The values of this property specify the direction along which to warp the dataset if any normals contained in the input dataset are not being used for this purpose. (See the UseNormal property.)

0 0 1
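
The displacement described above can be sketched in plain Python (illustration only, not ParaView's API):

```python
def warp_by_scalar(point, scalar, normal=(0.0, 0.0, 1.0), scale_factor=1.0):
    """Displace a point along the given normal by scale_factor * scalar."""
    return tuple(p + scale_factor * scalar * n for p, n in zip(point, normal))
```

With the default normal (0, 0, 1), a point is simply lifted along z by scale_factor times its scalar value, which is what produces a carpet plot.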

UseNormal (UseNormal)

If point normals are present in the dataset, the value of this property toggles whether to use a single normal value (value = 1) or the normals from the dataset (value = 0).

0

Accepts boolean values (0 or 1).

XY Plane (XYPlane)

If the value of this property is 1, then the Z-coordinates from the input are considered to be the scalar values, and the displacement is along the Z axis. This is useful for creating carpet plots.

0

Accepts boolean values (0 or 1).

Warp By Vector

This filter displaces point coordinates along a vector attribute. It is useful for showing mechanical deformation. The Warp (vector) filter translates the points of the input dataset using a specified vector array. The vector array chosen specifies a vector per point in the input. Each point is translated along its vector by a given scale factor. This filter operates on polygonal, curvilinear, and unstructured grid datasets. Because this filter only changes the positions of the points, the output dataset type is the same as that of the input.

Property Description Default Value(s) Restrictions
Input (Input)

This property specifies the input to the Warp (vector) filter.

Accepts input of following types:

  • vtkPointSet
  • vtkImageData
  • vtkRectilinearGrid

The dataset must contain a field array (point)

with 3 component(s).

Vectors (SelectInputVectors)

The value of this property contains the name of the vector array by which to warp the dataset's point coordinates.

An array of vectors is required.

ScaleFactor (ScaleFactor)

Each component of the selected vector array will be multiplied by the value of this property before being used to compute new point coordinates.

1.0
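
The per-point translation can be sketched in plain Python (illustration only):

```python
def warp_by_vector(point, vector, scale_factor=1.0):
    """Translate a point along its associated vector, with every
    component multiplied by the scale factor."""
    return tuple(p + scale_factor * v for p, v in zip(point, vector))
```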


Youngs Material Interface

Computes linear material interfaces in 2D or 3D mixed cells produced by Eulerian or ALE simulation codes.

Property Description Default Value(s) Restrictions
Input (Input)

Accepts input of following types:

  • vtkCompositeDataSet

The dataset must contain a field array (cell)

with 1 component(s).

The dataset must contain a field array (cell)

with 3 component(s).

InverseNormal (InverseNormal)

0

Accepts boolean values (0 or 1).

ReverseMaterialOrder (ReverseMaterialOrder)

0

Accepts boolean values (0 or 1).

OnionPeel (OnionPeel)

1

Accepts boolean values (0 or 1).

AxisSymetric (AxisSymetric)

1

Accepts boolean values (0 or 1).

FillMaterial (FillMaterial)

1

Accepts boolean values (0 or 1).

UseFractionAsDistance (UseFractionAsDistance)

0

Accepts boolean values (0 or 1).

VolumeFractionRange (VolumeFractionRange)

0.01 0.99

NumberOfDomainsInformation (NumberOfDomainsInformation)
VolumeFractionArrays (VolumeFractionArrays)

An array of scalars is required.

NormalArrays (NormalArrays)

An array of vectors is required. The value must be a field array name.

OrderingArrays (OrderingArrays)

An array of scalars is required. The value must be a field array name.