ParaView/Users Guide/List of filters

From KitwarePublic
Jump to navigationJump to search
No edit summary
No edit summary
Line 1: Line 1:
==AMR Contour==

Iso surface cell array.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''

|-
|'''Input''' (Input)
|
This property specifies the input of the filter.
|
|
Accepts input of following types:
* vtkCompositeDataSet
The dataset must contain a cell array with 1 component(s).
|-
|'''SelectMaterialArrays''' (SelectMaterialArrays)
|
This property specifies the cell arrays from which the contour filter will
compute contour cells.
|
|
An array of scalars is required.
|-
|'''Volume Fraction Value''' (VolumeFractionSurfaceValue)
|
This property specifies the values at which to compute the isosurface.
|
0.1
|
The value must be greater than or equal to 0 and less than or equal to 1.
|-
|'''Capping''' (Capping)
|
If this property is on, the boundary of the data set is capped.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''DegenerateCells''' (DegenerateCells)
|
If this property is on, a transition mesh between levels is created.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''MultiprocessCommunication''' (MultiprocessCommunication)
|
If this property is off, each process executes independently.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''SkipGhostCopy''' (SkipGhostCopy)
|
A simple test to see if ghost values are already set properly.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''Triangulate''' (Triangulate)
|
Use triangles instead of quads on capping surfaces.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''MergePoints''' (MergePoints)
|
Use more memory to merge points on the boundaries of blocks.
|
1
|
Accepts boolean values (0 or 1).
|}
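
A minimal pvpython sketch of driving this filter from ParaView's Python interface, assuming the paraview.simple module is available; the scripting name AMRContour and the attribute name VolumeFractionValue are inferred from the labels above, and 'vtkVolumeFraction' is a hypothetical cell array name:

<source lang="python">
from paraview.simple import *

# Assumes an AMR dataset (vtkCompositeDataSet) is already loaded and active;
# 'vtkVolumeFraction' is a hypothetical cell array holding material volume fractions.
amr = GetActiveSource()

contour = AMRContour(Input=amr)                  # scripting name assumed from the filter label
contour.SelectMaterialArrays = ['vtkVolumeFraction']
contour.VolumeFractionValue = 0.1                # "Volume Fraction Value"; must lie in [0, 1]
contour.Capping = 1                              # cap the data set boundary
contour.Triangulate = 1                          # triangles instead of quads on capping surfaces

Show(contour)
Render()
</source>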


==AMR Dual Clip==

Clip with scalars. Tetrahedra.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''

|-
|'''Input''' (Input)
|
This property specifies the input of the filter.
|
|
Accepts input of following types:
* vtkCompositeDataSet
The dataset must contain a cell array with 1 component(s).
|-
|'''SelectMaterialArrays''' (SelectMaterialArrays)
|
This property specifies the cell arrays from which the clip filter will
compute clipped cells.
|
|
An array of scalars is required.
|-
|'''Volume Fraction Value''' (VolumeFractionSurfaceValue)
|
This property specifies the values at which to compute the isosurface.
|
0.1
|
|-
|'''InternalDecimation''' (InternalDecimation)
|
If this property is on, internal tetrahedra are decimated.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''MultiprocessCommunication''' (MultiprocessCommunication)
|
If this property is off, each process executes independently.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''MergePoints''' (MergePoints)
|
Use more memory to merge points on the boundaries of blocks.
|
1
|
Accepts boolean values (0 or 1).
|}
 
==All to N==
 
Redistribute data to a subset of available processes.
The All to N filter is available when ParaView is run in parallel. It redistributes the data so that it is located on the number of processes specified in the Number of Processes entry box. It also does load-balancing of the data among these processes. This filter operates on polygonal data and produces polygonal output.
 
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
 
Set the input to the All to N filter.
|
 
|
Accepts input of following types:
* vtkPolyData
|-
|'''Number of Processes''' (NumberOfProcesses)
|
 
Set the number of processes across which to split the input data.
|
1
|
The value must be greater than or equal to 0 and less than or equal to 1.




|}


==Annotate Time Filter==

Shows input data time as text annotation in the view.
The Annotate Time filter can be used to show the data time in a text annotation.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''

|-
|'''Input''' (Input)
|
This property specifies the input dataset for which to display the time.
|
|
|-
|'''Format''' (Format)
|
The value of this property is a format string used to display the input time. The format string is specified using printf style.
|
Time: %f
|
|-
|'''Shift''' (Shift)
|
The amount of time the input is shifted (after scaling).
|
0.0
|
|-
|'''Scale''' (Scale)
|
The factor by which the input time is scaled.
|
1.0
|
|}
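
A minimal pvpython sketch, assuming the paraview.simple module and a time-aware source already loaded as the active source:

<source lang="python">
from paraview.simple import *

source = GetActiveSource()          # any reader or filter that reports time steps

annotate = AnnotateTimeFilter(Input=source)
annotate.Format = 'Time: %.3f s'    # printf-style format string
annotate.Scale = 1.0                # multiplies the input time
annotate.Shift = 0.0                # added after scaling

Show(annotate)
Render()
</source>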


==Append Attributes==

Copies geometry from first input. Puts all of the arrays into the output.
The Append Attributes filter takes multiple input data sets with the same geometry and merges their point and cell attributes to produce a single output containing all the point and cell attributes of the inputs. Any inputs without the same number of points and cells as the first input are ignored. The input data sets must already be collected together, either as a result of a reader that loads multiple parts (e.g., EnSight reader) or because the Group Parts filter has been run to form a collection of data sets.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''

|-
|'''Input''' (Input)
|
This property specifies the input to the Append Attributes filter.
|
|
Accepts input of following types:
* vtkDataSet
|}


==Append Datasets==

Takes an input of multiple datasets and output has only one unstructured grid.
The Append Datasets filter operates on multiple data sets of any type (polygonal, structured, etc.). It merges their geometry into a single data set. Only the point and cell attributes that all of the input data sets have in common will appear in the output. The input data sets must already be collected together, either as a result of a reader that loads multiple parts (e.g., EnSight reader) or because the Group Parts filter has been run to form a collection of data sets.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''

|-
|'''Input''' (Input)
|
This property specifies the datasets to be merged into a single dataset by the Append Datasets filter.
|
|
Accepts input of following types:
* vtkDataSet
|}
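
A minimal pvpython sketch of merging two sources, assuming the paraview.simple module:

<source lang="python">
from paraview.simple import *

# Two simple polygonal sources; the output is a single unstructured grid.
sphere = Sphere()
cone = Cone()

merged = AppendDatasets(Input=[sphere, cone])
Show(merged)
Render()
</source>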


==Append Geometry==

Takes an input of multiple poly data parts and output has only one part.
The Append Geometry filter operates on multiple polygonal data sets. It merges their geometry into a single data set. Only the point and cell attributes that all of the input data sets have in common will appear in the output.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''

|-
|'''Input''' (Input)
|
Set the input to the Append Geometry filter.
|
|
Accepts input of following types:
* vtkPolyData
|}

==Balance==

Balance data among available processes.
The Balance filter is available when ParaView is run in parallel. It does load-balancing so that all processes have the same number of cells. It operates on polygonal data sets and produces polygonal output.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''

|-
|'''Input''' (Input)
|
Set the input to the Balance filter.
|
|
Accepts input of following types:
* vtkPolyData
|}


==Block Scalars==

The Level Scalars filter uses colors to show levels of a multiblock dataset.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''

|-
|'''Input''' (Input)
|
This property specifies the input to the Level Scalars filter.
|
|
Accepts input of following types:
* vtkMultiBlockDataSet
|}
==CTH Surface==


Not finished yet.
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the input of the filter.
|
|
Accepts input of following types:
* vtkCompositeDataSet


|}


 
==CacheKeeper==

vtkPVCacheKeeper manages data cache for flip book animations. When
caching is disabled, this simply acts as a pass through filter. When
caching is enabled, if the current time step has been previously cached
then this filter shuts off the update request; otherwise it propagates the
update and then caches the result for later use. The current time step
is set using SetCacheTime().

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''

|-
|'''Input''' (Input)
|
Set the input to the Update Suppressor filter.
|
|
|-
|'''CacheTime''' (CacheTime)
|
|
0.0
|
|-
|'''CachingEnabled''' (CachingEnabled)
|
Toggle whether the caching is enabled.
|
1
|
Accepts boolean values (0 or 1).
|}
==Calculator==
Compute new attribute arrays as function of existing arrays.
The Calculator filter computes a new data array or new point coordinates as a function of existing scalar or vector arrays. If point-centered arrays are used in the computation of a new data array, the resulting array will also be point-centered. Similarly, computations using cell-centered arrays will produce a new cell-centered array. If the function is computing point coordinates, the result of the function must be a three-component vector. The Calculator interface operates similarly to a scientific calculator. In creating the function to evaluate, the standard order of operations applies.
Each of the calculator functions is described below. Unless otherwise noted, enclose the operand in parentheses using the ( and ) buttons.
Clear: Erase the current function (displayed in the read-only text box above the calculator buttons).
/: Divide one scalar by another. The operands for this function are not required to be enclosed in parentheses.
*: Multiply two scalars, or multiply a vector by a scalar (scalar multiple). The operands for this function are not required to be enclosed in parentheses.
-: Negate a scalar or vector (unary minus), or subtract one scalar or vector from another. The operands for this function are not required to be enclosed in parentheses.
+: Add two scalars or two vectors. The operands for this function are not required to be enclosed in parentheses.
sin: Compute the sine of a scalar.
cos: Compute the cosine of a scalar.
tan: Compute the tangent of a scalar.
asin: Compute the arcsine of a scalar.
acos: Compute the arccosine of a scalar.
atan: Compute the arctangent of a scalar.
sinh: Compute the hyperbolic sine of a scalar.
cosh: Compute the hyperbolic cosine of a scalar.
tanh: Compute the hyperbolic tangent of a scalar.
min: Compute minimum of two scalars.
max: Compute maximum of two scalars.
x^y: Raise one scalar to the power of another scalar. The operands for this function are not required to be enclosed in parentheses.
sqrt: Compute the square root of a scalar.
e^x: Raise e to the power of a scalar.
log: Compute the logarithm of a scalar (deprecated. same as log10).
log10: Compute the logarithm of a scalar to the base 10.
ln: Compute the logarithm of a scalar to the base 'e'.
ceil: Compute the ceiling of a scalar.
floor: Compute the floor of a scalar.
abs: Compute the absolute value of a scalar.
v1.v2: Compute the dot product of two vectors. The operands for this function are not required to be enclosed in parentheses.
cross: Compute cross product of two vectors.
mag: Compute the magnitude of a vector.
norm: Normalize a vector.
The operands are described below.
The digits 0 - 9 and the decimal point are used to enter constant scalar values.
iHat, jHat, and kHat are vector constants representing unit vectors in the X, Y, and Z directions, respectively.
The scalars menu lists the names of the scalar arrays and the components of the vector arrays of either the point-centered or cell-centered data. The vectors menu lists the names of the point-centered or cell-centered vector arrays. The function will be computed for each point (or cell) using the scalar or vector value of the array at that point (or cell).
The filter operates on any type of data set, but the input data set must have at least one scalar or vector array. The arrays can be either point-centered or cell-centered. The Calculator filter's output is of the same data set type as the input.
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''

|-
|'''Input''' (Input)
|
This property specifies the input dataset to the Calculator filter. The scalar and vector variables may be chosen from this dataset's arrays.
|
|
Accepts input of following types:
* vtkDataSet
|-
|'''ResultArrayName''' (ResultArrayName)
|
This property contains the name for the output array containing the result of this computation.
|
Result
|
|-
|'''Function''' (Function)
|
This property contains the equation for computing the new array.
|
|
|-
|'''CoordinateResults''' (CoordinateResults)
|
The value of this property determines whether the results of this computation should be used as point coordinates or as a new array.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''AttributeMode''' (AttributeMode)
|
This property determines whether the computation is to be performed on point-centered or cell-centered data.
|
0
|
The value(s) is an enumeration of the following:
* point_data (1)
* cell_data (2)
* field_data (5)
|-
|'''Replace Invalid Results''' (ReplaceInvalidValues)
|
This property determines whether invalid values in the computation will be replaced with a specific value. (See the ReplacementValue property.)
|
1
|
Accepts boolean values (0 or 1).
|-
|'''ReplacementValue''' (ReplacementValue)
|
If invalid values in the computation are to be replaced with another value, this property contains that value.
|
0.0
|
|}
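
A minimal pvpython sketch, assuming the paraview.simple module; the Wavelet source is used only because it provides a point-centered scalar array named RTData:

<source lang="python">
from paraview.simple import *

wavelet = Wavelet()                       # provides the point array 'RTData'

calc = Calculator(Input=wavelet)
calc.Function = 'RTData * 2 + 1'          # standard calculator expression syntax
calc.ResultArrayName = 'ScaledRTData'     # name of the new output array

Show(calc)
Render()
</source>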


==Cell Centers==

Create a point (no geometry) at the center of each input cell.
The Cell Centers filter places a point at the center of each cell in the input data set. The center computed is the parametric center of the cell, not necessarily the geometric or bounding box center. The cell attributes of the input will be associated with these newly created points of the output. You have the option of creating a vertex cell per point in the output. This is useful because vertex cells are rendered, but points are not. The points themselves could be used for placing glyphs (using the Glyph filter). The Cell Centers filter takes any type of data set as input and produces a polygonal data set as output.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''

|-
|'''Input''' (Input)
|
This property specifies the input to the Cell Centers filter.
|
|
Accepts input of following types:
* vtkDataSet
|-
|'''VertexCells''' (VertexCells)
|
If set to 1, a vertex cell will be generated per point in the output. Otherwise only points will be generated.
|
0
|
Accepts boolean values (0 or 1).
|}
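
A minimal pvpython sketch, assuming the paraview.simple module:

<source lang="python">
from paraview.simple import *

sphere = Sphere()

centers = CellCenters(Input=sphere)
centers.VertexCells = 1    # emit a vertex cell per point so the centers actually render

Show(centers)
Render()
</source>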


==Cell Data to Point Data==

Create point attributes by averaging cell attributes.
The Cell Data to Point Data filter averages the values of the cell attributes of the cells surrounding a point to compute point attributes. The Cell Data to Point Data filter operates on any type of data set, and the output data set is of the same type as the input.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''

|-
|'''Input''' (Input)
|
This property specifies the input to the Cell Data to Point Data filter.
|
|
Accepts input of following types:
* vtkDataSet
The dataset must contain a cell array.
|-
|'''PassCellData''' (PassCellData)
|
If this property is set to 1, then the input cell data is passed through to the output; otherwise, only the generated point data will be available in the output.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''PieceInvariant''' (PieceInvariant)
|
If the value of this property is set to 1, this filter will request ghost levels so that the values at boundary points match across processes. NOTE: Enabling this option might cause multiple executions of the data source because more information is needed to remove internal surfaces.
|
0
|
Accepts boolean values (0 or 1).
|}
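
A minimal pvpython sketch, assuming the paraview.simple module and that the scripting name follows the usual convention (filter label with spaces removed); the active source is assumed to carry cell-centered arrays:

<source lang="python">
from paraview.simple import *

source = GetActiveSource()      # assumed to have at least one cell array

c2p = CellDatatoPointData(Input=source)
c2p.PassCellData = 1            # keep the original cell arrays alongside the averaged point arrays

Show(c2p)
Render()
</source>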


==Clean==

Merge coincident points if they do not meet a feature edge criteria.
The Clean filter takes polygonal data as input and generates polygonal data as output. This filter can merge duplicate points, remove unused points, and transform degenerate cells into their appropriate forms (e.g., a triangle is converted into a line if two of its points are merged).

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''

|-
|'''Input''' (Input)
|
Set the input to the Clean filter.
|
|
Accepts input of following types:
* vtkPolyData
|-
|'''PieceInvariant''' (PieceInvariant)
|
If this property is set to 1, the whole data set will be processed at once so that cleaning the data set always produces the same results. If it is set to 0, the data set can be processed one piece at a time, so it is not necessary for the entire data set to fit into memory; however the results are not guaranteed to be the same as they would be if the Piece invariant option was on. Setting this option to 0 may produce seams in the output dataset when ParaView is run in parallel.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''Tolerance''' (Tolerance)
|
If merging nearby points (see PointMerging property) and not using absolute tolerance (see ToleranceIsAbsolute property), this property specifies the tolerance for performing merging as a fraction of the length of the diagonal of the bounding box of the input data set.
|
0.0
|
|-
|'''AbsoluteTolerance''' (AbsoluteTolerance)
|
If merging nearby points (see PointMerging property) and using absolute tolerance (see ToleranceIsAbsolute property), this property specifies the tolerance for performing merging in the spatial units of the input data set.
|
1.0
|
|-
|'''ToleranceIsAbsolute''' (ToleranceIsAbsolute)
|
This property determines whether to use absolute or relative (a percentage of the bounding box) tolerance when performing point merging.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''ConvertLinesToPoints''' (ConvertLinesToPoints)
|
If this property is set to 1, degenerate lines (a "line" whose endpoints are at the same spatial location) will be converted to points.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''ConvertPolysToLines''' (ConvertPolysToLines)
|
If this property is set to 1, degenerate polygons (a "polygon" with only two distinct point coordinates) will be converted to lines.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''ConvertStripsToPolys''' (ConvertStripsToPolys)
|
If this property is set to 1, degenerate triangle strips (a triangle "strip" containing only one triangle) will be converted to triangles.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''PointMerging''' (PointMerging)
|
If this property is set to 1, then points will be merged if they are within the specified Tolerance or AbsoluteTolerance (see the Tolerance and AbsoluteTolerance properties), depending on the value of the ToleranceIsAbsolute property. If this property is set to 0, points will not be merged.
|
1
|
Accepts boolean values (0 or 1).
|}
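
A minimal pvpython sketch, assuming the paraview.simple module:

<source lang="python">
from paraview.simple import *

sphere = Sphere()

clean = Clean(Input=sphere)
clean.PointMerging = 1           # merge coincident points
clean.ToleranceIsAbsolute = 0    # interpret Tolerance relative to the bounding box
clean.Tolerance = 0.0001         # fraction of the bounding-box diagonal

Show(clean)
Render()
</source>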


==Clean Cells to Grid==

This filter merges cells and converts the data set to unstructured grid.
Merges degenerate cells. Assumes the input grid does not contain duplicate
points. You may want to run vtkCleanUnstructuredGrid first to assert it. If
duplicated cells are found they are removed in the output. The filter also
handles the case where a cell may contain degenerate nodes (i.e. one and
the same node is referenced by a cell more than once).

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''

|-
|'''Input''' (Input)
|
This property specifies the input to the Clean Cells to Grid filter.
|
|
Accepts input of following types:
* vtkUnstructuredGrid
|}


==Clean to Grid==

This filter merges points and converts the data set to unstructured grid.
The Clean to Grid filter merges points that are exactly coincident. It also converts the data set to an unstructured grid. You may wish to do this if you want to apply a filter to your data set that is available for unstructured grids but not for the initial type of your data set (e.g., applying warp vector to volumetric data). The Clean to Grid filter operates on any type of data set.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''

|-
|'''Input''' (Input)
|
This property specifies the input to the Clean to Grid filter.
|
|
Accepts input of following types:
* vtkDataSet
|}
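
A minimal pvpython sketch, assuming the paraview.simple module and the usual scripting-name convention:

<source lang="python">
from paraview.simple import *

# Convert image data from the Wavelet source into an unstructured grid,
# merging exactly coincident points along the way.
wavelet = Wavelet()

grid = CleantoGrid(Input=wavelet)
Show(grid)
Render()
</source>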


==ClientServerMoveData==



{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''

|-
|'''Input''' (Input)
|
Set the input to the Client Server Move Data filter.
|
|
|-
|'''OutputDataType''' (OutputDataType)
|
|
0
|
|-
|'''WholeExtent''' (WholeExtent)
|
|
0 -1 0 -1 0 -1
|
|}

 
Clip with an implicit plane. Clipping does not reduce the dimensionality of the data set. The output data type of this filter is always an unstructured grid.
The Clip filter cuts away a portion of the input data set using an implicit plane. This filter operates on all types of data sets, and it returns unstructured grid data on output.




{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Inside Out'''<br>''(InsideOut)''
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
 
This property specifies the dataset on which the Clip filter will operate.
|
|
If this property is set to 0, the clip filter will return that portion of the dataset that lies within the clip function. If set to 1, the portions of the dataset that lie outside the clip function will be returned instead.


| 0
|
|
Only the values 0 and 1 are accepted.
Accepts input of following types:
* vtkDataSet
The dataset much contain a field array ()


with 1 component(s).


|-
|-
| '''Scalars'''<br>''(SelectInputScalars)''
|'''Clip Type''' (ClipFunction)
|
|
If clipping with scalars, this property specifies the name of the scalar array on which to perform the clip operation.


This property specifies the parameters of the clip function (an implicit plane) used to clip the dataset.
|
|
|
|
An array of scalars is required.
The value can be one of the following:
* Plane (implicit_functions)


* Box (implicit_functions)


Valud array names will be chosen from point and cell data.
* Sphere (implicit_functions)


* Scalar (implicit_functions)


|-
|-
| '''Use Value As Offset'''<br>''(UseValueAsOffset)''
|'''InputBounds''' (InputBounds)
|
|
If UseValueAsOffset is true, Value is used as an offset parameter to the implicit function. Otherwise, Value is used only when clipping using a scalar array.


| 0
|
|
Only the values 0 and 1 are accepted.


|


|-
|-
| '''Value'''<br>''(Value)''
|'''Scalars''' (SelectInputScalars)
|
|
If clipping with scalars, this property sets the scalar value about which to clip the dataset based on the scalar array chosen. (See SelectInputScalars.) If clipping with a clip function, this property specifies an offset from the clip function to use in the clipping operation. Neither functionality is currently available in ParaView's user interface.


| 0
If clipping with scalars, this property specifies the name of the scalar array on which to perform the clip operation.
|
|
The value must lie within the range of the selected data array.


|
An array of scalars is required.The value must be field array name.
|-
|'''Value''' (Value)
|


|}
If clipping with scalars, this property sets the scalar value about which to clip the dataset based on the scalar array chosen. (See SelectInputScalars.) If clipping with a clip function, this property specifies an offset from the clip function to use in the clipping operation. Neither functionality is currently available in ParaView's user interface.
|
0.0
|
The value must lie within the range of the selected data array.
|-
|'''InsideOut''' (InsideOut)
|


If this property is set to 0, the clip filter will return that portion of the dataset that lies within the clip function. If set to 1, the portions of the dataset that lie outside the clip function will be returned instead.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''UseValueAsOffset''' (UseValueAsOffset)
|
If UseValueAsOffset is true, Value is used as an offset parameter to the implicit function. Otherwise, Value is used only when clipping using a scalar array.
|
0
|
Accepts boolean values (0 or 1).
|}
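
A minimal pvpython sketch, assuming the paraview.simple module:

<source lang="python">
from paraview.simple import *

sphere = Sphere(Radius=2.0)

clip = Clip(Input=sphere)
clip.ClipType = 'Plane'                  # Plane, Box, Sphere, or Scalar
clip.ClipType.Origin = [0.0, 0.0, 0.0]
clip.ClipType.Normal = [1.0, 0.0, 0.0]
clip.InsideOut = 0                       # 0 keeps the portion that lies within the clip function

Show(clip)
Render()
</source>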


==Clip Closed Surface==

Clip a polygonal dataset with a plane to produce closed surfaces.
This clip filter cuts away a portion of the input polygonal dataset using a plane to generate a new polygonal dataset.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''

|-
|'''Input''' (Input)
|
This property specifies the dataset on which the Clip filter will operate.
|
|
Accepts input of following types:
* vtkPolyData
The dataset must contain a point array with 1 component(s).
|-
|'''Clipping Plane''' (ClippingPlane)
|
This property specifies the parameters of the clipping plane used to clip the polygonal data.
|
|
The value can be one of the following:
* Plane (implicit_functions)
|-
|'''GenerateFaces''' (GenerateFaces)
|
Generate polygonal faces in the output.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''GenerateOutline''' (GenerateOutline)
|
Generate clipping outlines in the output wherever an input face is cut by the clipping plane.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''Generate Cell Origins''' (GenerateColorScalars)
|
Generate (cell) data for coloring purposes such that the newly generated cells (including capping faces and clipping outlines) can be distinguished from the input cells.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''InsideOut''' (InsideOut)
|
If this flag is turned off, the clipper will return the portion of the data that lies within the clipping plane. Otherwise, the clipper will return the portion of the data that lies outside the clipping plane.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''Clipping Tolerance''' (Tolerance)
|
Specify the tolerance for creating new points. A small value might incur degenerate triangles.
|
0.000001
|
|-
|'''Base Color''' (BaseColor)
|
Specify the color for the faces from the input.
|
0.10 0.10 1.00
|
|-
|'''Clip Color''' (ClipColor)
|
Specify the color for the capping faces (generated on the clipping interface).
|
1.00 0.11 0.10
|
|}
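
A minimal pvpython sketch, assuming the paraview.simple module and the usual scripting-name convention:

<source lang="python">
from paraview.simple import *

sphere = Sphere(Radius=1.0, ThetaResolution=32, PhiResolution=32)

clip = ClipClosedSurface(Input=sphere)
clip.ClippingPlane.Origin = [0.0, 0.0, 0.0]
clip.ClippingPlane.Normal = [0.0, 0.0, 1.0]
clip.GenerateFaces = 1                    # produce capping faces on the cut
clip.BaseColor = [0.1, 0.1, 1.0]          # color for faces from the input
clip.ClipColor = [1.0, 0.11, 0.1]         # color for the capping faces

Show(clip)
Render()
</source>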


==Clip Generic Dataset==

Clip with an implicit plane, sphere or with scalars. Clipping does not reduce the dimensionality of the data set. The output data type of this filter is always an unstructured grid.
The Generic Clip filter cuts away a portion of the input data set using a plane, a sphere, a box, or a scalar value. The menu in the Clip Function portion of the interface allows the user to select which implicit function to use or whether to clip using a scalar value. Making this selection loads the appropriate user interface. For the implicit functions, the appropriate 3D widget (plane, sphere, or box) is also displayed. The use of these 3D widgets, including their user interface components, is discussed in section 7.4.
If an implicit function is selected, the clip filter returns that portion of the input data set that lies inside the function. If Scalars is selected, then the user must specify a scalar array to clip according to. The clip filter will return the portions of the data set whose value in the selected Scalars array is larger than the Clip value. Regardless of the selection from the Clip Function menu, if the Inside Out option is checked, the opposite portions of the data set will be returned.
This filter operates on all types of data sets, and it returns unstructured grid data on output.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''

|-
|'''Input''' (Input)
|
Set the input to the Generic Clip filter.
|
|
Accepts input of following types:
* vtkGenericDataSet
The dataset must contain a point array.
|-
|'''Clip Type''' (ClipFunction)
|
Set the parameters of the clip function.
|
|
The value can be one of the following:
* Plane (implicit_functions)
* Box (implicit_functions)
* Sphere (implicit_functions)
* Scalar (implicit_functions)
|-
|'''InputBounds''' (InputBounds)
|
|
|
|-
|'''Scalars''' (SelectInputScalars)
|
If clipping with scalars, this property specifies the name of the scalar array on which to perform the clip operation.
|
|
An array of scalars is required. The value must be a field array name.
|-
|'''InsideOut''' (InsideOut)
|
Choose which portion of the dataset should be clipped away.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''Value''' (Value)
|
If clipping with a scalar array, choose the clipping value.
|
0.0
|
The value must lie within the range of the selected data array.
|}


==Compute Derivatives==

This filter computes derivatives of scalars and vectors.
CellDerivatives is a filter that computes derivatives of scalars and vectors at the center of cells. You can choose to generate different output including the scalar gradient (a vector), computed tensor vorticity (a vector), gradient of input vectors (a tensor), and strain matrix of the input vectors (a tensor); or you may choose to pass data through to the output.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''

|-
|'''Input''' (Input)
|
This property specifies the input to the filter.
|
|
Accepts input of following types:
* vtkDataSet
The dataset must contain a point array with 1 component(s).
The dataset must contain a point array with 3 component(s).
|-
|'''Scalars''' (SelectInputScalars)
|
This property indicates the name of the scalar array to differentiate.
|
|
An array of scalars is required.
|-
|'''Vectors''' (SelectInputVectors)
|
This property indicates the name of the vector array to differentiate.
|
1
|
An array of vectors is required.
|-
|'''OutputVectorType''' (OutputVectorType)
|
This property controls how the filter works to generate vector cell data. You can choose to compute the gradient of the input scalars, or extract the vorticity of the computed vector gradient tensor. By default, the filter will take the gradient of the input scalar data.
|
1
|
The value(s) is an enumeration of the following:
* Nothing (0)
* Scalar Gradient (1)
* Vorticity (2)
|-
|'''OutputTensorType''' (OutputTensorType)
|
This property controls how the filter works to generate tensor cell data. You can choose to compute the gradient of the input vectors, or compute the strain tensor of the vector gradient tensor. By default, the filter will take the gradient of the vector data to construct a tensor.
|
1
|
The value(s) is an enumeration of the following:
* Nothing (0)
* Vector Gradient (1)
* Strain (2)
|}

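A small paraview.simple sketch of the settings above. It assumes the filter is exposed in Python as ComputeDerivatives (the filter label with the space removed), and it uses the Wavelet source plus a Calculator-generated vector array purely as example data.

<pre>
from paraview.simple import *

# Example input: Wavelet provides the point scalar array 'RTData'.
wavelet = Wavelet()

# The filter also expects a 3-component point array, so build one from the
# point coordinates with the Calculator filter.
calc = Calculator(Input=wavelet)
calc.ResultArrayName = 'coords'
calc.Function = 'coordsX*iHat + coordsY*jHat + coordsZ*kHat'

derivatives = ComputeDerivatives(Input=calc)
derivatives.Scalars = ['POINTS', 'RTData']
derivatives.Vectors = ['POINTS', 'coords']
derivatives.OutputVectorType = 'Scalar Gradient'   # per-cell gradient of RTData
derivatives.OutputTensorType = 'Nothing'

Show(derivatives)
Render()
</pre>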

==Connectivity==

Mark connected components with integer point attribute array.
The Connectivity filter assigns a region id to connected components of the input data set. (The region id is assigned as a point scalar value.) This filter takes any data set type as input and produces unstructured grid output.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the input to the Connectivity filter.
|
|
Accepts input of following types:
* vtkDataSet
|-
|'''ExtractionMode''' (ExtractionMode)
|
Controls the extraction of connected surfaces.
|
5
|
The value(s) is an enumeration of the following:
* Extract Point Seeded Regions (1)
* Extract Cell Seeded Regions (2)
* Extract Specified Regions (3)
* Extract Largest Region (4)
* Extract All Regions (5)
* Extract Closest Point Region (6)
|-
|'''ColorRegions''' (ColorRegions)
|
Controls the coloring of the connected regions.
|
1
|
Accepts boolean values (0 or 1).
|}

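A minimal paraview.simple sketch. The two offset spheres are only there to create a dataset with more than one connected component, and the Append Datasets filter is assumed to be exposed in Python as AppendDatasets.

<pre>
from paraview.simple import *

# Build a dataset with two disconnected pieces.
sphere_a = Sphere(Center=[0.0, 0.0, 0.0])
sphere_b = Sphere(Center=[3.0, 0.0, 0.0])
both = AppendDatasets(Input=[sphere_a, sphere_b])

connectivity = Connectivity(Input=both)
connectivity.ExtractionMode = 'Extract All Regions'
connectivity.ColorRegions = 1   # adds a region-id point array for coloring

Show(connectivity)
Render()
</pre>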

==Contingency Statistics==

Compute a statistical model of a dataset and/or assess the dataset with a statistical model.
This filter either computes a statistical model of a dataset or takes such a model as its second input. Then, the model (however it is obtained) may optionally be used to assess the input dataset.

This filter computes contingency tables between pairs of attributes. This result is a tabular bivariate probability distribution which serves as a Bayesian-style prior model. Data is assessed by computing
* the probability of observing both variables simultaneously;
* the probability of each variable conditioned on the other (the two values need not be identical); and
* the pointwise mutual information (PMI).
Finally, the summary statistics include the information entropy of the observations.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
The input to the filter. Arrays from this dataset will be used for computing statistics and/or assessed by a statistical model.
|
|
Accepts input of following types:
* vtkImageData
* vtkStructuredGrid
* vtkPolyData
* vtkUnstructuredGrid
* vtkTable
* vtkGraph
The dataset must contain a field array.
|-
|'''ModelInput''' (ModelInput)
|
A previously-calculated model with which to assess a separate dataset. This input is optional.
|
|
Accepts input of following types:
* vtkTable
* vtkMultiBlockDataSet
|-
|'''AttributeMode''' (AttributeMode)
|
Specify which type of field data the arrays will be drawn from.
|
0
|
The value must be a field array name.
|-
|'''Variables of Interest''' (SelectArrays)
|
Choose arrays whose entries will be used to form observations for statistical analysis.
|
|
|-
|'''Task''' (Task)
|
Specify the task to be performed: modeling and/or assessment.
# "Detailed model of input data," creates a set of output tables containing a calculated statistical model of the '''entire''' input dataset;
# "Model a subset of the data," creates an output table (or tables) summarizing a '''randomly-chosen subset''' of the input dataset;
# "Assess the data with a model," adds attributes to the first input dataset using a model provided on the second input port; and
# "Model and assess the same data," is really just operations 2 and 3 above applied to the same input dataset. The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.
When the task includes creating a model (i.e., tasks 2 and 4), you may adjust the fraction of the input dataset used for training. You should avoid using a large fraction of the input data for training as you will then not be able to detect overfitting. The ''Training fraction'' setting will be ignored for tasks 1 and 3.
|
3
|
The value(s) is an enumeration of the following:
* Detailed model of input data (0)
* Model a subset of the data (1)
* Assess the data with a model (2)
* Model and assess the same data (3)
|-
|'''TrainingFraction''' (TrainingFraction)
|
Specify the fraction of values from the input dataset to be used for model fitting. The exact set of values is chosen at random from the dataset.
|
0.1
|
The value must be greater than or equal to 0 and less than or equal to 1.
|}

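A brief paraview.simple sketch, assuming the filter is exposed in Python as ContingencyStatistics (the label with spaces removed). The arrays to analyze are chosen through the Variables of Interest property, here left at its defaults, and the Wavelet source is placeholder input.

<pre>
from paraview.simple import *

wavelet = Wavelet()

stats = ContingencyStatistics(Input=wavelet)
stats.Task = 'Model and assess the same data'
stats.TrainingFraction = 0.1

# Execute the filter; the model is a set of output tables, and the assessed
# dataset carries the additional probability/PMI columns described above.
stats.UpdatePipeline()
</pre>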

==Contour==

Generate isolines or isosurfaces using point scalars.
The Contour filter computes isolines or isosurfaces using a selected point-centered scalar array. The Contour filter operates on any type of data set, but the input is required to have at least one point-centered scalar (single-component) array. The output of this filter is polygonal.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the input dataset to be used by the contour filter.
|
|
Accepts input of following types:
* vtkDataSet
The dataset must contain a field array (point) with 1 component(s).
|-
|'''Contour By''' (SelectInputScalars)
|
This property specifies the name of the scalar array from which the contour filter will compute isolines and/or isosurfaces.
|
|
An array of scalars is required. The value must be a field array name.
|-
|'''Isosurfaces''' (ContourValues)
|
This property specifies the values at which to compute isosurfaces/isolines and also the number of such values.
|
|
The value must lie within the range of the selected data array.
|-
|'''ComputeNormals''' (ComputeNormals)
|
Select whether to compute normals.
If this property is set to 1, a scalar array containing a normal value at each point in the isosurface or isoline will be created by the contour filter; otherwise an array of normals will not be computed. This operation is fairly expensive both in terms of computation time and memory required, so if the output dataset produced by the contour filter will be processed by filters that modify the dataset's topology or geometry, it may be wise to set the value of this property to 0.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''ComputeGradients''' (ComputeGradients)
|
If this property is set to 1, a scalar array containing a gradient value at each point in the isosurface or isoline will be created by this filter; otherwise an array of gradients will not be computed. This operation is fairly expensive both in terms of computation time and memory required, so if the output dataset produced by the contour filter will be processed by filters that modify the dataset's topology or geometry, it may be wise to set the value of this property to 0. Note that if ComputeNormals is set to 1, then gradients will have to be calculated, but they will only be stored in the output dataset if ComputeGradients is also set to 1.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''ComputeScalars''' (ComputeScalars)
|
If this property is set to 1, an array of scalars (containing the contour value) will be added to the output dataset. If set to 0, the output will not contain this array.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''Point Merge Method''' (Locator)
|
This property specifies an incremental point locator for merging duplicate / coincident points.
|
|
The value can be one of the following:
* MergePoints (incremental_point_locators)
* IncrementalOctreeMergePoints (incremental_point_locators)
* NonMergingPointLocator (incremental_point_locators)
|}

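The equivalent operation in ParaView's Python interface, as a minimal sketch using the Wavelet source's RTData point array as example data:

<pre>
from paraview.simple import *

# Wavelet produces an image volume with the point scalar array 'RTData'.
wavelet = Wavelet()

contour = Contour(Input=wavelet)
contour.ContourBy = ['POINTS', 'RTData']
contour.Isosurfaces = [100.0, 157.0, 220.0]   # three isosurface values
contour.ComputeNormals = 1
contour.ComputeScalars = 1

Show(contour)
Render()
</pre>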

==Contour Generic Dataset==

Generate isolines or isosurfaces using point scalars.
The Generic Contour filter computes isolines or isosurfaces using a selected point-centered scalar array. The available scalar arrays are listed in the Scalars menu. The scalar range of the selected array will be displayed.

The interface for adding contour values is very similar to the one for selecting cut offsets (in the Cut filter). To add a single contour value, select the value from the New Value slider in the Add value portion of the interface and click the Add button, or press Enter. To instead add several evenly spaced contours, use the controls in the Generate range of values section. Select the number of contour values to generate using the Number of Values slider. The Range slider controls the interval in which to generate the contour values. Once the number of values and range have been selected, click the Generate button. The new values will be added to the Contour Values list. To delete a value from the Contour Values list, select the value and click the Delete button. (If no value is selected, the last value in the list will be removed.) Clicking the Delete All button removes all the values in the list. If no values are in the Contour Values list when Accept is pressed, the current value of the New Value slider will be used.

In addition to selecting contour values, you can also select additional computations to perform. If any of Compute Normals, Compute Gradients, or Compute Scalars is selected, the appropriate computation will be performed, and a corresponding point-centered array will be added to the output.

The Generic Contour filter operates on a generic data set, but the input is required to have at least one point-centered scalar (single-component) array. The output of this filter is polygonal.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
Set the input to the Generic Contour filter.
|
|
Accepts input of following types:
* vtkGenericDataSet
The dataset must contain a field array (point) with 1 component(s).
|-
|'''Contour By''' (SelectInputScalars)
|
This property specifies the name of the scalar array from which the contour filter will compute isolines and/or isosurfaces.
|
|
An array of scalars is required. The value must be a field array name.
|-
|'''Isosurfaces''' (ContourValues)
|
This property specifies the values at which to compute isosurfaces/isolines and also the number of such values.
|
|
The value must lie within the range of the selected data array.
|-
|'''ComputeNormals''' (ComputeNormals)
|
Select whether to compute normals.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''ComputeGradients''' (ComputeGradients)
|
Select whether to compute gradients.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''ComputeScalars''' (ComputeScalars)
|
Select whether to compute scalars.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''Point Merge Method''' (Locator)
|
This property specifies an incremental point locator for merging duplicate / coincident points.
|
|
The value can be one of the following:
* MergePoints (incremental_point_locators)
* IncrementalOctreeMergePoints (incremental_point_locators)
* NonMergingPointLocator (incremental_point_locators)
|}


==ConvertSelection==

Converts a selection from one type to another.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''DataInput''' (DataInput)
|
Set the vtkDataObject input used to convert the selection.
|
|
Accepts input of following types:
* vtkDataObject
|-
|'''Input''' (Input)
|
Set the selection to convert.
|
|
Accepts input of following types:
* vtkSelection
|-
|'''OutputType''' (OutputType)
|
Set the ContentType for the output.
|
5
|
The value(s) is an enumeration of the following:
* SELECTIONS (0)
* GLOBALIDS (1)
* PEDIGREEIDS (2)
* VALUES (3)
* INDICES (4)
* FRUSTUM (5)
* LOCATION (6)
* THRESHOLDS (7)
|-
|'''ArrayNames''' (ArrayNames)
|
|
|
|-
|'''MatchAnyValues''' (MatchAnyValues)
|
|
0
|
Accepts boolean values (0 or 1).
|}


==Crop==

Efficiently extract an area/volume of interest from a 2-d image or 3-d volume.
The Crop filter extracts an area/volume of interest from a 2D image or a 3D volume by allowing the user to specify the minimum and maximum extents of each dimension of the data. Both the input and output of this filter are uniform rectilinear data.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the input to the Crop filter.
|
|
Accepts input of following types:
* vtkImageData
|-
|'''OutputWholeExtent''' (OutputWholeExtent)
|
This property gives the minimum and maximum point index (extent) in each dimension for the output dataset.
|
0 0 0 0 0 0
|
The value(s) must lie within the structured-extents of the input dataset.
|}

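A minimal paraview.simple sketch, assuming the filter is exposed in Python as Crop; the Wavelet source, whose default whole extent runs from -10 to 10 along each axis, serves as example image data.

<pre>
from paraview.simple import *

# Wavelet's default whole extent is [-10, 10] along each axis.
wavelet = Wavelet()

crop = Crop(Input=wavelet)
crop.OutputWholeExtent = [0, 10, 0, 10, 0, 10]   # keep the positive octant

Show(crop)
Render()
</pre>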

==Curvature==

This filter will compute the Gaussian or mean curvature of the mesh at each point.
The Curvature filter computes the curvature at each point in a polygonal data set. This filter supports both Gaussian and mean curvatures; the type can be selected from the Curvature type menu button.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the input to the Curvature filter.
|
|
Accepts input of following types:
* vtkPolyData
|-
|'''InvertMeanCurvature''' (InvertMeanCurvature)
|
If this property is set to 1, the mean curvature calculation will be inverted. This is useful for meshes with inward-pointing normals.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''CurvatureType''' (CurvatureType)
|
This property specifies which type of curvature to compute.
|
0
|
The value(s) is an enumeration of the following:
* Gaussian (0)
* Mean (1)
|}
 
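A minimal paraview.simple sketch; the high-resolution sphere is only a convenient polygonal test surface.

<pre>
from paraview.simple import *

sphere = Sphere(ThetaResolution=64, PhiResolution=64)

curvature = Curvature(Input=sphere)
curvature.CurvatureType = 'Mean'
curvature.InvertMeanCurvature = 0

Show(curvature)
Render()
</pre>
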
==D3==

Repartition a data set into load-balanced spatially convex regions. Create ghost cells if requested.
The D3 filter is available when ParaView is run in parallel. It operates on any type of data set to evenly divide it across the processors into spatially contiguous regions. The output of this filter is of type unstructured grid.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the input to the D3 filter.
|
|
Accepts input of following types:
* vtkDataSet
|-
|'''BoundaryMode''' (BoundaryMode)
|
This property determines how cells that lie on processor boundaries are handled. The "Assign cells uniquely" option assigns each boundary cell to exactly one process, which is useful for isosurfacing. Selecting "Duplicate cells" causes the cells on the boundaries to be copied to each process that shares that boundary. The "Divide cells" option breaks cells across process boundary lines so that pieces of the cell lie in different processes. This option is useful for volume rendering.
|
0
|
The value(s) is an enumeration of the following:
* Assign cells uniquely (0)
* Duplicate cells (1)
* Divide cells (2)
|-
|'''Minimal Memory''' (UseMinimalMemory)
|
If this property is set to 1, the D3 filter requires communication routines to use less memory than they would without this restriction.
|
0
|
Accepts boolean values (0 or 1).
|}

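A minimal paraview.simple sketch. The filter only has an effect when the script runs against a parallel pvserver; the Wavelet source stands in for real data.

<pre>
from paraview.simple import *

# Connect to a parallel server first, e.g. Connect('myserver'); on a single
# process D3 simply passes the data through.
wavelet = Wavelet()

d3 = D3(Input=wavelet)
d3.BoundaryMode = 'Assign cells uniquely'

Show(d3)
Render()
</pre>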

==Decimate==

Simplify a polygonal model using an adaptive edge collapse algorithm. This filter works with triangles only.
The Decimate filter reduces the number of triangles in a polygonal data set. Because this filter only operates on triangles, first run the Triangulate filter on a dataset that contains polygons other than triangles.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the input to the Decimate filter.
|
|
Accepts input of following types:
* vtkPolyData
|-
|'''TargetReduction''' (TargetReduction)
|
This property specifies the desired reduction in the total number of polygons in the output dataset. For example, if the TargetReduction value is 0.9, the Decimate filter will attempt to produce an output dataset that is 10% the size of the input.
|
0.9
|
The value must be greater than or equal to 0 and less than or equal to 1.
|-
|'''PreserveTopology''' (PreserveTopology)
|
If this property is set to 1, decimation will not split the dataset or produce holes, but it may keep the filter from reaching the reduction target. If it is set to 0, better reduction can occur (reaching the reduction target), but holes in the model may be produced.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''FeatureAngle''' (FeatureAngle)
|
The value of this property is used in determining where the data set may be split. If the angle between two adjacent triangles is greater than or equal to the FeatureAngle value, then their boundary is considered a feature edge where the dataset can be split.
|
15.0
|
The value must be greater than or equal to 0 and less than or equal to 180.
|-
|'''BoundaryVertexDeletion''' (BoundaryVertexDeletion)
|
If this property is set to 1, then vertices on the boundary of the dataset can be removed. Setting the value of this property to 0 preserves the boundary of the dataset, but it may cause the filter not to reach its reduction target.
|
1
|
Accepts boolean values (0 or 1).
|}
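
A minimal paraview.simple sketch; the dense sphere is just a convenient all-triangle input.

<pre>
from paraview.simple import *

sphere = Sphere(ThetaResolution=64, PhiResolution=64)

decimate = Decimate(Input=sphere)
decimate.TargetReduction = 0.9    # aim for roughly 10% of the input triangles
decimate.PreserveTopology = 1
decimate.BoundaryVertexDeletion = 0

Show(decimate)
Render()
</pre>
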
==Delaunay 2D==

Create 2D Delaunay triangulation of input points. It expects a vtkPointSet as input and produces vtkPolyData as output. The points are expected to be in a mostly planar distribution.

Delaunay2D is a filter that constructs a 2D Delaunay triangulation from a list of input points. These points may be represented by any dataset of type vtkPointSet and subclasses. The output of the filter is a polygonal dataset containing a triangle mesh.

The 2D Delaunay triangulation is defined as the triangulation that satisfies the Delaunay criterion for n-dimensional simplexes (in this case n=2 and the simplexes are triangles). This criterion states that a circumsphere of each simplex in a triangulation contains only the n+1 defining points of the simplex. In two dimensions, this translates into an optimal triangulation. That is, the maximum interior angle of any triangle is less than or equal to that of any possible triangulation.

Delaunay triangulations are used to build topological structures from unorganized (or unstructured) points. The input to this filter is a list of points specified in 3D, even though the triangulation is 2D. Thus the triangulation is constructed in the x-y plane, and the z coordinate is ignored (although carried through to the output). You can use the option ProjectionPlaneMode to compute the best-fitting plane to the set of points, project the points onto that plane, and then perform the triangulation using their projected positions.

The Delaunay triangulation can be numerically sensitive in some cases. To prevent problems, try to avoid injecting points that will result in triangles with bad aspect ratios (1000:1 or greater). In practice this means inserting points that are "widely dispersed", and enables smooth transition of triangle sizes throughout the mesh. (You may even want to add extra points to create a better point distribution.) If numerical problems are present, you will see a warning message to this effect at the end of the triangulation process.

Warning:
Points arranged on a regular lattice (termed degenerate cases) can be triangulated in more than one way (at least according to the Delaunay criterion). The choice of triangulation (as implemented by this algorithm) depends on the order of the input points. The first three points will form a triangle; other degenerate points will not break this triangle.

Points that are coincident (or nearly so) may be discarded by the algorithm. This is because the Delaunay triangulation requires unique input points. The output of the Delaunay triangulation is supposedly a convex hull. In certain cases this implementation may not generate the convex hull.


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 1,726: Line 1,931:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Alpha'''<br>''(Alpha)''
|'''Input''' (Input)
|
 
This property specifies the input dataset to the Delaunay 2D filter.
|
|
This property specifies the alpha (or distance) value to control
the output of this filter.  For a non-zero alpha value, only
edges, faces, or tetra contained within the circumsphere (of
radius alpha) will be output.  Otherwise, only tetrahedra will be
output.


| 0
|
|
The value must be greater than or equal to 0.
Accepts input of following types:
* vtkPointSet
|-
|'''ProjectionPlaneMode''' (ProjectionPlaneMode)
|


This property determines type of projection plane to use in performing the triangulation.


|
0
|
The value(s) is an enumeration of the following:
* XY Plane (0)
* Best-Fitting Plane (2)
|-
|-
| '''Bounding Triangulation'''<br>''(BoundingTriangulation)''
|'''Alpha''' (Alpha)
|
|
This boolean controls whether bounding triangulation points (and
associated triangles) are included in the output. (These are
introduced as an initial triangulation to begin the triangulation
process. This feature is nice for debugging output.)


| 0
The value of this property controls the output of this filter. For a non-zero alpha value, only edges or triangles contained within a sphere centered at mesh vertices will be output. Otherwise, only triangles will be output.
 
|
0.0
|
|
Only the values 0 and 1 are accepted.


|-
|-
| '''Input'''<br>''(Input)''
|'''Tolerance''' (Tolerance)
|
|
This property specifies the input dataset to the Delaunay 3D filter.
 
This property specifies a tolerance to control discarding of closely spaced points. This tolerance is specified as a fraction of the diagonal length of the bounding box of the points.


|
|
0.00001
|
|
The selected object must be the result of the following: sources (includes readers), filters.


|-
|'''Offset''' (Offset)
|


The selected dataset must be one of the following types (or a subclass of one of them): vtkPointSet.
This property is a multiplier to control the size of the initial, bounding Delaunay triangulation.


|
1.0
|


|-
|-
| '''Offset'''<br>''(Offset)''
|'''BoundingTriangulation''' (BoundingTriangulation)
|
|
This property specifies a multiplier to control the size of the
initial, bounding Delaunay triangulation.


| 2.5
If this property is set to 1, bounding triangulation points (and associated triangles) are included in the output. These are introduced as an initial triangulation to begin the triangulation process. This feature is nice for debugging output.
 
|
0
|
|
The value must be greater than or equal to 2.5.
Accepts boolean values (0 or 1).


|}

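A minimal paraview.simple sketch; the Point Source is just one convenient way to produce a small vtkPointSet to triangulate.

<pre>
from paraview.simple import *

# A cloud of random points (a vtkPolyData, which is a vtkPointSet).
points = PointSource(NumberOfPoints=200, Radius=2.0)

delaunay = Delaunay2D(Input=points)
delaunay.ProjectionPlaneMode = 'Best-Fitting Plane'
delaunay.Tolerance = 0.00001
delaunay.Alpha = 0.0   # 0 keeps the full triangulation

Show(delaunay)
Render()
</pre>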

==Delaunay 3D==

Create a 3D Delaunay triangulation of input points. It expects a vtkPointSet as input and produces vtkUnstructuredGrid as output.

Delaunay3D is a filter that constructs a 3D Delaunay triangulation from a list of input points. These points may be represented by any dataset of type vtkPointSet and subclasses. The output of the filter is an unstructured grid dataset. Usually the output is a tetrahedral mesh, but if a non-zero alpha distance value is specified (called the "alpha" value), then only tetrahedra, triangles, edges, and vertices lying within the alpha radius are output. In other words, non-zero alpha values may result in arbitrary combinations of tetrahedra, triangles, lines, and vertices. (The notion of alpha value is derived from Edelsbrunner's work on "alpha shapes".)

The 3D Delaunay triangulation is defined as the triangulation that satisfies the Delaunay criterion for n-dimensional simplexes (in this case n=3 and the simplexes are tetrahedra). This criterion states that a circumsphere of each simplex in a triangulation contains only the n+1 defining points of the simplex. (See text for more information.) While in two dimensions this translates into an "optimal" triangulation, this is not true in 3D, since a measurement for optimality in 3D is not agreed on.

Delaunay triangulations are used to build topological structures from unorganized (or unstructured) points. The input to this filter is a list of points specified in 3D. (If you wish to create 2D triangulations see Delaunay2D.) The output is an unstructured grid.

The Delaunay triangulation can be numerically sensitive. To prevent problems, try to avoid injecting points that will result in triangles with bad aspect ratios (1000:1 or greater). In practice this means inserting points that are "widely dispersed", and enables smooth transition of triangle sizes throughout the mesh. (You may even want to add extra points to create a better point distribution.) If numerical problems are present, you will see a warning message to this effect at the end of the triangulation process.

Warning:
Points arranged on a regular lattice (termed degenerate cases) can be triangulated in more than one way (at least according to the Delaunay criterion). The choice of triangulation (as implemented by this algorithm) depends on the order of the input points. The first four points will form a tetrahedron; other degenerate points (relative to this initial tetrahedron) will not break it.

Points that are coincident (or nearly so) may be discarded by the algorithm. This is because the Delaunay triangulation requires unique input points. You can control the definition of coincidence with the "Tolerance" instance variable.

The output of the Delaunay triangulation is supposedly a convex hull. In certain cases this implementation may not generate the convex hull. This behavior can be controlled by the Offset instance variable. Offset is a multiplier used to control the size of the initial triangulation. The larger the offset value, the more likely you will generate a convex hull; and the more likely you are to see numerical problems.

The implementation of this algorithm varies from the 2D Delaunay algorithm (i.e., Delaunay2D) in an important way. When points are injected into the triangulation, the search for the enclosing tetrahedron is quite different. In the 3D case, the closest previously inserted point is found, and then the connected tetrahedra are searched to find the containing one. (In 2D, a "walk" towards the enclosing triangle is performed.) If the triangulation is Delaunay, then an enclosing tetrahedron will be found. However, in degenerate cases an enclosing tetrahedron may not be found and the point will be rejected.

Line 1,811: Line 2,077:
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Attribute Mode'''<br>''(AttributeMode)''
|'''Input''' (Input)
|
|
Specify which type of field data the arrays will be drawn from.


| 0
This property specifies the input dataset to the Delaunay 3D filter.
 
|
|
Valud array names will be chosen from point and cell data.


|
Accepts input of following types:
* vtkPointSet
|-
|-
| '''Input'''<br>''(Input)''
|'''Alpha''' (Alpha)
|
|
The input to the filter. Arrays from this dataset will be used for computing statistics and/or assessed by a statistical model.
 
This property specifies the alpha (or distance) value to control
the output of this filter. For a non-zero alpha value, only
edges, faces, or tetra contained within the circumsphere (of
radius alpha) will be output. Otherwise, only tetrahedra will be
output.


|
|
0.0
|
|
The selected object must be the result of the following: sources (includes readers), filters.


|-
|'''Tolerance''' (Tolerance)
|


The dataset must contain a point or cell array.
This property specifies a tolerance to control discarding of
closely spaced points. This tolerance is specified as a fraction
of the diagonal length of the bounding box of the points.


The selected dataset must be one of the following types (or a subclass of one of them): vtkImageData, vtkStructuredGrid, vtkPolyData, vtkUnstructuredGrid, vtkTable, vtkGraph.
|-
| '''Model Input'''<br>''(ModelInput)''
|
|
A previously-calculated model with which to assess a separate dataset. This input is optional.
0.001
 
|
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected dataset must be one of the following types (or a subclass of one of them): vtkTable, vtkMultiBlockDataSet.


|-
|-
| '''Variables of Interest'''<br>''(SelectArrays)''
|'''Offset''' (Offset)
|
Choose arrays whose entries will be used to form observations for statistical analysis.
 
|
|
|
An array of scalars is required.


This property specifies a multiplier to control the size of the
initial, bounding Delaunay triangulation.


|-
| '''Deviations should be'''<br>''(SignedDeviations)''
|
|
Should the assessed values be signed deviations or unsigned?
2.5
 
| 0
|
|
The value must be one of the following: Unsigned (0), Signed (1).


|-
|-
| '''Task'''<br>''(Task)''
|'''BoundingTriangulation''' (BoundingTriangulation)
|
Specify the task to be performed: modeling and/or assessment.
#  "Detailed model of input data," creates a set of output tables containing a calculated statistical model of the '''entire''' input dataset;
#  "Model a subset of the data," creates an output table (or tables) summarizing a '''randomly-chosen subset''' of the input dataset;
#  "Assess the data with a model," adds attributes to the first input dataset using a model provided on the second input port; and
#  "Model and assess the same data," is really just operations 2 and 3 above applied to the same input dataset.  The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.
 
When the task includes creating a model (i.e., tasks 2 and 4), you may adjust the fraction of the input dataset used for training.  You should avoid using a large fraction of the input data for training as you will then not be able to detect overfitting.  The ''Training fraction'' setting will be ignored for tasks 1 and 3.
 
| 3
|
|
The value must be one of the following: Detailed model of input data (0), Model a subset of the data (1), Assess the data with a model (2), Model and assess the same data (3).


This boolean controls whether bounding triangulation points (and
associated triangles) are included in the output. (These are
introduced as an initial triangulation to begin the triangulation
process. This feature is nice for debugging output.)


|-
| '''Training Fraction'''<br>''(TrainingFraction)''
|
|
Specify the fraction of values from the input dataset to be used for model fitting. The exact set of values is chosen at random from the dataset.
0
 
| 0.1
|
|
The value must be greater than or equal to 0 and less than or equal to 1.
Accepts boolean values (0 or 1).
 


|}
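The Delaunay 3D properties above can also be set from ParaView's Python shell. The following is a minimal sketch, assuming the <code>paraview.simple</code> module; the point source is used purely for illustration, and the attribute names mirror the XML property names listed in the table (Alpha, Tolerance, Offset, BoundingTriangulation).

<source lang="python">
# Minimal sketch (assumes paraview.simple; PointSource is only an example input).
from paraview.simple import *

points = PointSource(NumberOfPoints=200, Radius=1.0)

delaunay = Delaunay3D(Input=points)
delaunay.Alpha = 0.0                  # 0.0: output only tetrahedra (no alpha shapes)
delaunay.Tolerance = 0.001            # coincidence tolerance, as a fraction of the bounding-box diagonal
delaunay.Offset = 2.5                 # multiplier for the initial bounding triangulation
delaunay.BoundingTriangulation = 0    # drop the bounding points/tetrahedra from the output

Show(delaunay)
Render()
</source>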


==Descriptive Statistics==

Compute a statistical model of a dataset and/or assess the dataset with a statistical model.

This filter either computes a statistical model of a dataset or takes such a model as its second input. Then, the model (however it is obtained) may optionally be used to assess the input dataset.

This filter computes the min, max, mean, raw moments M2 through M4, standard deviation, skewness, and kurtosis for each array you select.

The model is simply a univariate Gaussian distribution with the mean and standard deviation provided. Data is assessed using this model by detrending the data (i.e., subtracting the mean) and then dividing by the standard deviation. Thus the assessment is an array whose entries are the number of standard deviations from the mean that each input point lies.

==Elevation==

Create point attribute array by projecting points onto an elevation vector.

The Elevation filter generates point scalar values for an input dataset along a specified direction vector.

The Input menu allows the user to select the data set to which this filter will be applied. Use the Scalar range entry boxes to specify the minimum and maximum scalar value to be generated. The Low Point and High Point define a line onto which each point of the data set is projected. The minimum scalar value is associated with the Low Point, and the maximum scalar value is associated with the High Point. The scalar value for each point in the data set is determined by the location along the line to which that point projects.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''High Point'''<br>''(HighPoint)''
|'''Input''' (Input)
|
|
This property defines the other end of the direction vector (large scalar values).


| 0 0 1
The input to the filter. Arrays from this dataset will be used for computing statistics and/or assessed by a statistical model.
 
|
|
The coordinate must lie within the bounding box of the dataset. It will default to the maximum in each dimension.


|
Accepts input of following types:
* vtkImageData
* vtkStructuredGrid
* vtkPolyData
* vtkUnstructuredGrid
* vtkTable
* vtkGraph
The dataset must contain a field array.


|-
|-
| '''Input'''<br>''(Input)''
|'''ModelInput''' (ModelInput)
|
|
This property specifies the input dataset to the Elevation filter.
 
A previously-calculated model with which to assess a separate dataset. This input is optional.


|
|
|
Accepts input of following types:
* vtkTable
* vtkMultiBlockDataSet
|-
|'''SelectArrayInfo''' (SelectArrayInfo)
|
|
The selected object must be the result of the following: sources (includes readers), filters.


|


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
|


|-
|'''AttributeMode''' (AttributeMode)
|


Specify which type of field data the arrays will be drawn from.
|
0
|
The value must be a field array name.
|-
|-
| '''Low Point'''<br>''(LowPoint)''
|'''Variables of Interest''' (SelectArrays)
|
|
This property defines one end of the direction vector (small scalar values).


| 0 0 0
Choose arrays whose entries will be used to form observations for statistical analysis.
 
|
|
The coordinate must lie within the bounding box of the dataset. It will default to the minimum in each dimension.


|


|-
|-
| '''Scalar Range'''<br>''(ScalarRange)''
|'''Task''' (Task)
|
|
This property determines the range into which scalars will be mapped.


| 0 1
Specify the task to be performed: modeling and/or assessment.
# "Detailed model of input data," creates a set of output tables containing a calculated statistical model of the '''entire''' input dataset;
# "Model a subset of the data," creates an output table (or tables) summarizing a '''randomly-chosen subset''' of the input dataset;
# "Assess the data with a model," adds attributes to the first input dataset using a model provided on the second input port; and
# "Model and assess the same data," is really just operations 2 and 3 above applied to the same input dataset. The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.
When the task includes creating a model (i.e., tasks 2 and 4), you may adjust the fraction of the input dataset used for training. You should avoid using a large fraction of the input data for training as you will then not be able to detect overfitting. The ''Training fraction'' setting will be ignored for tasks 1 and 3.
 
|
3
|
The value(s) is an enumeration of the following:
* Detailed model of input data (0)
* Model a subset of the data (1)
* Assess the data with a model (2)
* Model and assess the same data (3)
|-
|'''TrainingFraction''' (TrainingFraction)
|
|
|}


Specify the fraction of values from the input dataset to be used for model fitting. The exact set of values is chosen at random from the dataset.


==Extract AMR Blocks==
This filter extracts a list of datasets from hierarchical datasets.
This filter extracts a list of datasets from hierarchical datasets.<br>
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
|
This property specifies the input to the Extract Datasets filter.
0.1
 
|
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected dataset must be one of the following types (or a subclass of one of them): vtkHierarchicalBoxDataSet.


|-
|-
| '''Selected Data Sets'''<br>''(SelectedDataSets)''
|'''Deviations should be''' (SignedDeviations)
|
|
This property provides a list of datasets to extract.
 
Should the assessed values be signed deviations or unsigned?


|
|
0
|
|
The value(s) is an enumeration of the following:
* Unsigned (0)
* Signed (1)
|}
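As an illustration of the Elevation properties documented above, the following is a minimal sketch, assuming the <code>paraview.simple</code> Python module; it projects the points of a sphere onto the z axis.

<source lang="python">
# Minimal sketch (assumes paraview.simple). LowPoint/HighPoint define the
# projection line; ScalarRange gives the values mapped onto its two ends.
from paraview.simple import *

sphere = Sphere(Radius=1.0)

elevation = Elevation(Input=sphere)
elevation.LowPoint = [0.0, 0.0, -1.0]   # receives ScalarRange[0]
elevation.HighPoint = [0.0, 0.0, 1.0]   # receives ScalarRange[1]
elevation.ScalarRange = [0.0, 1.0]

Show(elevation)
Render()
</source>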


==Elevation==

Create point attribute array by projecting points onto an elevation vector.

The Elevation filter generates point scalar values for an input dataset along a specified direction vector.

The Input menu allows the user to select the data set to which this filter will be applied. Use the Scalar range entry boxes to specify the minimum and maximum scalar value to be generated. The Low Point and High Point define a line onto which each point of the data set is projected. The minimum scalar value is associated with the Low Point, and the maximum scalar value is associated with the High Point. The scalar value for each point in the data set is determined by the location along the line to which that point projects.

==Extract Block==

This filter extracts a range of blocks from a multiblock dataset.

This filter extracts a range of groups from a multiblock dataset.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Block Indices'''<br>''(BlockIndices)''
|'''Input''' (Input)
|
|
This property lists the ids of the blocks to extract
from the input multiblock dataset.


This property specifies the input dataset to the Elevation filter.
|
|
|
|
Accepts input of following types:
* vtkDataSet
|-
|-
| '''Input'''<br>''(Input)''
|'''ScalarRange''' (ScalarRange)
|
|
This property specifies the input to the Extract Group filter.


This property determines the range into which scalars will be mapped.
|
|
0 1
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected dataset must be one of the following types (or a subclass of one of them): vtkMultiBlockDataSet.


|-
|-
| '''Maintain Structure'''<br>''(MaintainStructure)''
|'''Low Point''' (LowPoint)
|
|
This is used only when PruneOutput is ON. By default, when pruning the
output (i.e., removing empty blocks), if a node has only one non-null child
block, then that node is removed. To preserve these parent nodes, set
this flag to true.


| 0
This property defines one end of the direction vector (small scalar values).
|
0 0 0
|
|
Only the values 0 and 1 are accepted.


The value must lie within the bounding box of the dataset.
It will default to the min in each dimension.


|-
|-
| '''Prune Output'''<br>''(PruneOutput)''
|'''High Point''' (HighPoint)
|
|
When set, the output multiblock dataset will be pruned to remove empty
nodes. On by default.


| 1
This property defines the other end of the direction vector (large scalar values).
|
0 0 1
|
|
Only the values 0 and 1 are accepted.


The value must lie within the bounding box of the dataset.
It will default to the max in each dimension.


|}


==Extract CTH Parts==

Create a surface from a CTH volume fraction.

Extract CTH Parts is a specialized filter for visualizing the data from a CTH simulation. It first converts the selected cell-centered arrays to point-centered ones. It then contours each array at a value of 0.5. The user has the option of clipping the resulting surface(s) with a plane. This filter only operates on unstructured data. It produces polygonal output.


==Extract AMR Blocks==

This filter extracts a list of datasets from hierarchical datasets.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Double Volume Arrays'''<br>''(AddDoubleVolumeArrayName)''
|'''Input''' (Input)
|
|
This property specifies the name(s) of the volume fraction array(s) for generating parts.


This property specifies the input to the Extract Datasets filter.
|
|
|
|
An array of scalars is required.
Accepts input of following types:
 
* vtkHierarchicalBoxDataSet
 
|-
|-
| '''Float Volume Arrays'''<br>''(AddFloatVolumeArrayName)''
|'''SelectedDataSets''' (SelectedDataSets)
|
|
This property specifies the name(s) of the volume fraction array(s) for generating parts.


This property provides a list of datasets to extract.
|
|
|
|
An array of scalars is required.




|-
|}
| '''Unsigned Character Volume Arrays'''<br>''(AddUnsignedCharVolumeArrayName)''
|
This property specifies the name(s) of the volume fraction array(s) for generating parts.
|
|
An array of scalars is required.


==Extract Attributes==

Extract attribute data as a table.
This is a filter that produces a vtkTable from the chosen attribute in
the input dataobject. This filter can accept composite datasets. If the
input is a composite dataset, the output is a multiblock with vtkTable
leaves.




{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Clip Type'''<br>''(ClipPlane)''
| '''Property'''
|
| '''Description'''
This property specifies whether to clip the dataset, and if so, it also specifies the parameters of the plane with which to clip.
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
|
This property specifies the input of the filter.
|
|
The value must be set to one of the following: None, Plane, Box, Sphere.


|
Accepts input of following types:
* vtkDataObject
|-
|-
| '''Input'''<br>''(Input)''
|'''FieldAssociation''' (FieldAssociation)
|
|
This property specifies the input to the Extract CTH Parts filter.


Select the attribute data to pass.
|
|
0
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The value(s) is an enumeration of the following:
 
* Points (0)
 
* Cells (1)
The dataset must contain a cell array with 1 component.
* Field Data (2)
 
* Vertices (4)
 
* Edges (5)
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
* Rows (6)
 
 
|-
|-
| '''Volume Fraction Value'''<br>''(VolumeFractionSurfaceValue)''
|'''AddMetaData''' (AddMetaData)
|
|
The value of this property is the volume fraction value for the surface.


| 0.1
It is possible for this filter to add additional meta-data to the
field data such as point coordinates (when point attributes are
selected and input is pointset) or structured coordinates etc. To
enable this addition of extra information, turn this flag on. Off by
default.
|
0
|
|
The value must be greater than or equal to 0 and less than or equal to 1.
Accepts boolean values (0 or 1).
 


|}


==Extract Block==

This filter extracts a range of blocks from a multiblock dataset.

This filter extracts a range of groups from a multiblock dataset.


==Extract Cells By Region==

This filter extracts cells that are inside/outside a region or at a region boundary.

This filter extracts from its input dataset all cells that are either completely inside or outside of a specified region (implicit function). On output, the filter generates an unstructured grid.
To use this filter you must specify a region (implicit function). You must also specify whether to extract cells lying inside or outside of the region. An option exists to extract cells that are neither inside nor outside (i.e., on the boundary).


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Extract intersected'''<br>''(Extract intersected)''
|'''Input''' (Input)
|
|
This parameter controls whether to extract cells that are on the boundary of the region.


| 0
This property specifies the input to the Extract Group filter.
|
|
Only the values 0 and 1 are accepted.


|
Accepts input of following types:
* vtkMultiBlockDataSet
|-
|-
| '''Extract only intersected'''<br>''(Extract only intersected)''
|'''BlockIndices''' (BlockIndices)
|
|
This parameter controls whether to extract only cells that are on the boundary of the region. If this parameter is set, the Extraction Side parameter is ignored. If Extract Intersected is off, this parameter has no effect.


| 0
This property lists the ids of the blocks to extract
|
from the input multiblock dataset.
Only the values 0 and 1 are accepted.


|-
| '''Extraction Side'''<br>''(ExtractInside)''
|
|
This parameter controls whether to extract cells that are inside or outside the region.


| 1
|
|
The value must be one of the following: outside (0), inside (1).


|-
|-
| '''Intersect With'''<br>''(ImplicitFunction)''
|'''PruneOutput''' (PruneOutput)
|
|
This property sets the region used to extract cells.


When set, the output multiblock dataset will be pruned to remove empty
nodes. On by default.
|
|
1
|
|
The value must be set to one of the following: Plane, Box, Sphere.
Accepts boolean values (0 or 1).
 
 
|-
|-
| '''Input'''<br>''(Input)''
|'''MaintainStructure''' (MaintainStructure)
|
|
This property specifies the input to the Slice filter.


This is used only when PruneOutput is ON. By default, when pruning the
output (i.e., removing empty blocks), if a node has only one non-null child
block, then that node is removed. To preserve these parent nodes, set
this flag to true.
|
|
0
|
|
The selected object must be the result of the following: sources (includes readers), filters.
Accepts boolean values (0 or 1).
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
 


|}
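For the Extract Block properties documented above, a minimal Python sketch (assuming the <code>paraview.simple</code> module; the file name is a placeholder) might look like this:

<source lang="python">
# Minimal sketch (assumes paraview.simple; 'multiblock.vtm' is a hypothetical file).
from paraview.simple import *

reader = OpenDataFile('multiblock.vtm')

blocks = ExtractBlock(Input=reader)
blocks.BlockIndices = [1, 2]     # flat composite indices of the blocks to keep
blocks.PruneOutput = 1           # remove empty nodes from the output tree
blocks.MaintainStructure = 0     # allow single-child parent nodes to collapse

Show(blocks)
Render()
</source>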


==Extract CTH Parts==

Create a surface from a CTH volume fraction.

Extract CTH Parts is a specialized filter for visualizing the data from a CTH simulation. It first converts the selected cell-centered arrays to point-centered ones. It then contours each array at a value of 0.5. The user has the option of clipping the resulting surface(s) with a plane. This filter only operates on unstructured data. It produces polygonal output.


==Extract Edges==

Extract edges of 2D and 3D cells as lines.

The Extract Edges filter produces a wireframe version of the input dataset by extracting all the edges of the dataset's cells as lines. This filter operates on any type of data set and produces polygonal output.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
|
This property specifies the input to the Extract Edges filter.


This property specifies the input to the Extract CTH Parts filter.
|
|
|
|
The selected object must be the result of the following: sources (includes readers), filters.
Accepts input of following types:
* vtkDataSet
The dataset must contain a field array (cell)


with 1 component(s).


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
|-
|'''Clip Type''' (ClipPlane)
|


This property specifies whether to clip the dataset, and if so, it also specifies the parameters of the plane with which to clip.
|


|}
|
The value can be one of the following:
* None (implicit_functions)


* Plane (implicit_functions)


==Extract Generic Dataset Surface==
* Box (implicit_functions)


* Sphere (implicit_functions)


Extract geometry from a higher-order dataset
|-
|'''Double Volume Arrays''' (AddDoubleVolumeArrayName)
|


Extract geometry from a higher-order dataset.<br>
This property specifies the name(s) of the volume fraction array(s) for generating parts.
|


{| class="PropertiesTable" border="1" cellpadding="5"
|
An array of scalars is required.
|-
|-
| '''Property'''
|'''Float Volume Arrays''' (AddFloatVolumeArrayName)
| '''Description'''
|
| '''Default Value(s)'''
 
| '''Restrictions'''
This property specifies the name(s) of the volume fraction array(s) for generating parts.
|-
| '''Input'''<br>''(Input)''
|
|
Set the input to the Generic Geometry Filter.


|
|
An array of scalars is required.
|-
|'''Unsigned Character Volume Arrays''' (AddUnsignedCharVolumeArrayName)
|
|
The selected object must be the result of the following: sources (includes readers), filters.


This property specifies the name(s) of the volume fraction array(s) for generating parts.
|


The selected dataset must be one of the following types (or a subclass of one of them): vtkGenericDataSet.
|
An array of scalars is required.
|-
|'''Volume Fraction Value''' (VolumeFractionSurfaceValue)
|


The value of this property is the volume fraction value for the surface.


|-
| '''Pass Through Cell Ids'''<br>''(PassThroughCellIds)''
|
|
Select whether to forward original ids.
0.1
 
| 1
|
|
Only the values 0 and 1 are accepted.




|}
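The Extract Edges filter needs no parameters beyond its input. A minimal sketch, assuming the <code>paraview.simple</code> module and using the Wavelet source as a stand-in for real data:

<source lang="python">
# Minimal sketch (assumes paraview.simple). Every cell edge of the input
# becomes a line cell in the polygonal output.
from paraview.simple import *

data = Wavelet()
edges = ExtractEdges(Input=data)

Show(edges)
Render()
</source>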


==Extract Cells By Region==

This filter extracts cells that are inside/outside a region or at a region boundary.

This filter extracts from its input dataset all cells that are either completely inside or outside of a specified region (implicit function). On output, the filter generates an unstructured grid.
To use this filter you must specify a region (implicit function). You must also specify whether to extract cells lying inside or outside of the region. An option exists to extract cells that are neither inside nor outside (i.e., on the boundary).


==Extract Level==

This filter extracts a range of groups from a hierarchical dataset.
This filter extracts a range of levels from a hierarchical dataset.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
|
This property specifies the input to the Extract Group filter.


This property specifies the input to the Slice filter.
|
|
|
Accepts input of following types:
* vtkDataSet
|-
|'''Intersect With''' (ImplicitFunction)
|
|
The selected object must be the result of the following: sources (includes readers), filters.


This property sets the region used to extract cells.
|
|
The value can be one of the following:
* Plane (implicit_functions)


The selected dataset must be one of the following types (or a subclass of one of them): vtkHierarchicalBoxDataSet.
* Box (implicit_functions)


* Sphere (implicit_functions)


|-
|-
| '''Levels'''<br>''(Levels)''
|'''InputBounds''' (InputBounds)
|
|
This property lists the levels to extract
from the input hierarchical dataset.


|
|
|
|
|}


|-
|'''Extraction Side''' (ExtractInside)
|


==Extract Selection==
This parameter controls whether to extract cells that are inside or outside the region.


|
1
|
The value(s) is an enumeration of the following:
* outside (0)
* inside (1)
|-
|'''Extract only intersected''' (Extract only intersected)
|


Extract different type of selections.
This parameter controls whether to extract only cells that are on the boundary of the region. If this parameter is set, the Extraction Side parameter is ignored. If Extract Intersected is off, this parameter has no effect.
 
This filter extracts a set of cells/points given a selection.<br>
The selection can be obtained from a rubber-band selection<br>
(either cell, visible or in a frustum) or threshold selection<br>
and passed to the filter or specified by providing an ID list.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
|
This property specifies the input from which the selection is extracted.
0
 
|
|
Accepts boolean values (0 or 1).
|-
|'''Extract intersected''' (Extract intersected)
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet, vtkTable.


This parameter controls whether to extract cells that are on the boundary of the region.


|-
| '''Preserve Topology'''<br>''(PreserveTopology)''
|
|
If this property is set to 1 the output preserves the topology of its
input and adds an insidedness array to mark which cells are inside or
out. If 0 then the output is an unstructured grid which contains only
the subset of cells that are inside.
0
 
| 0
|
|
Only the values 0 and 1 are accepted.
Accepts boolean values (0 or 1).


|}


|-
==Extract Edges==
| '''Selection'''<br>''(Selection)''
|
The input that provides the selection object.


|
Extract edges of 2D and 3D cells as lines.
|
The Extract Edges filter produces a wireframe version of the input dataset by extracting all the edges of the dataset's cells as lines. This filter operates on any type of data set and produces polygonal output.
The selected object must be the result of the following: sources (includes readers), filters.




The selected dataset must be one of the following types (or a subclass of one of them): vtkSelection.
{| class="PropertiesTable" border="1" cellpadding="5"
 
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|-
| '''Show Bounds'''<br>''(ShowBounds)''
|'''Input''' (Input)
|
|
For frustum selection, if this property is set to 1 the output is the
outline of the frustum instead of the contents of the input that lie
within the frustum.


| 0
This property specifies the input to the Extract Edges filter.
|
|
Only the values 0 and 1 are accepted.


|
Accepts input of following types:
* vtkDataSet


|}
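The Extract Selection properties above are usually driven by an interactive selection, but a selection source can also be created in Python. The following is a sketch under the assumption that the <code>paraview.simple</code> module exposes <code>IDSelectionSource</code>, whose IDs list alternates process-id/cell-id pairs:

<source lang="python">
# Minimal sketch (assumes paraview.simple and the IDSelectionSource proxy).
from paraview.simple import *

data = Wavelet()

# Select three cells by (process id, cell id) pairs, all on process 0.
selection = IDSelectionSource(FieldType='CELL', IDs=[0, 10, 0, 11, 0, 12])

extracted = ExtractSelection(Input=data, Selection=selection)
extracted.PreserveTopology = 0   # output only the selected cells as an unstructured grid

Show(extracted)
Render()
</source>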


==Extract Generic Dataset Surface==

Extract geometry from a higher-order dataset.


==Extract Subset==

Extract a subgrid from a structured grid with the option of setting subsample strides.

The Extract Grid filter returns a subgrid of a structured input data set (uniform rectilinear, curvilinear, or nonuniform rectilinear). The output data set type of this filter is the same as the input type.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Include Boundary'''<br>''(IncludeBoundary)''
|'''Input''' (Input)
|
|
If the value of this property is 1, then if the sample rate in any dimension is greater than 1, the boundary indices of the input dataset will be passed to the output even if the boundary extent is not an even multiple of the sample rate in a given dimension.


| 0
Set the input to the Generic Geometry Filter.
|
|
Only the values 0 and 1 are accepted.


|
Accepts input of following types:
* vtkGenericDataSet
|-
|-
| '''Input'''<br>''(Input)''
|'''PassThroughCellIds''' (PassThroughCellIds)
|
|
This property specifies the input to the Extract Grid filter.
 
Select whether to forward original ids.


|
|
1
|
|
The selected object must be the result of the following: sources (includes readers), filters.
Accepts boolean values (0 or 1).
 
|}


==Extract Level==


The selected dataset must be one of the following types (or a subclass of one of them): vtkImageData, vtkRectilinearGrid, vtkStructuredPoints, vtkStructuredGrid.
This filter extracts a range of groups from a hierarchical dataset.
This filter extracts a range of levels from a hierarchical dataset




{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Sample Rate I'''<br>''(SampleRateI)''
| '''Property'''
|
| '''Description'''
This property indicates the sampling rate in the I dimension. A value greater than 1 results in subsampling; every nth index will be included in the output.
| '''Default Value(s)'''
 
| '''Restrictions'''
| 1
|
The value must be greater than or equal to 1.
 


|-
|-
| '''Sample Rate J'''<br>''(SampleRateJ)''
|'''Input''' (Input)
|
|
This property indicates the sampling rate in the J dimension. A value greater than 1 results in subsampling; every nth index will be included in the output.


| 1
This property specifies the input to the Extract Group filter.
|
|
The value must be greater than or equal to 1.


|
Accepts input of following types:
* vtkHierarchicalBoxDataSet
|-
|-
| '''Sample Rate K'''<br>''(SampleRateK)''
|'''Levels''' (Levels)
|
This property indicates the sampling rate in the K dimension. A value greater than 1 results in subsampling; every nth index will be included in the output.
 
| 1
|
|
The value must be greater than or equal to 1.


This property lists the levels to extract
from the input hierarchical dataset.


|-
| '''V OI'''<br>''(VOI)''
|
|
This property specifies the minimum and maximum point indices along each of the I, J, and K axes; these values indicate the volume of interest (VOI). The output will have the (I,J,K) extent specified here.


| 0 0 0 0 0 0
|
|
The values must lie within the extent of the input dataset.




|}
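A minimal sketch of the Extract Subset (Extract Grid) properties documented above, assuming the <code>paraview.simple</code> module and using the Wavelet source as a stand-in uniform grid:

<source lang="python">
# Minimal sketch (assumes paraview.simple). VOI is (imin, imax, jmin, jmax, kmin, kmax).
from paraview.simple import *

grid = Wavelet()   # produces vtkImageData with extent -10..10 in each direction

subset = ExtractSubset(Input=grid)
subset.VOI = [0, 10, 0, 10, 0, 10]   # point indices of the volume of interest
subset.SampleRateI = 2               # keep every 2nd index along I
subset.SampleRateJ = 2
subset.SampleRateK = 1
subset.IncludeBoundary = 0

Show(subset)
Render()
</source>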


==Extract Selection==

Extract different types of selections.

This filter extracts a set of cells/points given a selection.
The selection can be obtained from a rubber-band selection
(either cell, visible or in a frustum) or threshold selection
and passed to the filter or specified by providing an ID list.


==Extract Surface==

Extract a 2D boundary surface using neighbor relations to eliminate internal faces.

The Extract Surface filter extracts the polygons forming the outer surface of the input dataset. This filter operates on any type of data and produces polygonal data as output.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
 
This property specifies the input from which the selection is extracted.
 
|
|
This property specifies the input to the Extract Surface filter.


|
|
Accepts input of following types:
* vtkDataSet
* vtkTable
|-
|'''Selection''' (Selection)
|
|
The selected object must be the result of the following: sources (includes readers), filters.


The input that provides the selection object.


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
|
 


|
Accepts input of following types:
* vtkSelection
|-
|-
| '''Nonlinear Subdivision Level'''<br>''(NonlinearSubdivisionLevel)''
|'''PreserveTopology''' (PreserveTopology)
|
|
If the input is an unstructured grid with nonlinear faces, this
parameter determines how many times the face is subdivided into
linear faces.  If 0, the output is the equivalent of its linear
counterpart (and the midpoints determining the nonlinear
interpolation are discarded).  If 1, the nonlinear face is
triangulated based on the midpoints.  If greater than 1, the
triangulated pieces are recursively subdivided to reach the
desired subdivision.  Setting the value to greater than 1 may
cause some point data to not be passed even if no quadratic faces
exist.  This option has no effect if the input is not an
unstructured grid.


| 1
If this property is set to 1 the output preserves the topology of its
input and adds an insidedness array to mark which cells are inside or
out. If 0 then the output is an unstructured grid which contains only
the subset of cells that are inside.
 
|
0
|
Accepts boolean values (0 or 1).
|-
|'''ShowBounds''' (ShowBounds)
|
|
The value must be greater than or equal to 0 and less than or equal to 4.


For frustum selection, if this property is set to 1 the output is the
outline of the frustum instead of the contents of the input that lie
within the frustum.


|-
| '''Piece Invariant'''<br>''(PieceInvariant)''
|
|
If the value of this property is set to 1, internal surfaces along process boundaries will be removed. NOTE: Enabling this option might cause multiple executions of the data source because more information is needed to remove internal surfaces.
0
 
| 1
|
|
Only the values 0 and 1 are accepted.
Accepts boolean values (0 or 1).
 


|}
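As a sketch of the Extract Surface properties documented above (assuming the <code>paraview.simple</code> module; the file name is a placeholder):

<source lang="python">
# Minimal sketch (assumes paraview.simple; 'mesh.vtu' is a hypothetical file).
from paraview.simple import *

reader = OpenDataFile('mesh.vtu')

surface = ExtractSurface(Input=reader)
surface.PieceInvariant = 1              # strip internal faces on process boundaries
surface.NonlinearSubdivisionLevel = 1   # triangulate quadratic faces using their midpoints

Show(surface)
Render()
</source>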


==Extract Selection (internal)==

This filter extracts a given set of cells or points given a selection.
The selection can be obtained from a rubber-band selection
(either point, cell, visible or in a frustum) and passed to the filter
or specified by providing an ID list.
This is an internal filter; use "ExtractSelection" instead.


==FFT Of Selection Over Time==

Extracts selection over time and plots the FFT.

Extracts the data of a selection (e.g. points or cells) over time,
takes the FFT of them, and plots them.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|'''Input''' (Input)
|
 
The input from which the selection is extracted.
 
|
|
The input from which the selection is extracted.


|
|
Accepts input of following types:
* vtkDataSet
|-
|'''Selection''' (Selection)
|
|
The selected object must be the result of the following: sources (includes readers), filters.


The input that provides the selection object.


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet, vtkTable, vtkCompositeDataSet.
|
 


|
Accepts input of following types:
* vtkSelection
|-
|-
| '''Selection'''<br>''(Selection)''
|'''PreserveTopology''' (PreserveTopology)
|
|
The input that provides the selection object.
 
If this property is set to 1 the output preserves the topology of its
input and adds an insidedness array to mark which cells are inside or
out. If 0 then the output is an unstructured grid which contains only
the subset of cells that are inside.


|
|
0
|
|
The selected object must be the result of the following: sources (includes readers), filters.
Accepts boolean values (0 or 1).
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkSelection.
 


|}


==Extract Subset==

Extract a subgrid from a structured grid with the option of setting subsample strides.

The Extract Grid filter returns a subgrid of a structured input data set (uniform rectilinear, curvilinear, or nonuniform rectilinear). The output data set type of this filter is the same as the input type.


==FOF/SOD Halo Finder==

Sorry, no help is currently available.




| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''bb (linking length)'''<br>''(BB)''
|'''Input''' (Input)
|
|
Linking length measured in units of interparticle spacing and is dimensionless.  Used to link particles into halos for the friends-of-friends (FOF) algorithm.


| 0.2
This property specifies the input to the Extract Grid filter.
|
|
The value must be greater than or equal to 0.


|
Accepts input of following types:
* vtkImageData
* vtkRectilinearGrid
* vtkStructuredPoints
* vtkStructuredGrid
|-
|-
| '''Compute the most bound particle'''<br>''(ComputeMostBoundParticle)''
|'''VOI''' (VOI)
|
|
If checked, the most bound particle for an FOF halo will be calculated.  WARNING: This can be very slow.


| 0
This property specifies the minimum and maximum point indices along each of the I, J, and K axes; these values indicate the volume of interest (VOI). The output will have the (I,J,K) extent specified here.
 
|
0 0 0 0 0 0
|
The value(s) must lie within the structured-extents of the input dataset.
|-
|'''SampleRateI''' (SampleRateI)
|
|
Only the values 0 and 1 are accepted.


This property indicates the sampling rate in the I dimension. A value greater than 1 results in subsampling; every nth index will be included in the output.


|-
| '''Compute the most connected particle'''<br>''(ComputeMostConnectedParticle)''
|
|
If checked, the most connected particle for an FOF halo will be calculated.  WARNING: This can be very slow.
1
|


| 0
|-
|'''SampleRateJ''' (SampleRateJ)
|
|
Only the values 0 and 1 are accepted.


This property indicates the sampling rate in the J dimension. A value greater than 1 results in subsampling; every nth index will be included in the output.


|-
| '''Compute spherical overdensity (SOD) halos'''<br>''(ComputeSOD)''
|
|
If checked, spherical overdensity (SOD) halos will be calculated in addition to friends-of-friends (FOF) halos.
1
|


| 0
|-
|'''SampleRateK''' (SampleRateK)
|
|
Only the values 0 and 1 are accepted.


This property indicates the sampling rate in the K dimension. A value greater than 1 results in subsampling; every nth index will be included in the output.


|-
| '''Copy FOF halo catalog to original particles'''<br>''(CopyHaloDataToParticles)''
|
|
If checked, the friends-of-friends (FOF) halo catalog information will be copied to the original particles as well.
1
|


| 0
|-
|'''IncludeBoundary''' (IncludeBoundary)
|
|
Only the values 0 and 1 are accepted.


If the value of this property is 1, then if the sample rate in any dimension is greater than 1, the boundary indices of the input dataset will be passed to the output even if the boundary extent is not an even multiple of the sample rate in a given dimension.


|-
| '''Input'''<br>''(Input)''
| This property specifies the input of the filter.
|
|
0
|
|
The selected object must be the result of the following: sources (includes readers), filters.
Accepts boolean values (0 or 1).
 
|}


==Extract Surface==


The selected dataset must be one of the following types (or a subclass of one of them): vtkUnstructuredGrid.
Extract a 2D boundary surface using neighbor relations to eliminate internal faces.
The Extract Surface filter extracts the polygons forming the outer surface of the input dataset. This filter operates on any type of data and produces polygonal data as output.




{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''maximum radius factor'''<br>''(MaxRadiusFactor)''
| '''Property'''
|
| '''Description'''
Maximum radius factor for SOD finding.
| '''Default Value(s)'''
| '''Restrictions'''


| 2
|-
|'''Input''' (Input)
|
|
The value must be greater than or equal to 0.


 
This property specifies the input to the Extract Surface filter.
|-
| '''minimum FOF mass'''<br>''(MinFOFMass)''
|
|
Minimum FOF mass to calculate an SOD halo.


| 5e+12
|
|
Accepts input of following types:
* vtkDataSet
|-
|-
| '''minimum FOF size'''<br>''(MinFOFSize)''
|'''PieceInvariant''' (PieceInvariant)
|
Minimum FOF halo size to calculate an SOD halo.
 
| 1000
|
|
The value must be greater than or equal to 0.


If the value of this property is set to 1, internal surfaces along process boundaries will be removed. NOTE: Enabling this option might cause multiple executions of the data source because more information is needed to remove internal surfaces.


|-
| '''minimum radius factor'''<br>''(MinRadiusFactor)''
|
|
Minimum radius factor for SOD finding.
1
 
| 0.5
|
|
The value must be greater than or equal to 0.
Accepts boolean values (0 or 1).
 
 
|-
|-
| '''np (number of seeded particles in one dimension, i.e., total particles = np^3)'''<br>''(NP)''
|'''NonlinearSubdivisionLevel''' (NonlinearSubdivisionLevel)
|
Number of seeded particles in one dimension.  Therefore, total simulation particles is np^3 (cubed).
 
| 256
|
|
The value must be greater than or equal to 0.


If the input is an unstructured grid with nonlinear faces, this
parameter determines how many times the face is subdivided into
linear faces. If 0, the output is the equivalent of its linear
counterpart (and the midpoints determining the nonlinear
interpolation are discarded). If 1, the nonlinear face is
triangulated based on the midpoints. If greater than 1, the
triangulated pieces are recursively subdivided to reach the
desired subdivision. Setting the value to greater than 1 may
cause some point data to not be passed even if no quadratic faces
exist. This option has no effect if the input is not an
unstructured grid.


|-
| '''overlap (shared point/ghost cell gap distance)'''<br>''(Overlap)''
|
|
The space (in rL units) to extend processor particle ownership for ghost particles/cells.  Needed for correct halo calculation when halos cross processor boundaries in parallel computation.
1
 
| 5
|
|
The value must be greater than or equal to 0.




|-
|}
| '''pmin (minimum particle threshold for an FOF halo)'''<br>''(PMin)''
 
|
==FOF/SOD Halo Finder==
Minimum number of particles (threshold) needed before a group is called a friends-of-friends (FOF) halo.


| 100
|
The value must be greater than or equal to 1.




{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''rL (physical box side length)'''<br>''(RL)''
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
This property specifies the input of the filter.
|
|
The box side length used to wrap particles around if they exceed rL (or less than 0) in any dimension (only positive positions are allowed in the input, or they are wrapped around).


| 100
|
|
The value must be greater than or equal to 0.
Accepts input of following types:
 
* vtkUnstructuredGrid
 
|-
|-
| '''rho_c'''<br>''(RhoC)''
|'''rL (physical box side length)''' (RL)
|
|
rho_c (critical density) for SOD halo finding.


| 2.77537e+11
The box side length used to wrap particles around if they exceed rL (or less than 0) in any dimension (only positive positions are allowed in the input, or they are wrapped around).
 
|
|
|-
100
| '''number of bins'''<br>''(SODBins)''
|
|
Number of bins for SOD finding.


| 20
|-
|'''overlap (shared point/ghost cell gap distance)''' (Overlap)
|
|
The value must be greater than or equal to 1.


The space (in rL units) to extend processor particle ownership for ghost particles/cells. Needed for correct halo calculation when halos cross processor boundaries in parallel computation.


|-
| '''initial SOD center'''<br>''(SODCenterType)''
|
|
The initial friends-of-friends (FOF) center used for calculating a spherical overdensity (SOD) halo.  WARNING: Using MBP or MCP can be very slow.
5
|


| 0
|-
|'''np (number of seeded particles in one dimension, i.e., total particles = np^3)''' (NP)
|
|
The value must be one of the following: Center of mass (0), Average position (1), Most bound particle (2), Most connected particle (3).


Number of seeded particles in one dimension. Therefore, total simulation particles is np^3 (cubed).


|-
| '''initial SOD mass'''<br>''(SODMass)''
|
|
The initial SOD mass.
256
|


| 1e+14
|-
|'''bb (linking length)''' (BB)
|
|
The value must be greater than or equal to 0.


Linking length measured in units of interparticle spacing and is dimensionless. Used to link particles into halos for the friends-of-friends (FOF) algorithm.


|}
|
0.20
|


|-
|'''pmin (minimum particle threshold for an FOF halo)''' (PMin)
|


==Feature Edges==
Minimum number of particles (threshold) needed before a group is called a friends-of-friends (FOF) halo.


|
100
|


This filter will extract edges along sharp edges of surfaces or boundaries of surfaces.
The Feature Edges filter extracts various subsets of edges from the input data set. This filter operates on polygonal data and produces polygonal output.<br>
{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Property'''
|'''Copy FOF halo catalog to original particles''' (CopyHaloDataToParticles)
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Boundary Edges'''<br>''(BoundaryEdges)''
|
|
If the value of this property is set to 1, boundary edges will be extracted. Boundary edges are defined as line cells or edges that are used by only one polygon.


| 1
If checked, the friends-of-friends (FOF) halo catalog information will be copied to the original particles as well.
|
Only the values 0 and 1 are accepted.


|-
| '''Coloring'''<br>''(Coloring)''
|
|
If the value of this property is set to 1, then the extracted edges are assigned a scalar value based on the type of the edge.
0
 
| 0
|
|
Only the values 0 and 1 are accepted.
Accepts boolean values (0 or 1).
 
 
|-
|-
| '''Feature Angle'''<br>''(FeatureAngle)''
|'''Compute the most bound particle''' (ComputeMostBoundParticle)
|
|
The value of this property is used to define a feature edge. If the angle between the surface normals of two adjacent triangles is at least as large as this Feature Angle, a feature edge exists. (See the FeatureEdges property.)


| 30
If checked, the most bound particle for an FOF halo will be calculated. WARNING: This can be very slow.
 
|
0
|
Accepts boolean values (0 or 1).
|-
|'''Compute the most connected particle''' (ComputeMostConnectedParticle)
|
|
The value must be greater than or equal to 0 and less than or equal to 180.


If checked, the most connected particle for an FOF halo will be calculated. WARNING: This can be very slow.


|
0
|
Accepts boolean values (0 or 1).
|-
|-
| '''Feature Edges'''<br>''(FeatureEdges)''
|'''Compute spherical overdensity (SOD) halos''' (ComputeSOD)
|
|
If the value of this property is set to 1, feature edges will be extracted. Feature edges are defined as edges that are used by two polygons whose dihedral angle is greater than the feature angle. (See the FeatureAngle property.)
Toggle whether to extract feature edges.


| 1
If checked, spherical overdensity (SOD) halos will be calculated in addition to friends-of-friends (FOF) halos.
 
|
0
|
Accepts boolean values (0 or 1).
|-
|'''initial SOD center''' (SODCenterType)
|
|
Only the values 0 and 1 are accepted.


The initial friends-of-friends (FOF) center used for calculating a spherical overdensity (SOD) halo. WARNING: Using MBP or MCP can be very slow.


|
0
|
The value(s) is an enumeration of the following:
* Center of mass (0)
* Average position (1)
* Most bound particle (2)
* Most connected particle (3)
|-
|-
| '''Input'''<br>''(Input)''
|'''rho_c''' (RhoC)
|
|
This property specifies the input to the Feature Edges filter.
 
rho_c (critical density) for SOD halo finding.


|
|
2.77536627e11
|
|
The selected object must be the result of the following: sources (includes readers), filters.


|-
|'''initial SOD mass''' (SODMass)
|


The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.
The initial SOD mass.


|
1.0e14
|


|-
|-
| '''Manifold Edges'''<br>''(ManifoldEdges)''
|'''minimum radius factor''' (MinRadiusFactor)
|
|
If the value of this property is set to 1, manifold edges will be extracted. Manifold edges are defined as edges that are used by exactly two polygons.


| 0
Minimum radius factor for SOD finding.
 
|
0.5
|
|
Only the values 0 and 1 are accepted.


|-
|-
| '''Non-Manifold Edges'''<br>''(NonManifoldEdges)''
|'''maximum radius factor''' (MaxRadiusFactor)
|
|
If the value of this property is set to 1, non-manifold edges will be extracted. Non-manifold edges are defined as edges that are used by three or more polygons.


| 1
Maximum radius factor for SOD finding.
 
|
2.0
|
|
Only the values 0 and 1 are accepted.


|-
|'''number of bins''' (SODBins)
|


|}
Number of bins for SOD finding.


|
20
|


==Gaussian Resampling==
|-
|'''minimum FOF size''' (MinFOFSize)
|


Minimum FOF halo size to calculate an SOD halo.


|
1000
|

Splat points into a volume with an elliptical, Gaussian distribution.

vtkGaussianSplatter is a filter that injects input points into a
structured points (volume) dataset. As each point is injected, it "splats"
or distributes values to nearby voxels. Data is distributed using an
elliptical, Gaussian distribution function. The distribution function is
modified using scalar values (expands distribution) or normals
(creates ellipsoidal distribution rather than spherical).

Warning: results may be incorrect in parallel as points can't splat
into other processors' cells.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|-
| '''Splat Accumulation Mode'''<br>''(Accumulation Mode)''
|'''minimum FOF mass''' (MinFOFMass)
|
|
Specify the scalar accumulation mode. This mode expresses how scalar values are combined when splats are overlapped. The Max mode acts like a set union operation and is the most commonly used; the Min mode acts like a set intersection, and the sum is just weird.


| 1
Minimum FOF mass to calculate an SOD halo.
 
|
5.0e12
|
|
The value must be one of the following: Min (0), Max (1), Sum (2).




|-
|}
| '''Fill Value'''<br>''(CapValue)''
|
Specify the cap value to use. (This instance variable only has effect if the ivar Capping is on.)


| 0
==Feature Edges==
|
|-
| '''Fill Volume Boundary'''<br>''(Capping)''
|
Turn on/off the capping of the outer boundary of the volume to a specified cap value. This can be used to close surfaces (after iso-surfacing) and create other effects.


| 1
This filter will extract edges along sharp edges of surfaces or boundaries of surfaces.
|
The Feature Edges filter extracts various subsets of edges from the input data set. This filter operates on polygonal data and produces polygonal output.
Only the values 0 and 1 are accepted.




{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Elliptical Eccentricity'''<br>''(Eccentricity)''
| '''Property'''
|
| '''Description'''
Control the shape of elliptical splatting. Eccentricity is the ratio of the major axis (aligned along the normal) to the minor axes (aligned along the other two axes). So Eccentricity > 1 creates needles with the long axis in the direction of the normal; Eccentricity < 1 creates pancakes perpendicular to the normal vector.
| '''Default Value(s)'''
| '''Restrictions'''


| 2.5
|
|-
|-
| '''Gaussian Exponent Factor'''<br>''(ExponentFactor)''
|'''Input''' (Input)
|
|
Set / get the sharpness of decay of the splats. This is the exponent constant in the Gaussian equation. Normally this is a negative value.


| -5
This property specifies the input to the Feature Edges filter.
|
|
The value must be less than or equal to 0.


|
Accepts input of following types:
* vtkPolyData
|-
|-
| '''Input'''<br>''(Input)''
|'''BoundaryEdges''' (BoundaryEdges)
|
|
This property specifies the input to the filter.


If the value of this property is set to 1, boundary edges will be extracted. Boundary edges are defined as line cells or edges that are used by only one polygon.