ParaView/Users Guide/List of filters
[[ParaViewUsersGuide]]


==AMR Connectivity==


Fragment Identification


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the volume input of the
filter.
|

|
Accepts input of following types:
* vtkNonOverlappingAMR
The dataset must contain a field array (cell)

with 1 component(s).

|-
|'''SelectMaterialArrays''' (SelectMaterialArrays)
|
This property specifies the cell arrays from which the
analysis will determine fragments.
|

|
An array of scalars is required.
|-
|'''Volume Fraction Value''' (VolumeFractionSurfaceValue)
|
This property specifies the values at which to compute
the isosurface.
|
0.1
|

|-
|'''Resolve Blocks''' (Resolve Blocks)
|
Resolve the fragments between blocks.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''Propagate Ghosts''' (Propagate Ghosts)
|
Propagate regionIds into the ghosts.
|
0
|
Accepts boolean values (0 or 1).


|}


==AMR Contour==


Iso surface cell array.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input of the
filter.
|

|
Accepts input of following types:
* vtkCompositeDataSet
The dataset must contain a field array (cell)

with 1 component(s).

|-
|'''SelectMaterialArrays''' (SelectMaterialArrays)
|
This property specifies the cell arrays from which the
contour filter will compute contour cells.
|

|
An array of scalars is required.
|-
|'''Volume Fraction Value''' (VolumeFractionSurfaceValue)
|
This property specifies the values at which to compute
the isosurface.
|
0.1
|

|-
|'''Capping''' (Capping)
|
If this property is on, the boundary of the data set
is capped.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''DegenerateCells''' (DegenerateCells)
 
|
|
If this property is on, a transition mesh between levels
is created.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''MultiprocessCommunication''' (MultiprocessCommunication)
|
If this property is off, each process executes
independently.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''SkipGhostCopy''' (SkipGhostCopy)
|
A simple test to see if ghost values are already set
properly.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''Triangulate''' (Triangulate)
|
Use triangles instead of quads on capping
surfaces.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''MergePoints''' (MergePoints)
|
Use more memory to merge points on the boundaries of
blocks.
|
1
|
Accepts boolean values (0 or 1).


|}


==AMR CutPlane==


Planar Cut of an AMR grid dataset. This filter
creates a cut-plane of the AMR grid dataset.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input for this
filter.
|

|
Accepts input of following types:
* vtkOverlappingAMR
|-
|'''UseNativeCutter''' (UseNativeCutter)
|
This property specifies whether ParaView's generic
dataset cutter is used instead of the specialized AMR
cutter.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''LevelOfResolution''' (LevelOfResolution)
|
Set maximum slice resolution.
|
0
|

|-
|'''Center''' (Center)
|

|
0.5 0.5 0.5
|

|-
|'''Normal''' (Normal)
|

|
0 0 1
|


|}


==AMR Dual Clip==


Clip with scalars. Tetrahedra.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input of the
filter.
|

|
Accepts input of following types:
* vtkCompositeDataSet
The dataset must contain a field array (cell)

with 1 component(s).

|-
|'''SelectMaterialArrays''' (SelectMaterialArrays)
|
This property specifies the cell arrays from which the
clip filter will compute clipped cells.
|

|
An array of scalars is required.
|-
|'''Volume Fraction Value''' (VolumeFractionSurfaceValue)
|
This property specifies the values at which to compute
the isosurface.
|
0.1
|

|-
|'''InternalDecimation''' (InternalDecimation)
|
If this property is on, internal tetrahedra are
decimated.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''MultiprocessCommunication''' (MultiprocessCommunication)
|
If this property is off, each process executes
independently.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''MergePoints''' (MergePoints)
|
Use more memory to merge points on the boundaries of
blocks.
|
1
|
Accepts boolean values (0 or 1).


|}


==AMR Fragment Integration==


Fragment Integration


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the volume input of the
filter.
|

|
Accepts input of following types:
* vtkNonOverlappingAMR
The dataset must contain a field array (cell)

with 1 component(s).

|-
|'''SelectMaterialArrays''' (SelectMaterialArrays)
|
This property specifies the cell arrays from which the
analysis will determine fragments.
|

|
An array of scalars is required.
|-
|'''SelectMassArrays''' (SelectMassArrays)
|
This property specifies the cell arrays from which the
analysis will determine fragment mass.
|

|
An array of scalars is required.
|-
|'''SelectVolumeWeightedArrays''' (SelectVolumeWeightedArrays)
|
This property specifies the cell arrays from which the
analysis will determine volume weighted average values.
|

|
An array of scalars is required.
|-
|'''SelectMassWeightedArrays''' (SelectMassWeightedArrays)
|
This property specifies the cell arrays from which the
analysis will determine mass weighted average values.
|

|
An array of scalars is required.


|}


==AMR Fragments Filter==


Meta Fragment filter. Combines the running of
AMRContour, AMRFragmentIntegration, AMRDualContour and ExtractCTHParts.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the volume input of the
filter.
|

|
Accepts input of following types:
* vtkNonOverlappingAMR
The dataset must contain a field array (cell)

with 1 component(s).

|-
|'''SelectMaterialArrays''' (SelectMaterialArrays)
|
This property specifies the cell arrays from which the
analysis will determine fragments.
|

|
An array of scalars is required.
|-
|'''SelectMassArrays''' (SelectMassArrays)
|
This property specifies the cell arrays from which the
analysis will determine fragment mass.
|

|
An array of scalars is required.
|-
|'''SelectVolumeWeightedArrays''' (SelectVolumeWeightedArrays)
|
This property specifies the cell arrays from which the
analysis will determine volume weighted average values.
|

|
An array of scalars is required.
|-
|'''SelectMassWeightedArrays''' (SelectMassWeightedArrays)
|
This property specifies the cell arrays from which the
analysis will determine mass weighted average values.
|

|
An array of scalars is required.
|-
|'''Volume Fraction Value''' (VolumeFractionSurfaceValue)
|
This property specifies the values at which to compute
the isosurface.
|
0.1
|

|-
|'''Extract Surface''' (Extract Surface)
|
Whether or not to extract a surface from this data.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''Use Watertight Surface''' (Use Watertight Surface)
|
Whether the extracted surface should be watertight or not.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''Integrate Fragments''' (Integrate Fragments)
|
Whether or not to integrate fragments in this data.
|
1
|
Accepts boolean values (0 or 1).


|}


==Add Field Arrays==


Reads arrays from a file and adds them to the input data object.
Takes in an input data object and a filename. Opens the file
and adds any arrays it sees there to the input data.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
The input.
|

|


|-
|'''FileName''' (FileName)
|
This property specifies the file to read to get arrays.
|

|
The value(s) must be a filename (or filenames).


|}


==Angular Periodic Filter==


This filter generates a periodic multiblock dataset.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input to the Periodic filter.
|

|
Accepts input of following types:
* vtkDataSet
|-
|'''BlockIndices''' (BlockIndices)
|
This property lists the ids of the blocks to make periodic
from the input multiblock dataset.
|

|

|-
|'''IterationMode''' (IterationMode)
|
This property specifies the mode of iteration, either a user-provided number
of periods, or the maximum number of periods to rotate to 360°.
|
1
|
The value(s) is an enumeration of the following:
* Manual (0)
* Maximum (1)
|-
|'''NumberOfPeriods''' (NumberOfPeriods)
|
This property specifies the number of iterations.
|
3
|

|-
|'''RotationMode''' (RotationMode)
|
This property specifies the mode of rotation, either from a user-provided
angle or from an array in the data.
|
0
|
The value(s) is an enumeration of the following:
* Direct Angle (0)
* Array Value (1)
|-
|'''RotationAngle''' (RotationAngle)
|
Rotation angle in degrees.
|
10
|

|-
|'''RotationArrayName''' (RotationArrayName)
|
Field array name that contains the rotation angle in radians.
|
periodic angle
|

|-
|'''Axis''' (Axis)
|
This property specifies the axis of rotation.
|
0
|
The value(s) is an enumeration of the following:
* Axis X (0)
* Axis Y (1)
* Axis Z (2)
|-
|'''Center''' (Center)
|
This property specifies the 3D coordinates for the
center of the rotation.
|
0.0 0.0 0.0
|


|}
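
The same properties map onto ParaView's Python scripting interface. Below is a minimal, hypothetical sketch using paraview.simple; it assumes the filter is exposed as <code>AngularPeriodicFilter</code> and uses a placeholder file name, so adjust both for your ParaView version and data.

<source lang="python">
from paraview.simple import *

# Hypothetical multiblock input; substitute your own file or source.
reader = OpenDataFile('can.ex2')

# Assumed proxy name for the 'Angular Periodic Filter'; verify with Python tracing.
periodic = AngularPeriodicFilter(Input=reader)
periodic.IterationMode = 'Maximum'      # rotate until a full 360 degrees is covered
periodic.RotationMode = 'Direct Angle'  # use RotationAngle rather than a data array
periodic.RotationAngle = 45.0           # degrees, as described in the table above
periodic.Axis = 'Axis Z'
periodic.Center = [0.0, 0.0, 0.0]

Show(periodic)
Render()
</source>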


==Annotate Attribute Data==


Adds a text annotation to a Render View.
This filter can be used to add a text annotation to a Render View (or
similar) using a tuple from any attribute array (point/cell/field/row
etc.) from a specific rank (when running in parallel). Use **ArrayName**
property to select the array association and array name. Use
**ElementId** property to set the element number to extract the value to
label with. When running on multiple ranks, use **ProcessId** property
to select the rank of interest. The **Prefix** property can be used to
specify a string that will be used as the prefix to the generated
annotation text.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
Set the input of the filter. To avoid the complications/confusion when identifying
elements in a composite dataset, this filter doesn't support composite datasets
currently.
|

|
Accepts input of following types:
* vtkDataSet
* vtkTable
The dataset must contain a field array (any)

with 1 component(s).

|-
|'''ArrayAssociation''' (ArrayAssociation)
|
Select the attribute to use to populate array names from.
|
2
|
The value(s) is an enumeration of the following:
* Point Data (0)
* Cell Data (1)
* Field Data (2)
* Row Data (6)
|-
|'''ArrayName''' (ArrayName)
|
Choose the array that is going to be displayed.
|

|

|-
|'''ElementId''' (ElementId)
|
Set the element index to annotate with.
|
0
|

|-
|'''ProcessId''' (ProcessId)
|
Set the process rank to extract element from.
|
0
|

|-
|'''Prefix''' (Prefix)
|
Text that is used as a prefix to the field value.
|
Value is:
|


|}
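
As a rough illustration of how these properties are used from Python, the sketch below assumes the filter is exposed in paraview.simple as <code>AnnotateAttributeData</code> and relies on the Wavelet source's point array 'RTData'; names may differ in your ParaView version.

<source lang="python">
from paraview.simple import *

wavelet = Wavelet()   # produces a point-centered array named 'RTData'

annot = AnnotateAttributeData(Input=wavelet)   # assumed proxy name
annot.ArrayAssociation = 'Point Data'   # which attribute type to read from
annot.ArrayName = 'RTData'              # array whose tuple is annotated
annot.ElementId = 0                     # tuple index to extract
annot.ProcessId = 0                     # rank of interest when running in parallel
annot.Prefix = 'Value is: '

Show(annot)   # shows up as text in the active Render View
Render()
</source>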


==Annotate Global Data==


Filter for annotating with global data (designed for ExodusII reader).
Annotate Global Data provides a simpler API for creating text
annotations using vtkPythonAnnotationFilter. Instead of users
specifying the annotation expression, this filter determines the
expression based on the array selected by limiting the scope of the
functionality. This filter only allows the user to annotate using
"global-data" aka field data and specify the string prefix to use. If
the field array chosen has as many elements as number of timesteps,
the array is assumed to be "temporal" and indexed using the current
timestep.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
Set the input of the filter.
|

|
Accepts input of following types:
* vtkDataSet
The dataset must contain a field array (none)

with 1 component(s).

|-
|'''SelectArrays''' (SelectArrays)
|
Choose the array that is going to be
displayed.
|

|

|-
|'''Prefix''' (Prefix)
|
Text that is used as a prefix to the field
value.
|
Value is:
|

|-
|'''Suffix''' (Suffix)
|
Text that is used as a suffix to the field
value.
|

|


|}


==Annotate Time Filter==


Shows input data time as text annotation in the view. The Annotate Time
filter can be used to show the data time in a text
annotation.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input dataset for which to
display the time.
|

|

|-
|'''Format''' (Format)
|
The value of this property is a format string used to
display the input time. The format string is specified using printf
style.
|
Time: %f
|

|-
|'''Shift''' (Shift)
|
The amount of time the input is shifted (after
scaling).
|
0.0
|

|-
|'''Scale''' (Scale)
|
The factor by which the input time is
scaled.
|
1.0
|


|}
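
A minimal paraview.simple sketch follows; it assumes a time-aware reader opened from a placeholder file and the proxy name <code>AnnotateTimeFilter</code>, both of which may need adjusting for your setup.

<source lang="python">
from paraview.simple import *

# Hypothetical time-dependent dataset; any reader with time steps will do.
reader = OpenDataFile('can.ex2')

annotate = AnnotateTimeFilter(Input=reader)
annotate.Format = 'Time: %f'   # printf-style format string, per the table above
annotate.Scale = 1.0           # multiply the data time by this factor
annotate.Shift = 0.0           # then add this offset

Show(annotate)
Render()
</source>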


==Append Attributes==


Copies geometry from first input. Puts all of the arrays into the output.
The Append Attributes filter takes multiple input data
sets with the same geometry and merges their point and
cell attributes to produce a single output containing all
the point and cell attributes of the inputs. Any inputs
without the same number of points and cells as the first
input are ignored. The input data sets must already be
collected together, either as a result of a reader that
loads multiple parts (e.g., EnSight reader) or because the
Group Parts filter has been run to form a collection of
data sets.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input to the Append
Attributes filter.
|

|
Accepts input of following types:
* vtkDataSet


|}


==Append Datasets==


Takes an input of multiple datasets and output has only one unstructured grid. The Append
Datasets filter operates on multiple data sets of any type
(polygonal, structured, etc.). It merges their geometry
into a single data set. Only the point and cell attributes
that all of the input data sets have in common will appear
in the output. The input data sets must already be
collected together, either as a result of a reader that
loads multiple parts (e.g., EnSight reader) or because the
Group Parts filter has been run to form a collection of
data sets.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the datasets to be merged into a
single dataset by the Append Datasets filter.
|

|
Accepts input of following types:
* vtkDataSet


|}
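
For example, the following paraview.simple sketch merges two simple sources into one unstructured grid (assuming the filter is exposed as <code>AppendDatasets</code> in your ParaView build):

<source lang="python">
from paraview.simple import *

sphere = Sphere()
cone = Cone()

# Multiple inputs are passed as a list; only attributes common to both survive.
merged = AppendDatasets(Input=[sphere, cone])
merged.UpdatePipeline()

Show(merged)
Render()
</source>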


==Append Geometry==


Takes an input of multiple poly data parts and output has only one part. The Append
Geometry filter operates on multiple polygonal data sets.
It merges their geometry into a single data set. Only the
point and cell attributes that all of the input data sets
have in common will appear in the output.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
Set the input to the Append Geometry
filter.
|

|
Accepts input of following types:
* vtkPolyData


|}


==Block Scalars==


The Level Scalars filter uses colors to show levels of a multiblock dataset. The Level
Scalars filter uses colors to show levels of a multiblock
dataset.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input to the Level Scalars
filter.
|

|
Accepts input of following types:
* vtkMultiBlockDataSet


|}


==CTH Surface==


Not finished yet.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input of the
filter.
|

|
Accepts input of following types:
* vtkCompositeDataSet


|}


==CacheKeeper==


vtkPVCacheKeeper manages data cache for flip book
animations. When caching is disabled, this simply acts as a pass-through
filter. When caching is enabled, if the current time step has been
previously cached then this filter shuts off the update request; otherwise it
propagates the update and then caches the result for later use. The
current time step is set using SetCacheTime().


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
Set the input to the Update Suppressor
filter.
|

|


|-
|'''CacheTime''' (CacheTime)
|

|
0.0
|


|-
|'''CachingEnabled''' (CachingEnabled)
|
Toggle whether the caching is enabled.
|
1
|
Accepts boolean values (0 or 1).


|}


==Calculator==


Compute new attribute arrays as function of existing arrays.
The Calculator filter computes a new data array or new point
The Calculator filter computes a new data array or new point
coordinates as a function of existing scalar or vector arrays. If
point-centered arrays are used in the computation of a new data array,
the resulting array will also be point-centered. Similarly,
computations using cell-centered arrays will produce a new
cell-centered array. If the function is computing point coordinates,
the result of the function must be a three-component vector.


The Calculator interface operates similarly to a scientific
calculator. In creating the function to evaluate, the standard order
of operations applies. Each of the calculator functions is described
below. Unless otherwise noted, enclose the operand in parentheses
using the ( and ) buttons.


- Clear: Erase the current function (displayed in the read-only text
box above the calculator buttons).
- /: Divide one scalar by another. The operands for this function are
not required to be enclosed in parentheses.
- *: Multiply two scalars, or multiply a vector by a scalar (scalar multiple).
The operands for this function are not required to be enclosed in parentheses.
- -: Negate a scalar or vector (unary minus), or subtract one scalar or vector
from another. The operands for this function are not required to be enclosed
in parentheses.
- +: Add two scalars or two vectors. The operands for this function are not
required to be enclosed in parentheses.
- sin: Compute the sine of a scalar. cos: Compute the cosine of a scalar.
- tan: Compute the tangent of a scalar.
- asin: Compute the arcsine of a scalar.
- acos: Compute the arccosine of a scalar.
- atan: Compute the arctangent of a scalar.
- sinh: Compute the hyperbolic sine of a scalar.
- cosh: Compute the hyperbolic cosine of a scalar.
- tanh: Compute the hyperbolic tangent of a scalar.
- min: Compute minimum of two scalars.
- max: Compute maximum of two scalars.
- x^y: Raise one scalar to the power of another scalar. The operands for
this function are not required to be enclosed in parentheses.
- sqrt: Compute the square root of a scalar.
- e^x: Raise e to the power of a scalar.
- log: Compute the logarithm of a scalar (deprecated. same as log10).
- log10: Compute the logarithm of a scalar to the base 10.
- ln: Compute the logarithm of a scalar to the base 'e'.
- ceil: Compute the ceiling of a scalar. floor: Compute the floor of a scalar.
- abs: Compute the absolute value of a scalar.
- v1.v2: Compute the dot product of two vectors. The operands for this
function are not required to be enclosed in parentheses.
- cross: Compute cross product of two vectors.
- mag: Compute the magnitude of a vector.
- norm: Normalize a vector.


The operands are described below. The digits 0 - 9 and the decimal
point are used to enter constant scalar values. **iHat**, **jHat**,
and **kHat** are vector constants representing unit vectors in the X,
Y, and Z directions, respectively. The scalars menu lists the names of
the scalar arrays and the components of the vector arrays of either
the point-centered or cell-centered data. The vectors menu lists the
names of the point-centered or cell-centered vector arrays. The
function will be computed for each point (or cell) using the scalar or
vector value of the array at that point (or cell). The filter operates
on any type of data set, but the input data set must have at least one
scalar or vector array. The arrays can be either point-centered or
cell-centered. The Calculator filter's output is of the same data set
type as the input.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input dataset to the
Calculator filter. The scalar and vector variables may be chosen from
this dataset's arrays.
|
|
Accepts input of following types:
* vtkDataSet
The dataset must contain a field array ()


|-
|'''AttributeMode''' (AttributeMode)
|
This property determines whether the computation is to
be performed on point-centered or cell-centered data.
|
1
|
The value(s) is an enumeration of the following:
* Point Data (1)
* Cell Data (2)
|-
|'''CoordinateResults''' (CoordinateResults)
|
The value of this property determines whether the
results of this computation should be used as point coordinates or as a
new array.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''ResultNormals''' (ResultNormals)
|
Set whether to output results as point/cell
normals. Outputting as normals is only valid with vector
results. Point or cell normals are selected using
AttributeMode.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''ResultTCoords''' (ResultTCoords)
|
Set whether to output results as point/cell
texture coordinates. Point or cell texture coordinates are
selected using AttributeMode. 2-component texture coordinates
cannot be generated at this time.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''ResultArrayName''' (ResultArrayName)
|
This property contains the name for the output array
containing the result of this computation.
|
Result
|
|-
|'''Function''' (Function)
|


This property contains the equation for computing the new
array.


|
|


|-
|'''Replace Invalid Results''' (ReplaceInvalidValues)
|
This property determines whether invalid values in the
computation will be replaced with a specific value. (See the
ReplacementValue property.)
|
1
|
Accepts boolean values (0 or 1).
|-
|'''ReplacementValue''' (ReplacementValue)
|
If invalid values in the computation are to be replaced
with another value, this property contains that value.
|
0.0
|


|}
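
The properties above correspond directly to the Calculator proxy in paraview.simple. A minimal sketch using the Wavelet source's point array 'RTData' follows; property names may vary slightly between ParaView versions.

<source lang="python">
from paraview.simple import *

wavelet = Wavelet()   # provides a point-centered scalar array named 'RTData'

calc = Calculator(Input=wavelet)
calc.AttributeMode = 'Point Data'    # operate on point-centered arrays
calc.ResultArrayName = 'ScaledRT'    # name of the new array
calc.Function = 'RTData*2 + 1'       # standard calculator syntax described above

Show(calc)
Render()
</source>
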
==Cell Centers==


Create a point (no geometry) at the center of each input cell. The Cell Centers
filter places a point at the center of each cell in the
input data set. The center computed is the parametric
center of the cell, not necessarily the geometric or
bounding box center. The cell attributes of the input will
be associated with these newly created points of the
output. You have the option of creating a vertex cell per
point in the output. This is useful because vertex cells
are rendered, but points are not. The points themselves
could be used for placing glyphs (using the Glyph filter).
The Cell Centers filter takes any type of data set as
input and produces a polygonal data set as
output.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input to the Cell Centers
filter.
|
|
Accepts input of following types:
* vtkDataSet
|-
|'''VertexCells''' (VertexCells)
|
If set to 1, a vertex cell will be generated per point
in the output. Otherwise only points will be generated.
|
0
|
Accepts boolean values (0 or 1).


|}
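
A short paraview.simple sketch follows; the resulting polydata is commonly fed to the Glyph filter for placing markers at the cell centers.

<source lang="python">
from paraview.simple import *

wavelet = Wavelet()

centers = CellCenters(Input=wavelet)
centers.VertexCells = 1   # emit a vertex cell per point so the points actually render

Show(centers)
Render()
</source>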


==Cell Data to Point Data==


Create point attributes by averaging cell attributes. The Cell
Data to Point Data filter averages the values of the cell
attributes of the cells surrounding a point to compute
point attributes. The Cell Data to Point Data filter
operates on any type of data set, and the output data set
is of the same type as the input.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input to the Cell Data to
Point Data filter.
|
|


Only the values 0 and 1 are accepted
|
 
Accepts input of following types:
* vtkDataSet
The dataset must contain a field array (cell)


|-
|'''PassCellData''' (PassCellData)
|
If this property is set to 1, then the input cell data
is passed through to the output; otherwise, only the generated point
data will be available in the output.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''PieceInvariant''' (PieceInvariant)
|
If the value of this property is set to 1, this filter
will request ghost levels so that the values at boundary points match
across processes. NOTE: Enabling this option might cause multiple
executions of the data source because more information is needed to
remove internal surfaces.
|
0
|
Accepts boolean values (0 or 1).


|}
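
A minimal paraview.simple sketch, assuming the filter is exposed as <code>CellDatatoPointData</code> (the exact generated name can differ between ParaView versions) and a placeholder file that carries cell-centered arrays:

<source lang="python">
from paraview.simple import *

# Hypothetical input with cell data; substitute your own reader or source.
reader = OpenDataFile('data.vtu')

c2p = CellDatatoPointData(Input=reader)   # assumed proxy name
c2p.PassCellData = 1   # keep the original cell arrays alongside the averaged point arrays

Show(c2p)
Render()
</source>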


==Clean==


Merge coincident points if they do not meet a feature edge criteria. The Clean filter
takes polygonal data as input and generates polygonal data
as output. This filter can merge duplicate points, remove
unused points, and transform degenerate cells into their
appropriate forms (e.g., a triangle is converted into a
line if two of its points are merged).


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
Set the input to the Clean filter.
|

|
Accepts input of following types:
* vtkPolyData
|-
|'''PieceInvariant''' (PieceInvariant)
|
If this property is set to 1, the whole data set will be
processed at once so that cleaning the data set always produces the
same results. If it is set to 0, the data set can be processed one
piece at a time, so it is not necessary for the entire data set to fit
into memory; however the results are not guaranteed to be the same as
they would be if the Piece invariant option was on. Setting this option
to 0 may produce seams in the output dataset when ParaView is run in
parallel.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''Tolerance''' (Tolerance)
|
If merging nearby points (see PointMerging property) and
not using absolute tolerance (see ToleranceIsAbsolute property), this
property specifies the tolerance for performing merging as a fraction
of the length of the diagonal of the bounding box of the input data
set.
|
0.0
|


|-
|'''AbsoluteTolerance''' (AbsoluteTolerance)
|
If merging nearby points (see PointMerging property) and
using absolute tolerance (see ToleranceIsAbsolute property), this
property specifies the tolerance for performing merging in the spatial
units of the input data set.
|
1.0
|
|-
|'''ToleranceIsAbsolute''' (ToleranceIsAbsolute)
|
This property determines whether to use absolute or
relative (a percentage of the bounding box) tolerance when performing
point merging.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''ConvertLinesToPoints''' (ConvertLinesToPoints)
|
If this property is set to 1, degenerate lines (a "line"
whose endpoints are at the same spatial location) will be converted to
points.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''ConvertPolysToLines''' (ConvertPolysToLines)
|
If this property is set to 1, degenerate polygons (a
"polygon" with only two distinct point coordinates) will be converted
to lines.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''ConvertStripsToPolys''' (ConvertStripsToPolys)
|
If this property is set to 1, degenerate triangle strips
(a triangle "strip" containing only one triangle) will be converted to
triangles.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''PointMerging''' (PointMerging)
|
If this property is set to 1, then points will be merged
if they are within the specified Tolerance or AbsoluteTolerance (see
the Tolerance and AbsoluteTolerance properties), depending on the value
of the ToleranceIsAbsolute property. (See the ToleranceIsAbsolute
property.) If this property is set to 0, points will not be
merged.
|
1
|
Accepts boolean values (0 or 1).


|}
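
A minimal paraview.simple sketch for the Clean filter, using relative-tolerance point merging on a simple polygonal source (property defaults are as listed above):

<source lang="python">
from paraview.simple import *

sphere = Sphere()

clean = Clean(Input=sphere)
clean.PointMerging = 1          # merge coincident/nearby points
clean.ToleranceIsAbsolute = 0   # interpret Tolerance as a fraction of the bounding-box diagonal
clean.Tolerance = 0.0001

Show(clean)
Render()
</source>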


==Clean Cells to Grid==


This filter merges cells and converts the data set to unstructured grid. Merges degenerate cells. Assumes
the input grid does not contain duplicate points. You may
want to run vtkCleanUnstructuredGrid first to assert it.
If duplicated cells are found they are removed in the
output. The filter also handles the case, where a cell may
contain degenerate nodes (i.e. one and the same node is
referenced by a cell more than once).


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input to the Clean Cells to
Grid filter.
|

|
Accepts input of following types:
* vtkUnstructuredGrid


|}


==Clean to Grid==


This filter merges points and converts the data set to unstructured grid. The Clean to Grid filter merges
points that are exactly coincident. It also converts the
data set to an unstructured grid. You may wish to do this
if you want to apply a filter to your data set that is
available for unstructured grids but not for the initial
type of your data set (e.g., applying warp vector to
volumetric data). The Clean to Grid filter operates on any
type of data set.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input to the Clean to Grid
filter.
|
|
Accepts input of following types:
* vtkDataSet


|}
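
A minimal paraview.simple sketch, assuming the filter is exposed as <code>CleantoGrid</code>; it converts the structured Wavelet output into an unstructured grid and prints the resulting data set type.

<source lang="python">
from paraview.simple import *

wavelet = Wavelet()   # structured (image) data

grid = CleantoGrid(Input=wavelet)   # assumed proxy name
grid.UpdatePipeline()

# Expect 'vtkUnstructuredGrid' on the output.
print(grid.GetDataInformation().GetDataSetTypeAsString())
</source>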


==ClientServerMoveData==


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
Set the input to the Client Server Move Data
filter.
|

|


|-
|'''OutputDataType''' (OutputDataType)
|

|
0
|


|-
|'''WholeExtent''' (WholeExtent)
|

|
0 -1 0 -1 0 -1
|


|}

==Clip==


Clip with an implicit plane. Clipping does not reduce the dimensionality of the data set. The output data type of this filter is always an unstructured grid. The Clip filter
cuts away a portion of the input data set using an
implicit plane. This filter operates on all types of data
sets, and it returns unstructured grid data on
output.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|'''Input''' (Input)
|
This property specifies the dataset on which the Clip
filter will operate.
|

|
Accepts input of following types:
* vtkDataSet
The dataset must contain a field array ()

with 1 component(s).


|-
|'''Clip Type''' (ClipFunction)
|
This property specifies the parameters of the clip
function (an implicit plane) used to clip the dataset.
|


|
The value can be one of the following:
* Plane (implicit_functions)
* Box (implicit_functions)
* Sphere (implicit_functions)
* Cylinder (implicit_functions)
* Scalar (implicit_functions)

|-
|'''InputBounds''' (InputBounds)
|

|

|
|-
|'''Scalars''' (SelectInputScalars)
|
If clipping with scalars, this property specifies the
name of the scalar array on which to perform the clip
operation.
|

|
An array of scalars is required. The value must be a field array name.
|-
|'''Value''' (Value)
|
If clipping with scalars, this property sets the scalar
value about which to clip the dataset based on the scalar array chosen.
(See SelectInputScalars.) If clipping with a clip function, this
property specifies an offset from the clip function to use in the
clipping operation. Neither functionality is currently available in
ParaView's user interface.
|
0.0
|
The value must lie within the range of the selected data array.
|-
|'''InsideOut''' (InsideOut)
|
If this property is set to 0, the clip filter will
return that portion of the dataset that lies within the clip function.
If set to 1, the portions of the dataset that lie outside the clip
function will be returned instead.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''UseValueAsOffset''' (UseValueAsOffset)
|
If UseValueAsOffset is true, Value is used as an offset
parameter to the implicit function. Otherwise, Value is used only when
clipping using a scalar array.
|
0
|
Accepts boolean values (0 or 1).
|-
Generate clipping outlines in the output wherever an input face is cut by the clipping plane
|'''Crinkle clip''' (PreserveInputCells)
 
|
|
 
This parameter controls whether to extract entire cells
Only the values 0 and 1 are accepted
in the given region or clip those cells so all of the output one stay
 
only inside that region.
 
|
|
| '''Input'''<br>''(Input)'
0
 
This property specifies the dataset on which the Clip filter will operate
 
|
|
Accepts boolean values (0 or 1).


The selected object must be the result of the following: sources (includes readers), filters
|}
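The Clip filter can also be driven from pvpython. The following is a minimal sketch (not part of the generated property reference) using paraview.simple; the Wavelet source and its RTData array are used only as example input, and the Python property names are assumed to mirror the table above.

<source lang="python">
from paraview.simple import Wavelet, Clip

src = Wavelet()                          # example input; any source or reader works
clip = Clip(Input=src)                   # Clip Type defaults to an implicit plane
clip.ClipType.Origin = [0.0, 0.0, 0.0]   # plane parameters
clip.ClipType.Normal = [1.0, 0.0, 0.0]
clip.InsideOut = 0                       # keep the portion inside the plane

# Clipping by a scalar array instead of an implicit function:
clip.ClipType = 'Scalar'
clip.Scalars = ['POINTS', 'RTData']      # point scalar produced by Wavelet
clip.Value = 150.0                       # must lie within the array's range
clip.UpdatePipeline()
</source>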


==Clip Closed Surface==


Clip a polygonal dataset with a plane to produce closed surfaces
This clip filter cuts away a portion of the input polygonal dataset using
a plane to generate a new polygonal dataset.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the dataset on which the Clip
filter will operate.
|

|
Accepts input of following types:
* vtkPolyData
The dataset must contain a field array (point)

with 1 component(s).


|-
|'''Clipping Plane''' (ClippingPlane)
|
This property specifies the parameters of the clipping
plane used to clip the polygonal data.
|


|
The value can be one of the following:
* Plane (implicit_functions)


|-
|'''GenerateFaces''' (GenerateFaces)
|
Generate polygonal faces in the output.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''GenerateOutline''' (GenerateOutline)
|
Generate clipping outlines in the output wherever an
input face is cut by the clipping plane.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''Generate Cell Origins''' (ScalarMode)
|
Generate (cell) data for coloring purposes such that the
newly generated cells (including capping faces and clipping outlines)
can be distinguished from the input cells.
|
0
|
The value(s) is an enumeration of the following:
* None (0)
* Color (1)
* Label (2)
|-
|'''InsideOut''' (InsideOut)
|
If this flag is turned off, the clipper will return the
portion of the data that lies within the clipping plane. Otherwise, the
clipper will return the portion of the data that lies outside the
clipping plane.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''Clipping Tolerance''' (Tolerance)
|
Specify the tolerance for creating new points. A small
value might incur degenerate triangles.
|
0.000001
|
|


|-
|'''Base Color''' (BaseColor)
|
Specify the color for the faces from the
input.
|
0.10 0.10 1.00
|


|-
|'''Clip Color''' (ClipColor)
|
Specify the color for the capping faces (generated on
the clipping interface).
|
1.00 0.11 0.10
|




|}
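A minimal pvpython sketch of this filter is shown below, assuming the paraview.simple proxy is named ClipClosedSurface and that its Python property names mirror the table above.

<source lang="python">
from paraview.simple import Sphere, ClipClosedSurface

surface = Sphere()                               # any vtkPolyData source
closed = ClipClosedSurface(Input=surface)
closed.ClippingPlane.Origin = [0.0, 0.0, 0.0]
closed.ClippingPlane.Normal = [0.0, 0.0, 1.0]
closed.GenerateFaces = 1      # keep polygonal faces in the output
closed.GenerateOutline = 0    # no outlines where input faces are cut
closed.InsideOut = 0
closed.UpdatePipeline()
</source>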


CellDerivatives is a filter that computes derivatives of scalars and vectors at the center of cells. You can choose to generate different output including the scalar gradient (a vector), computed tensor vorticity (a vector), gradient of input vectors (a tensor), and strain matrix of the input vectors (a tensor); or you may choose to pass data through to the output.<br
==Clip Generic Dataset==


Clip with an implicit plane, sphere or with scalars. Clipping does not reduce the dimensionality of the data set. The output data type of this filter is always an unstructured grid.
The Generic Clip filter cuts away a portion of the input
data set using a plane, a sphere, a box, or a scalar
value. The menu in the Clip Function portion of the
interface allows the user to select which implicit
function to use or whether to clip using a scalar value.
Making this selection loads the appropriate user
interface. For the implicit functions, the appropriate 3D
widget (plane, sphere, or box) is also displayed. The use
of these 3D widgets, including their user interface
components, is discussed in section 7.4. If an implicit
function is selected, the clip filter returns that portion
of the input data set that lies inside the function. If
Scalars is selected, then the user must specify a scalar
array to clip according to. The clip filter will return
the portions of the data set whose value in the selected
Scalars array is larger than the Clip value. Regardless of
the selection from the Clip Function menu, if the Inside
Out option is checked, the opposite portions of the data
set will be returned. This filter operates on all types of
data sets, and it returns unstructured grid data on
output.
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
 
|-
|'''Input''' (Input)
|
Set the input to the Generic Clip
filter.
|

|
Accepts input of following types:
* vtkGenericDataSet
The dataset must contain a field array (point)

with 1 component(s).


|-
|'''Clip Type''' (ClipFunction)
|
Set the parameters of the clip function.
|

|
The value can be one of the following:
* Plane (implicit_functions)


* Box (implicit_functions)


* Sphere (implicit_functions)


* Scalar (implicit_functions)


|-
|'''InputBounds''' (InputBounds)
|

|

|

|-
 
|'''Scalars''' (SelectInputScalars)
|
If clipping with scalars, this property specifies the
name of the scalar array on which to perform the clip
operation.
|


|
An array of scalars is required.The value must be field array name.
|-
|'''InsideOut''' (InsideOut)
|
Choose which portion of the dataset should be clipped
away.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''Value''' (Value)
|
If clipping with a scalar array, choose the clipping
value.
|
0.0
|
The value must lie within the range of the selected data array.


|}


==Color By Array==


This filter generates color-mapped image data based on a selected data scalar.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|

|

|
Accepts input of following types:
* vtkImageData
The dataset must contain a field array (point)

with 1 component(s).


|-
|'''LookupTable''' (LookupTable)
|


|
|


|-
|'''Color By''' (SelectInputScalars)
|
This property specifies the name of the scalar array
from which we will color by.
|
|


An array of scalars is required.The value must be field array name.
|-
|'''RGBA NaN Color''' (NaNColor)
|


|
0 0 0 255
|


|-
|'''OutputFormat''' (OutputFormat)
|
|


|
3
|
The value(s) is an enumeration of the following:
* Luminance (1)
* Luminance Alpha (2)
* RGB (3)
* RGBA (4)


|}


==Compute Derivatives==


This filter computes derivatives of scalars and vectors.
CellDerivatives is a filter that computes derivatives of
scalars and vectors at the center of cells. You can choose
to generate different output including the scalar gradient
(a vector), computed tensor vorticity (a vector), gradient
of input vectors (a tensor), and strain matrix of the
input vectors (a tensor); or you may choose to pass data
through to the output.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input to the
filter.
|

|
Accepts input of following types:
* vtkDataSet
The dataset must contain a field array (point)

with 1 component(s).

The dataset must contain a field array (point)

with 3 component(s).


|-
|'''Scalars''' (SelectInputScalars)
|
This property indicates the name of the scalar array to
differentiate.
|


|
|
| '''Extraction Mode'''<br>''(ExtractionMode)'
An array of scalars is required.
|-
|'''Vectors''' (SelectInputVectors)
|
This property indicates the name of the vector array to
differentiate.
|
1
|
An array of vectors is required.
|-
|'''OutputVectorType''' (OutputVectorType)
|
This property controls how the filter works to generate
vector cell data. You can choose to compute the gradient of the input
scalars, or extract the vorticity of the computed vector gradient
tensor. By default, the filter will take the gradient of the input
scalar data.
|
1
|
The value(s) is an enumeration of the following:
* Nothing (0)
* Scalar Gradient (1)
* Vorticity (2)
|-
|'''OutputTensorType''' (OutputTensorType)
|
This property controls how the filter works to generate
tensor cell data. You can choose to compute the gradient of the input
vectors, or compute the strain tensor of the vector gradient tensor. By
default, the filter will take the gradient of the vector data to
construct a tensor.
|
1
|
The value(s) is an enumeration of the following:
* Nothing (0)
* Vector Gradient (1)
* Strain (2)


|}
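A minimal pvpython sketch, assuming the filter is exposed in paraview.simple as ComputeDerivatives and that the enumeration strings match the table above:

<source lang="python">
from paraview.simple import Wavelet, ComputeDerivatives

src = Wavelet()                               # provides the RTData point scalars
deriv = ComputeDerivatives(Input=src)
deriv.Scalars = ['POINTS', 'RTData']          # scalar array to differentiate
deriv.OutputVectorType = 'Scalar Gradient'    # cell vectors = gradient of the scalars
deriv.OutputTensorType = 'Nothing'            # skip the tensor output
deriv.UpdatePipeline()
</source>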


==Compute Quartiles==


Compute the quartiles table from a dataset or table.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input to the
filter.
|

|
Accepts input of following types:
* vtkDataObject


|}


==Connectivity==


Mark connected components with integer point attribute array.The Connectivity
filter assigns a region id to connected components of the
input data set. (The region id is assigned as a point
scalar value.) This filter takes any data set type as
input and produces unstructured grid
output.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input to the Connectivity
filter.
|
|


|
Accepts input of following types:
* vtkDataSet
|-
|'''ExtractionMode''' (ExtractionMode)
|
Controls the extraction of connected
surfaces.
|
5
|
The value(s) is an enumeration of the following:
* Extract Point Seeded Regions (1)
* Extract Cell Seeded Regions (2)
* Extract Specified Regions (3)
* Extract Largest Region (4)
* Extract All Regions (5)
* Extract Closest Point Region (6)
|-
|'''ColorRegions''' (ColorRegions)
|
Controls the coloring of the connected
regions.
|
1
|
Accepts boolean values (0 or 1).


|}
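A minimal pvpython sketch (the file name is hypothetical; the enumeration string follows the table above):

<source lang="python">
from paraview.simple import OpenDataFile, Connectivity

data = OpenDataFile('input.vtu')             # hypothetical input file
conn = Connectivity(Input=data)
conn.ExtractionMode = 'Extract All Regions'  # default mode (5)
conn.ColorRegions = 1                        # add a RegionId point scalar
conn.UpdatePipeline()
</source>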


==Contingency Statistics==


Compute a statistical model of a dataset and/or assess the dataset with a statistical model.
This filter either computes a statistical model of a dataset or takes
such a model as its second input. Then, the model (however it is
obtained) may optionally be used to assess the input dataset. This filter
computes contingency tables between pairs of attributes. This result is a
tabular bivariate probability distribution which serves as a
Bayesian-style prior model. Data is assessed by computing
* the probability of observing both variables simultaneously;
* the probability of each variable conditioned on the other (the two values need not be identical); and
* the pointwise mutual information (PMI).
Finally, the summary statistics include the information entropy of the observations.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
The input to the filter. Arrays from this dataset will
be used for computing statistics and/or assessed by a statistical
model.
|

|
Accepts input of following types:
* vtkImageData
* vtkStructuredGrid
* vtkPolyData
* vtkUnstructuredGrid
* vtkTable
* vtkGraph
The dataset must contain a field array ()


|-
|'''ModelInput''' (ModelInput)
|
A previously-calculated model with which to assess a
separate dataset. This input is optional.
|
|


Accepts input of following types:
* vtkTable
* vtkMultiBlockDataSet
|-
|'''AttributeMode''' (AttributeMode)
|
Specify which type of field data the arrays will be
drawn from.
|
0
|
The value must be field array name.
|-
|'''Variables of Interest''' (SelectArrays)
|
Choose arrays whose entries will be used to form
observations for statistical analysis.
|


|


|-
|'''Task''' (Task)
|
Specify the task to be performed: modeling and/or
assessment.
# "Detailed model of input data," creates a set of output tables containing a calculated statistical model of the '''entire''' input dataset;
# "Model a subset of the data," creates an output table (or tables) summarizing a '''randomly-chosen subset''' of the input dataset;
# "Assess the data with a model," adds attributes to the first input dataset using a model provided on the second input port; and
# "Model and assess the same data," is really just operations 2 and 3 above applied to the same input dataset. The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.
When the task includes creating a model (i.e., tasks 2 and 4), you may
adjust the fraction of the input dataset used for training. You should
avoid using a large fraction of the input data for training as you will
then not be able to detect overfitting. The ''Training fraction''
setting will be ignored for tasks 1 and 3.
|
3
|
The value(s) is an enumeration of the following:
* Detailed model of input data (0)
* Model a subset of the data (1)
* Assess the data with a model (2)
* Model and assess the same data (3)
|-
|'''TrainingFraction''' (TrainingFraction)
|
Specify the fraction of values from the input dataset to
be used for model fitting. The exact set of values is chosen at random
from the dataset.
|
0.1
|


|}


==Contour==


Generate isolines or isosurfaces using point scalars.The Contour
filter computes isolines or isosurfaces using a selected
point-centered scalar array. The Contour filter operates
on any type of data set, but the input is required to have
at least one point-centered scalar (single-component)
array. The output of this filter is
polygonal.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input dataset to be used by
the contour filter.
|

|
Accepts input of following types:
* vtkDataSet
The dataset must contain a field array (point)

with 1 component(s).


|-
|'''Contour By''' (SelectInputScalars)
|
This property specifies the name of the scalar array
from which the contour filter will compute isolines and/or
isosurfaces.
|
|
An array of scalars is required.The value must be field array name.
|-
|'''ComputeNormals''' (ComputeNormals)
|
If this property is set to 1, a scalar array containing
a normal value at each point in the isosurface or isoline will be
created by the contour filter; otherwise an array of normals will not
be computed. This operation is fairly expensive both in terms of
computation time and memory required, so if the output dataset produced
by the contour filter will be processed by filters that modify the
dataset's topology or geometry, it may be wise to set the value of this
property to 0. Select whether to compute normals.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''ComputeGradients''' (ComputeGradients)
|
If this property is set to 1, a scalar array containing
a gradient value at each point in the isosurface or isoline will be
created by this filter; otherwise an array of gradients will not be
computed. This operation is fairly expensive both in terms of
computation time and memory required, so if the output dataset produced
by the contour filter will be processed by filters that modify the
dataset's topology or geometry, it may be wise to set the value of this
property to 0. Note that if ComputeNormals is set to 1, then gradients
will have to be calculated, but they will only be stored in the output
dataset if ComputeGradients is also set to 1.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''ComputeScalars''' (ComputeScalars)
|
If this property is set to 1, an array of scalars
(containing the contour value) will be added to the output dataset. If
set to 0, the output will not contain this array.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''OutputPointsPrecision''' (OutputPointsPrecision)
|


Select the output precision of the coordinates. **Single** sets the
output to single-precision floating-point (i.e., float), **Double**
sets it to double-precision floating-point (i.e., double), and
**Default** sets it to the same precision as the precision of the
points in the input. Defaults to ***Single***.


|
0
|
The value(s) is an enumeration of the following:
* Single (0)
* Double (1)
* Same as input (2)
|-
|'''GenerateTriangles''' (GenerateTriangles)
|
This parameter controls whether to produce triangles in the output.
Warning: Many filters do not properly handle non-triangular polygons.


|
1
|
Accepts boolean values (0 or 1).
|-
|'''Isosurfaces''' (ContourValues)
|
This property specifies the values at which to compute
isosurfaces/isolines and also the number of such
values.
|
|
The value must lie within the range of the selected data array.
|-
|'''Point Merge Method''' (Locator)
|
This property specifies an incremental point locator for
merging duplicate / coincident points.
|


|
The value can be one of the following:
* MergePoints (incremental_point_locators)


* IncrementalOctreeMergePoints (incremental_point_locators)


* NonMergingPointLocator (incremental_point_locators)


|}
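A minimal pvpython sketch using the Wavelet source as example input; its RTData point scalars are contoured at three values within the array's range:

<source lang="python">
from paraview.simple import Wavelet, Contour

src = Wavelet()
iso = Contour(Input=src)
iso.ContourBy = ['POINTS', 'RTData']     # Contour By (SelectInputScalars)
iso.Isosurfaces = [100.0, 150.0, 200.0]  # values within the array's range
iso.ComputeNormals = 1
iso.ComputeScalars = 0
iso.UpdatePipeline()
</source>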


==Contour Generic Dataset==


Generate isolines or isosurfaces using point scalars.The Generic
Contour filter computes isolines or isosurfaces using a
selected point-centered scalar array. The available scalar
arrays are listed in the Scalars menu. The scalar range of
the selected array will be displayed. The interface for
adding contour values is very similar to the one for
selecting cut offsets (in the Cut filter). To add a single
contour value, select the value from the New Value slider
in the Add value portion of the interface and click the
Add button, or press Enter. To instead add several evenly
spaced contours, use the controls in the Generate range of
values section. Select the number of contour values to
generate using the Number of Values slider. The Range
slider controls the interval in which to generate the
contour values. Once the number of values and range have
been selected, click the Generate button. The new values
will be added to the Contour Values list. To delete a
value from the Contour Values list, select the value and
click the Delete button. (If no value is selected, the
last value in the list will be removed.) Clicking the
Delete All button removes all the values in the list. If
no values are in the Contour Values list when Accept is
pressed, the current value of the New Value slider will be
used. In addition to selecting contour values, you can
also select additional computations to perform. If any of
Compute Normals, Compute Gradients, or Compute Scalars is
selected, the appropriate computation will be performed,
and a corresponding point-centered array will be added to
the output. The Generic Contour filter operates on a
generic data set, but the input is required to have at
least one point-centered scalar (single-component) array.
The output of this filter is polygonal.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
Set the input to the Generic Contour
filter.
|
|


Accepts input of following types:
* vtkGenericDataSet
The dataset must contain a field array (point)


with 1 component(s).


|-
|'''Contour By''' (SelectInputScalars)
|
This property specifies the name of the scalar array
from which the contour filter will compute isolines and/or
isosurfaces.
|
|
An array of scalars is required.The value must be field array name.
|-
|'''ComputeNormals''' (ComputeNormals)
|
Select whether to compute normals.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''ComputeGradients''' (ComputeGradients)
|
Select whether to compute gradients.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''ComputeScalars''' (ComputeScalars)
|
Select whether to compute scalars.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''Isosurfaces''' (ContourValues)
|
This property specifies the values at which to compute
isosurfaces/isolines and also the number of such
values.
|


|
The value must lie within the range of the selected data array.
|-
|'''Point Merge Method''' (Locator)
|
This property specifies an incremental point locator for
merging duplicate / coincident points.
|


|
The value can be one of the following:
* MergePoints (incremental_point_locators)


* IncrementalOctreeMergePoints (incremental_point_locators)


* NonMergingPointLocator (incremental_point_locators)




|}


==Convert AMR dataset to Multi-block==


Convert AMR to Multiblock


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input for this
filter.
|

|
Accepts input of following types:
* vtkOverlappingAMR


|}


==ConvertSelection==


Converts a selection from one type to
another.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''DataInput''' (DataInput)
|
Set the vtkDataObject input used to convert the
selection.
|
|


 
Accepts input of following types:
* vtkDataObject
|-
|'''Input''' (Input)
|
Set the selection to convert.
|


|
Accepts input of following types:
* vtkSelection
|-
|'''OutputType''' (OutputType)
|
Set the ContentType for the output.
|
5
|
The value(s) is an enumeration of the following:
* SELECTIONS (0)
* GLOBALIDs (1)
* PEDIGREEIDS (2)
* VALUES (3)
* INDICES (4)
* FRUSTUM (5)
* LOCATION (6)
* THRESHOLDS (7)
|-
|'''ArrayNames''' (ArrayNames)
|


|
|


|-
|'''MatchAnyValues''' (MatchAnyValues)
|

|
0
|
Accepts boolean values (0 or 1).


|}


==Crop==


Efficiently extract an area/volume of interest from a 2-d image or 3-d volume.The Crop filter
extracts an area/volume of interest from a 2D image or a
3D volume by allowing the user to specify the minimum and
maximum extents of each dimension of the data. Both the
input and output of this filter are uniform rectilinear
data.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
 
|'''Input''' (Input)
|
This property specifies the input to the Crop
filter.
|


|
Accepts input of following types:
* vtkImageData
|-
 
|'''OutputWholeExtent''' (OutputWholeExtent)
|
This property gives the minimum and maximum point index
(extent) in each dimension for the output dataset.
|
0 0 0 0 0 0
|
|
The value(s) must lie within the structured-extents of the input dataset.


|}


==Curvature==


This filter will compute the Gaussian or mean curvature of the mesh at each point.The
Curvature filter computes the curvature at each point in a
polygonal data set. This filter supports both Gaussian and
mean curvatures; the type can be selected from the
Curvature type menu button.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input to the Curvature
filter.
|

|
Accepts input of following types:
* vtkPolyData
|-
|'''InvertMeanCurvature''' (InvertMeanCurvature)
|
If this property is set to 1, the mean curvature
calculation will be inverted. This is useful for meshes with
inward-pointing normals.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''CurvatureType''' (CurvatureType)
|
This property specifies which type of curvature to
compute.
|
0
|
The value(s) is an enumeration of the following:
* Gaussian (0)
* Mean (1)


|}
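A minimal pvpython sketch (the Python property names are assumed to match the table above):

<source lang="python">
from paraview.simple import Sphere, Curvature

mesh = Sphere(ThetaResolution=64, PhiResolution=64)  # a polygonal surface
curv = Curvature(Input=mesh)
curv.CurvatureType = 'Mean'        # or 'Gaussian'
curv.InvertMeanCurvature = 0
curv.UpdatePipeline()
</source>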


==D3==


Repartition a data set into load-balanced spatially convex regions. Create ghost cells if requested.The D3 filter is
available when ParaView is run in parallel. It operates on
any type of data set to evenly divide it across the
processors into spatially contiguous regions. The output
of this filter is of type unstructured
grid.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input to the D3
filter.
|
|


|
Accepts input of following types:
* vtkDataSet
|-
|'''BoundaryMode''' (BoundaryMode)
|
This property determines how cells that lie on processor
boundaries are handled. The "Assign cells uniquely" option assigns each
boundary cell to exactly one process, which is useful for isosurfacing.
Selecting "Duplicate cells" causes the cells on the boundaries to be
copied to each process that shares that boundary. The "Divide cells"
option breaks cells across process boundary lines so that pieces of the
cell lie in different processes. This option is useful for volume
rendering.
|
0
|
The value(s) is an enumeration of the following:
* Assign cells uniquely (0)
* Duplicate cells (1)
* Divide cells (2)
|-
|'''Minimal Memory''' (UseMinimalMemory)
|
If this property is set to 1, the D3 filter requires
communication routines to use less memory than they would
without this restriction.
|
0
|
Accepts boolean values (0 or 1).


|}
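A minimal pvpython sketch; D3 only has an effect when pvserver runs with several MPI ranks, but the script itself is the same (the enumeration string is taken from the table above):

<source lang="python">
from paraview.simple import Wavelet, D3

src = Wavelet()
repart = D3(Input=src)
repart.BoundaryMode = 'Assign cells uniquely'  # see the BoundaryMode enumeration
repart.UpdatePipeline()                        # output is an unstructured grid
</source>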


==Decimate==


Simplify a polygonal model using an adaptive edge collapse algorithm. This filter works with triangles only.
The Decimate filter reduces the number of triangles in a
polygonal data set. Because this filter only operates on
triangles, first run the Triangulate filter on a dataset
that contains polygons other than
triangles.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


 
|-
 
|'''Input''' (Input)
|
This property specifies the input to the Decimate
filter.
|

|
Accepts input of following types:
* vtkPolyData
|-
|'''TargetReduction''' (TargetReduction)
|
This property specifies the desired reduction in the
total number of polygons in the output dataset. For example, if the
TargetReduction value is 0.9, the Decimate filter will attempt to
produce an output dataset that is 10% the size of the
input.
|
0.9
|


|-
|'''PreserveTopology''' (PreserveTopology)
|
If this property is set to 1, decimation will not split
the dataset or produce holes, but it may keep the filter from reaching
the reduction target. If it is set to 0, better reduction can occur
(reaching the reduction target), but holes in the model may be
produced.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''FeatureAngle''' (FeatureAngle)
|
The value of this property is used in determining where
the data set may be split. If the angle between two adjacent triangles
is greater than or equal to the FeatureAngle value, then their boundary
is considered a feature edge where the dataset can be
split.
|
15.0
|


|-
|'''BoundaryVertexDeletion''' (BoundaryVertexDeletion)
|
If this property is set to 1, then vertices on the
boundary of the dataset can be removed. Setting the value of this
property to 0 preserves the boundary of the dataset, but it may cause
the filter not to reach its reduction target.
|
1
|
Accepts boolean values (0 or 1).


|}
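A minimal pvpython sketch; Triangulate is applied first because, as noted above, Decimate only operates on triangles:

<source lang="python">
from paraview.simple import Sphere, Triangulate, Decimate

mesh = Sphere(ThetaResolution=128, PhiResolution=128)
tris = Triangulate(Input=mesh)     # ensure an all-triangle mesh
dec = Decimate(Input=tris)
dec.TargetReduction = 0.9          # aim for ~10% of the input triangles
dec.PreserveTopology = 0
dec.FeatureAngle = 15.0
dec.BoundaryVertexDeletion = 1
dec.UpdatePipeline()
</source>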


|
==Delaunay 2D==


Create 2D Delaunay triangulation of input points. It expects a vtkPointSet as input and produces vtkPolyData as output. The points are expected to be in a mostly planar distribution.
Delaunay2D is a filter that constructs a 2D Delaunay
triangulation from a list of input points. These points
may be represented by any dataset of type vtkPointSet and
subclasses. The output of the filter is a polygonal
dataset containing a triangle mesh. The 2D Delaunay
triangulation is defined as the triangulation that
satisfies the Delaunay criterion for n-dimensional
simplexes (in this case n=2 and the simplexes are
triangles). This criterion states that a circumsphere of
each simplex in a triangulation contains only the n+1
defining points of the simplex. In two dimensions, this
translates into an optimal triangulation. That is, the
maximum interior angle of any triangle is less than or
equal to that of any possible triangulation. Delaunay
triangulations are used to build topological structures
from unorganized (or unstructured) points. The input to
this filter is a list of points specified in 3D, even
though the triangulation is 2D. Thus the triangulation is
constructed in the x-y plane, and the z coordinate is
ignored (although carried through to the output). You can
use the option ProjectionPlaneMode in order to compute the
best-fitting plane to the set of points, project the
points and that plane and then perform the triangulation
using their projected positions and then use it as the
plane in which the triangulation is performed. The
Delaunay triangulation can be numerically sensitive in
some cases. To prevent problems, try to avoid injecting
points that will result in triangles with bad aspect
ratios (1000:1 or greater). In practice this means
inserting points that are "widely dispersed", and enables
smooth transition of triangle sizes throughout the mesh.
(You may even want to add extra points to create a better
point distribution.) If numerical problems are present,
you will see a warning message to this effect at the end
of the triangulation process. Warning: Points arranged on
a regular lattice (termed degenerate cases) can be
triangulated in more than one way (at least according to
the Delaunay criterion). The choice of triangulation (as
implemented by this algorithm) depends on the order of the
input points. The first three points will form a triangle;
other degenerate points will not break this triangle.
Points that are coincident (or nearly so) may be discarded
by the algorithm. This is because the Delaunay
triangulation requires unique input points. The output of
the Delaunay triangulation is supposedly a convex hull. In
certain cases this implementation may not generate the
convex hull.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input dataset to the
Delaunay 2D filter.
|

|
Accepts input of following types:
* vtkPointSet
 
|-
 
|'''ProjectionPlaneMode''' (ProjectionPlaneMode)
|
This property determines type of projection plane to use
in performing the triangulation.
|
0
|
The value(s) is an enumeration of the following:
* XY Plane (0)
* Best-Fitting Plane (2)
|-
|'''Alpha''' (Alpha)
|
The value of this property controls the output of this
filter. For a non-zero alpha value, only edges or triangles contained
within a sphere centered at mesh vertices will be output. Otherwise,
only triangles will be output.
|
0.0
|


|-
|'''Tolerance''' (Tolerance)
|
This property specifies a tolerance to control
discarding of closely spaced points. This tolerance is specified as a
fraction of the diagonal length of the bounding box of the
points.
|
0.00001
|


|-
 
|'''Offset''' (Offset)
 
|
This property is a multiplier to control the size of the
initial, bounding Delaunay triangulation.
|
1.0
|


|-
 
|'''BoundingTriangulation''' (BoundingTriangulation)
 
|
If this property is set to 1, bounding triangulation
points (and associated triangles) are included in the output. These are
introduced as an initial triangulation to begin the triangulation
process. This feature is nice for debugging output.
|
0
|
Accepts boolean values (0 or 1).

|}
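A minimal pvpython sketch (the input file is hypothetical; any vtkPointSet with a mostly planar point distribution works):

<source lang="python">
from paraview.simple import OpenDataFile, Delaunay2D

points = OpenDataFile('scattered_points.vtp')   # hypothetical point set
tri = Delaunay2D(Input=points)
tri.ProjectionPlaneMode = 'Best-Fitting Plane'  # project points before triangulating
tri.Alpha = 0.0                                 # 0 keeps every triangle
tri.Tolerance = 0.00001
tri.UpdatePipeline()
</source>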


==Delaunay 3D==


Create a 3D Delaunay triangulation of input points. It expects a vtkPointSet as input and produces vtkUnstructuredGrid as output.Delaunay3D is a filter that constructs
a 3D Delaunay triangulation from a list of input points. These points may be
represented by any dataset of type vtkPointSet and subclasses. The output of
the filter is an unstructured grid dataset. Usually the output is a tetrahedral
mesh, but if a non-zero alpha distance value is specified (called the "alpha"
value), then only tetrahedra, triangles, edges, and vertices lying within the
alpha radius are output. In other words, non-zero alpha values may result in
arbitrary combinations of tetrahedra, triangles, lines, and vertices. (The
notion of alpha value is derived from Edelsbrunner's work on "alpha shapes".)
The 3D Delaunay triangulation is defined as the triangulation that satisfies
the Delaunay criterion for n-dimensional simplexes (in this case n=3 and the
simplexes are tetrahedra). This criterion states that a circumsphere of each
simplex in a triangulation contains only the n+1 defining points of the
simplex. (See text for more information.) While in two dimensions this
translates into an "optimal" triangulation, this is not true in 3D, since a
measurement for optimality in 3D is not agreed on. Delaunay triangulations are
used to build topological structures from unorganized (or unstructured) points.
The input to this filter is a list of points specified in 3D. (If you wish to
create 2D triangulations see Delaunay2D.) The output is an unstructured grid.
The Delaunay triangulation can be numerically sensitive. To prevent problems,
try to avoid injecting points that will result in triangles with bad aspect
ratios (1000:1 or greater). In practice this means inserting points that are
"widely dispersed", and enables smooth transition of triangle sizes throughout
the mesh. (You may even want to add extra points to create a better point
distribution.) If numerical problems are present, you will see a warning
message to this effect at the end of the triangulation process. Warning: Points
arranged on a regular lattice (termed degenerate cases) can be triangulated in
more than one way (at least according to the Delaunay criterion). The choice of
triangulation (as implemented by this algorithm) depends on the order of the
input points. The first four points will form a tetrahedron; other degenerate
points (relative to this initial tetrahedron) will not break it. Points that
are coincident (or nearly so) may be discarded by the algorithm. This is
because the Delaunay triangulation requires unique input points. You can
control the definition of coincidence with the "Tolerance" instance variable.
The output of the Delaunay triangulation is supposedly a convex hull. In
certain cases this implementation may not generate the convex hull. This
behavior can be controlled by the Offset instance variable. Offset is a
multiplier used to control the size of the initial triangulation. The larger
the offset value, the more likely you will generate a convex hull; and the more
likely you are to see numerical problems. The implementation of this algorithm
varies from the 2D Delaunay algorithm (i.e., Delaunay2D) in an important way.
When points are injected into the triangulation, the search for the enclosing
tetrahedron is quite different. In the 3D case, the closest previously inserted
point is found, and then the connected tetrahedra are searched to find
the containing one. (In 2D, a "walk" towards the enclosing triangle is
performed.) If the triangulation is Delaunay, then an enclosing tetrahedron
will be found. However, in degenerate cases an enclosing tetrahedron may not be
found and the point will be rejected.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input dataset to the
Delaunay 3D filter.
|

|
Accepts input of following types:
* vtkPointSet
|-
|'''Alpha''' (Alpha)
|
This property specifies the alpha (or distance) value to
control the output of this filter. For a non-zero alpha value, only
edges, faces, or tetra contained within the circumsphere (of radius
alpha) will be output. Otherwise, only tetrahedra will be
output.
|
0.0
|


|-
 
|'''Tolerance''' (Tolerance)
|
This property specifies a tolerance to control
discarding of closely spaced points. This tolerance is specified as a
fraction of the diagonal length of the bounding box of the
points.
|
0.001
|


|-
|'''Offset''' (Offset)
|
This property specifies a multiplier to control the size
of the initial, bounding Delaunay triangulation.
|
2.5
|


|-
|'''BoundingTriangulation''' (BoundingTriangulation)
|
This boolean controls whether bounding triangulation
points (and associated triangles) are included in the output. (These
are introduced as an initial triangulation to begin the triangulation
process. This feature is nice for debugging output.)
|
0
|
Accepts boolean values (0 or 1).
|-
|'''AlphaTets''' (AlphaTets)
|
This boolean controls whether tetrahedrons which satisfy
the alpha criterion output when alpha is non-zero.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''AlphaTris''' (AlphaTris)
|
This boolean controls whether triangles which satisfy
the alpha criterion output when alpha is non-zero.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''AlphaLines''' (AlphaLines)
|
This boolean controls whether lines which satisfy the
alpha criterion output when alpha is non-zero.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''AlphaVerts''' (AlphaVerts)
|
This boolean controls whether vertices which satisfy the
alpha criterion are output when alpha is non-zero.
|
0
|
Accepts boolean values (0 or 1).

|}
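A minimal pvpython sketch (the input file is hypothetical; any vtkPointSet works):

<source lang="python">
from paraview.simple import OpenDataFile, Delaunay3D

points = OpenDataFile('point_cloud.vtp')  # hypothetical point cloud
tets = Delaunay3D(Input=points)
tets.Alpha = 0.0               # 0.0 keeps the full tetrahedral mesh
tets.Tolerance = 0.001         # merge nearly coincident points
tets.Offset = 2.5
tets.BoundingTriangulation = 0
tets.UpdatePipeline()
</source>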


==Descriptive Statistics==


Compute a statistical model of a dataset and/or assess the dataset with a statistical model.
This filter either computes a statistical model of a dataset or takes
such a model as its second input. Then, the model (however it is
obtained) may optionally be used to assess the input dataset.
This filter computes the min, max, mean, raw moments M2 through M4,
standard deviation, skewness, and kurtosis for each array you
select. The model is simply a univariate Gaussian distribution
with the mean and standard deviation provided. Data is assessed using
this model by detrending the data (i.e., subtracting the mean) and then
dividing by the standard deviation. Thus the assessment is an array whose
entries are the number of standard deviations from the mean that each
input point lies.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
The input to the filter. Arrays from this dataset will
be used for computing statistics and/or assessed by a statistical
model.
|
|


|
Accepts input of following types:
* vtkImageData
* vtkStructuredGrid
* vtkPolyData
* vtkUnstructuredGrid
* vtkTable
* vtkGraph
The dataset must contain a field array ()


|-
|'''ModelInput''' (ModelInput)
|
A previously-calculated model with which to assess a
separate dataset. This input is optional.
|


|
Accepts input of following types:
* vtkTable
* vtkMultiBlockDataSet
|-
|'''AttributeMode''' (AttributeMode)
|
Specify which type of field data the arrays will be
drawn from.
|
0
|
The value must be field array name.
|-
|'''Variables of Interest''' (SelectArrays)
|
Choose arrays whose entries will be used to form
observations for statistical analysis.
|


|


|-
|'''Task''' (Task)
|
Specify the task to be performed: modeling and/or
assessment.
# "Detailed model of input data," creates a set of output tables containing a calculated statistical model of the '''entire''' input dataset;
# "Model a subset of the data," creates an output table (or tables) summarizing a '''randomly-chosen subset''' of the input dataset;
# "Assess the data with a model," adds attributes to the first input dataset using a model provided on the second input port; and
# "Model and assess the same data," is really just operations 2 and 3 above applied to the same input dataset. The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.
When the task includes creating a model (i.e., tasks 2 and 4), you may
adjust the fraction of the input dataset used for training. You should
avoid using a large fraction of the input data for training as you will
then not be able to detect overfitting. The ''Training fraction''
setting will be ignored for tasks 1 and 3.
|
3
|
The value(s) is an enumeration of the following:
* Detailed model of input data (0)
* Model a subset of the data (1)
* Assess the data with a model (2)
* Model and assess the same data (3)
|-
|'''TrainingFraction''' (TrainingFraction)
|
Specify the fraction of values from the input dataset to
be used for model fitting. The exact set of values is chosen at random
from the dataset.
|
0.1
|


|-
|'''Deviations should be''' (SignedDeviations)
|
Should the assessed values be signed deviations or
unsigned?
|
0
|
The value(s) is an enumeration of the following:
* Unsigned (0)
* Signed (1)


|}
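A minimal pvpython sketch, assuming the paraview.simple property names VariablesofInterest and AttributeMode; the Wavelet source provides the RTData array used as the observation variable:

<source lang="python">
from paraview.simple import Wavelet, DescriptiveStatistics

src = Wavelet()
stats = DescriptiveStatistics(Input=src)
stats.AttributeMode = 'Point Data'        # draw arrays from point data
stats.VariablesofInterest = ['RTData']    # arrays used as observations
stats.Task = 'Model and assess the same data'
stats.UpdatePipeline()                    # model tables on one port, assessed data on the other
</source>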


==Elevation==


Create point attribute array by projecting points onto an elevation vector.
The Elevation filter generates point scalar values for an
input dataset along a specified direction vector. The
Input menu allows the user to select the data set to which
this filter will be applied. Use the Scalar range entry
boxes to specify the minimum and maximum scalar value to
be generated. The Low Point and High Point define a line
onto which each point of the data set is projected. The
minimum scalar value is associated with the Low Point, and
the maximum scalar value is associated with the High
Point. The scalar value for each point in the data set is
determined by the location along the line to which that
point projects.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input dataset to the
Elevation filter.
|
|
Accepts input of following types:
* vtkDataSet
|-
|'''ScalarRange''' (ScalarRange)
|
This property determines the range into which scalars
will be mapped.
|
0 1
|


|-
|'''Low Point''' (LowPoint)
|
This property defines one end of the direction vector
(small scalar values).
|
0 0 0
|
The value must lie within the bounding box of the dataset.


It will default to the min in each dimension.


|-
|'''High Point''' (HighPoint)
|
This property defines the other end of the direction
vector (large scalar values).
|
0 0 1
|
The value must lie within the bounding box of the dataset.


It will default to the max in each dimension.



|}
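A minimal pvpython sketch mapping the points of a sphere onto an elevation scalar along the z axis:

<source lang="python">
from paraview.simple import Sphere, Elevation

src = Sphere(Radius=1.0)
elev = Elevation(Input=src)
elev.LowPoint = [0.0, 0.0, -1.0]   # maps to ScalarRange[0]
elev.HighPoint = [0.0, 0.0, 1.0]   # maps to ScalarRange[1]
elev.ScalarRange = [0.0, 1.0]
elev.UpdatePipeline()
</source>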


==Environment Annotation==


Allows annotation of user name, date/time, OS, and possibly filename.
Apply to any source. The GUI allows manual selection of the desired annotation options.
If the source is a file, can display the filename.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
Set the input of the filter.
|

|
Accepts input of following types:
* vtkDataObject
|-
|'''DisplayUserName''' (DisplayUserName)
|
|


The value must be one of the following: Assign cells uniquely (0), Duplicate cells (1), Divide cells (2)
Toggle User Name Visibility.
 


|
|
| '''Input'''<br>''(Input)'
0
|
Accepts boolean values (0 or 1).
|-
|'''DisplaySystemName''' (DisplaySystemName)
|


This property specifies the input to the D3 filter
Toggle System Name Visibility.


|
0
|
Accepts boolean values (0 or 1).
|-
|'''DisplayDate''' (DisplayDate)
|
|


The selected object must be the result of the following: sources (includes readers), filters
Toggle Date/Time Visibility.


|
0
|
Accepts boolean values (0 or 1).
|-
|'''DisplayFileName''' (DisplayFileName)
|


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet
Toggle File Name Visibility.
 


|
|
| '''Minimal Memory'''<br>''(UseMinimalMemory)'
0
 
|
If this property is set to 1, the D3 filter requires communication routines to use minimal memory than without this restriction
Accepts boolean values (0 or 1).
 
|-
|'''FileName''' (FileName)
|
Annotation of file name.
|
|
Only the values 0 and 1 are accepted


|
|




==Decimate=
|}

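A hedged pvpython sketch, assuming the filter is exposed in paraview.simple as <code>EnvironmentAnnotation</code> and that the Python property names match the table above:

<source lang="python">
# Minimal pvpython sketch (assumed proxy name: EnvironmentAnnotation).
from paraview.simple import Wavelet, EnvironmentAnnotation, Show, Render

src = Wavelet()
ann = EnvironmentAnnotation(Input=src)
ann.DisplayUserName = 1      # show the user name
ann.DisplayDate = 1          # show the date/time
ann.DisplaySystemName = 1    # show the OS name
ann.DisplayFileName = 0      # no file name for a non-file source
Show(ann)
Render()
</source>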

==Extract AMR Blocks==


This filter extracts a list of datasets from hierarchical datasets.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input to the Extract
Datasets filter.
|

|
Accepts input of following types:
* vtkUniformGridAMR
|-
|'''SelectedDataSets''' (SelectedDataSets)
|
This property provides a list of datasets to
extract.
|

|

|}

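A pvpython sketch, assuming the filter is exposed in paraview.simple as <code>ExtractAMRBlocks</code> and that <code>SelectedDataSets</code> takes dataset indices; the file name is hypothetical.

<source lang="python">
# Minimal pvpython sketch (assumed proxy name: ExtractAMRBlocks).
from paraview.simple import OpenDataFile, ExtractAMRBlocks, Show

amr = OpenDataFile('input.vth')       # hypothetical AMR dataset (vtkUniformGridAMR)
blocks = ExtractAMRBlocks(Input=amr)
blocks.SelectedDataSets = [0, 1, 2]   # indices of the datasets to extract (assumed form)
Show(blocks)
</source>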

==Extract Attributes==


Extract attribute data as a table. This is a
filter that produces a vtkTable from the chosen attribute
in the input data object. This filter can accept composite
datasets. If the input is a composite dataset, the output
is a multiblock with vtkTable leaves.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input of the
filter.
|


|
Accepts input of following types:
* vtkDataObject
|-
|'''FieldAssociation''' (FieldAssociation)
|
Select the attribute data to pass.
|
0
|
The value(s) is an enumeration of the following:
* Points (0)
* Cells (1)
* Field Data (2)
* Vertices (4)
* Edges (5)
* Rows (6)
|-
|'''AddMetaData''' (AddMetaData)
|
It is possible for this filter to add additional
meta-data to the field data such as point coordinates (when point
attributes are selected and input is pointset) or structured
coordinates etc. To enable this addition of extra information, turn
this flag on. Off by default.
|
0
|
Accepts boolean values (0 or 1).
|}

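A pvpython sketch, assuming the filter is exposed in paraview.simple as <code>ExtractAttributes</code>; enumerations such as FieldAssociation can usually be set by their text label.

<source lang="python">
# Minimal pvpython sketch (assumed proxy name: ExtractAttributes).
from paraview.simple import Wavelet, ExtractAttributes, CreateView, Show

src = Wavelet()
table = ExtractAttributes(Input=src)
table.FieldAssociation = 'Points'   # pass point data as table rows
table.AddMetaData = 1               # also add point coordinates etc. as columns
view = CreateView('SpreadSheetView')
Show(table, view)
</source>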

==Extract Bag Plots==


Extract Bag Plots.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input to the
filter.
|

|
Accepts input of following types:
* vtkTable
The dataset must contain a field array (row)

with 1 component(s).

|-
|'''Variables of Interest''' (SelectArrays)
|
Choose arrays whose entries will be used to form
observations for statistical analysis.
|


|
|-
|'''Process the transposed of the input table''' (TransposeTable)
|
This flag indicates if the input table must
be transposed first.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''RobustPCA''' (RobustPCA)
|
This flag indicates if the PCA should be run
in robust mode or not.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''HDR smoothing parameter''' (Sigma)
|
Specify the smoothing parameter of the
HDR.
|
1
|


|-
|'''GridSize''' (GridSize)
|
Width and height of the grid image to perform the PCA on.
|
100
|

|}

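A pvpython sketch, assuming the filter is exposed in paraview.simple as <code>ExtractBagPlots</code>; only properties whose labels contain no spaces are set, since those map directly to Python attribute names, and the CSV file name is hypothetical.

<source lang="python">
# Minimal pvpython sketch (assumed proxy name: ExtractBagPlots).
from paraview.simple import CSVReader, ExtractBagPlots

table = CSVReader(FileName=['observations.csv'])   # hypothetical input vtkTable
bags = ExtractBagPlots(Input=table)
bags.RobustPCA = 0     # classical (non-robust) PCA
bags.GridSize = 100    # resolution of the grid image used for the PCA/HDR
# Other properties (e.g. the transpose flag) can be set using the names
# shown in the table above.
</source>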

==Extract Block==


This filter extracts a range of blocks (groups) from a multiblock dataset.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input to the Extract Group
filter.
|

|
Accepts input of following types:
* vtkMultiBlockDataSet
|-
|'''BlockIndices''' (BlockIndices)
|
This property lists the ids of the blocks to extract
from the input multiblock dataset.
|


|

|-
|'''PruneOutput''' (PruneOutput)
|
When set, the output multiblock dataset will be pruned
to remove empty nodes. On by default.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''MaintainStructure''' (MaintainStructure)
|
This is used only when PruneOutput is ON. By default,
when pruning the output i.e. remove empty blocks, if node has only 1
non-null child block, then that node is removed. To preserve these
parent nodes, set this flag to true.
|
0
|
Accepts boolean values (0 or 1).

|}

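A pvpython sketch, assuming the filter is exposed in paraview.simple as <code>ExtractBlock</code> with the <code>BlockIndices</code> property documented above (newer ParaView versions select blocks differently); the file name is hypothetical.

<source lang="python">
# Minimal pvpython sketch (assumed proxy name: ExtractBlock).
from paraview.simple import OpenDataFile, ExtractBlock, Show

mb = OpenDataFile('input.vtm')   # hypothetical multiblock dataset
eb = ExtractBlock(Input=mb)
eb.BlockIndices = [1, 2]         # flat indices of the blocks/groups to keep
eb.PruneOutput = 1               # drop empty nodes from the output tree
eb.MaintainStructure = 0         # allow collapsing single-child nodes
Show(eb)
</source>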

==Extract CTH Parts==


Create a surface from a CTH volume fraction. Extract
CTH Parts is a specialized filter for visualizing the data
from a CTH simulation. It first converts the selected
cell-centered arrays to point-centered ones. It then
contours each array at a value of 0.5. The user has the
option of clipping the resulting surface(s) with a plane.
This filter only operates on unstructured data. It
produces polygonal output.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input to the Extract CTH
Parts filter.
|

|
Accepts input of following types:
* vtkDataSet
The dataset must contain a field array (cell)

with 1 component(s).

|-
|'''Clip Type''' (ClipPlane)
|
This property specifies whether to clip the dataset, and
if so, it also specifies the parameters of the plane with which to
clip.
|


|
The value can be one of the following:
* None (implicit_functions)


* Plane (implicit_functions)


|-
|'''Volume Arrays''' (VolumeArrays)
|
This property specifies the name(s) of the volume
fraction array(s) for generating parts.
|

|
An array of scalars is required.
|-
|'''Volume Fraction Value''' (VolumeFractionSurfaceValue)
|
The value of this property is the volume fraction value
for the surface.
|
0.1
|


|-
|'''CapSurfaces''' (CapSurfaces)
|
When enabled, volume surfaces are capped to produce a visually
closed surface.
|
1
|
Accepts boolean values (0 or 1).
|-
|'''RemoveGhostCells''' (RemoveGhostCells)
|


When set to false, the output surfaces will not hide contours
extracted from ghost cells. This results in overlapping contours but
overcomes holes. Default is set to true.


|
1
|
Accepts boolean values (0 or 1).
|-
|'''GenerateTriangles''' (GenerateTriangles)
|
Triangulate results. When set to false, the internal cut and contour filters
are told not to triangulate results if possible.
|
0
|
Accepts boolean values (0 or 1).

|}

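A pvpython sketch, assuming the filter is exposed in paraview.simple as <code>ExtractCTHParts</code>; the file and array names are hypothetical and should be replaced with those in your CTH dataset.

<source lang="python">
# Minimal pvpython sketch (assumed proxy name: ExtractCTHParts).
from paraview.simple import OpenDataFile, ExtractCTHParts, Show

cth = OpenDataFile('simulation.spcth')             # hypothetical CTH output
parts = ExtractCTHParts(Input=cth)
parts.VolumeArrays = ['Material volume fraction']  # hypothetical cell array(s) defining the parts
parts.CapSurfaces = 1                              # produce visually closed surfaces
Show(parts)
</source>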

==Extract Cells By Region==


This filter extracts cells that are inside/outside a region or at a region boundary.
This filter extracts from its input dataset all cells that are either
completely inside or outside of a specified region (implicit function).
On output, the filter generates an unstructured grid. To use this filter
you must specify a region (implicit function). You must also specify
whether to extract cells lying inside or outside of the region. An
option exists to extract cells that are neither inside nor outside (i.e.,
boundary).


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input to the Extract Cells
By Region filter.
|

|
Accepts input of following types:
* vtkDataSet
|-
|'''Intersect With''' (ImplicitFunction)
|
This property sets the region used to extract
cells.
|


|
The value can be one of the following:
* Plane (implicit_functions)


* Box (implicit_functions)


* Sphere (implicit_functions)


|-
|'''InputBounds''' (InputBounds)
|
|
|

|-
|'''Extraction Side''' (ExtractInside)
|
This parameter controls whether to extract cells that
are inside or outside the region.
|
1
|
The value(s) is an enumeration of the following:
* outside (0)
* inside (1)
|-
|'''Extract only intersected''' (Extract only intersected)
|
This parameter controls whether to extract only cells
that are on the boundary of the region. If this parameter is set, the
Extraction Side parameter is ignored. If Extract Intersected is off,
this parameter has no effect.
|
0
|
Accepts boolean values (0 or 1).
|-
|'''Extract intersected''' (Extract intersected)
|
This parameter controls whether to extract cells that
are on the boundary of the region.
|
0
|
Accepts boolean values (0 or 1).

|}

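A pvpython sketch, assuming the filter is exposed in paraview.simple as <code>ExtractCellsByRegion</code> and that the implicit function and enumeration can be set by name, as is usual in paraview.simple.

<source lang="python">
# Minimal pvpython sketch (assumed proxy name: ExtractCellsByRegion).
from paraview.simple import Wavelet, ExtractCellsByRegion, Show

src = Wavelet()
region = ExtractCellsByRegion(Input=src)
region.IntersectWith = 'Sphere'      # use a sphere as the implicit region
region.IntersectWith.Radius = 5.0    # parameters of the sphere
region.ExtractionSide = 'inside'     # keep cells completely inside the sphere
Show(region)
</source>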

==Extract Component==


This filter extracts a component of a multi-component attribute array.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input of the Extract Component filter.
|

|
Accepts input of following types:
* vtkDataSet
The dataset must contain a field array ()

|-
|'''Input Array''' (SelectInputArray)
|
This property indicates the name of the array to be extracted.
|

|
The value must be a field array name.
|-
|'''Component''' (Component)
|


This property indicates the component of the array to be extracted.


|
0
|

|-
|'''Output Array Name''' (OutputArrayName)
|
This property indicates the name of the output scalar array.
|
Result
|

|}
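
A pvpython sketch, assuming the filter is exposed in paraview.simple as <code>ExtractComponent</code> with Python property names matching the table above; the Wavelet source's <code>RTData</code> array is used as an example input array.

<source lang="python">
# Minimal pvpython sketch (assumed proxy name: ExtractComponent).
from paraview.simple import Wavelet, ExtractComponent, Show

src = Wavelet()
comp = ExtractComponent(Input=src)
comp.InputArray = ['POINTS', 'RTData']   # (association, array name) to extract from
comp.Component = 0                       # which component of the array to pull out
comp.OutputArrayName = 'Result'          # name of the new single-component array
Show(comp)
</source>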


==Extract Edges==


Extract edges of 2D and 3D cells as lines. The Extract Edges
filter produces a wireframe version of the input dataset
by extracting all the edges of the dataset's cells as
lines. This filter operates on any type of data set and
produces polygonal output.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
This property specifies the input to the Extract Edges
filter.
|


|
Accepts input of following types:
* vtkDataSet


|}

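A pvpython sketch, assuming the filter is exposed in paraview.simple as <code>ExtractEdges</code>; it takes no parameters beyond its input.

<source lang="python">
# Minimal pvpython sketch (assumed proxy name: ExtractEdges).
from paraview.simple import Sphere, ExtractEdges, Show, Render

sphere = Sphere()
wire = ExtractEdges(Input=sphere)   # polygonal wireframe of the input cells' edges
Show(wire)
Render()
</source>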

==Extract Generic Dataset Surface==


Extract geometry from a higher-order
dataset.

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''


|-
|'''Input''' (Input)
|
Set the input to the Generic Geometry
Filter.
|