[Paraview-developers] normalizing vector field during integration

burlen burlen.loring at gmail.com
Wed Oct 3 21:25:02 EDT 2012


Hi,

I doubt the following claim made in the vtkStreamTracer documentation 
about increased numerical accuracy when integrating a normalized velocity 
field. It's not true when |V|<<1, even when using adaptive step size 
methods, and it's not immediately clear how the normalization impacts 
the error estimation algorithm employed by the adaptive step methods. 
The same field normalization technique is also used in ParaView's surface 
LIC implementation.

I can see how normalizing the field makes the stream tracer and surface 
LIC easy to use: it narrows the range of input parameters one needs to 
adjust to make them work on a wide variety of datasets, and it makes it 
easy to get streamlines that travel consistently through the dataset. 
Unfortunately, normalizing the vector field changes the relationships 
between flow features, which are defined by variations in the flow. These 
changes give one a false sense of the flow's behavior during visualization, 
feature detection, and analysis. I think field normalization during 
integration should be optional, and should probably be disabled by default.

A few illustrative examples follow.

Burlen

from the vtkStreamTracer documentation:

 > Note that normalized vectors are adopted in streamline integration,
 > which achieves high numerical accuracy/smoothness of flow lines that is
 > particularly guaranteed for Runge-Kutta45 with adaptive step size and
 > error control).

What happens when you normalize the velocity field during streamline 
integration? For the sake of discussion, consider a concrete example of a 
shear flow (like a strong wind over the surface of the ocean), with:

 > V(y>0)=1000.0,2000.0
 > V(y<0)=0.001,0.002
 >
 > let W=V/|V| then
 >
 > W(y>0)=0.4472,0.8944
 > W(y<0)=0.4472,0.8944

The defining characteristic of the flow, namely the sharp velocity 
differential across the interface, is lost.
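
A minimal sketch of this example in plain Python (the normalize helper 
is mine, not VTK code):

    import math

    def normalize(v):
        # Per-sample field normalization: return v / |v|.
        mag = math.sqrt(v[0] * v[0] + v[1] * v[1])
        return (v[0] / mag, v[1] / mag)

    v_fast = (1000.0, 2000.0)  # V(y>0), the strong-wind side
    v_slow = (0.001, 0.002)    # V(y<0), the slow side

    print(normalize(v_fast))   # (0.4472..., 0.8944...)
    print(normalize(v_slow))   # (0.4472..., 0.8944...), identical

Both sides of the interface map to the same unit vector, so the six 
orders of magnitude of shear vanish.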

Here's an example from a real dataset where normalizing the vector field 
"amplifies" an insignificant feature (|V| ~ 1e-5) in the flow. The 
relative differences between features in the visualization are 
completely lost:
http://www.hpcvis.com/downloads/lic-ui-mag-200-anno.png
(lic-ui-200)

Glyphing reveals the true relationships:
http://www.hpcvis.com/downloads/glyph-ui-scale-by-2000-200.png
(glyph-ui-200)


The claim that normalizing the field during integration increases 
numerical accuracy is false. When |V|<<1, exactly the opposite occurs. 
The effect of normalization is similar to computing the streamline with 
an increased step size, although the "growth factor" depends on the 
direction. The amount of harm done by this increase depends on how 
much smaller |V| is than 1. Because of variations in the field 
across a dataset, the accuracy of the computed streamlines varies in a 
way that's not easily identified and accounted for. For the sake of 
discussion, consider the following simple example using the Euler method 
which, although it exaggerates the error, shares this property with the 
other RK methods.

 > dt=0.1
 > x_0=0,0
 > V(x_0)=0.001,0.002
 > W(x_0)=V(x_0)/|V(x_0)|=0.4472,0.8944
 > x_1 = x_0 + V(x_0)dt = 0.0001,0.0002 (unnormalized)
 > x_1 = x_0 + W(x_0)dt = 0.0447,0.0894 (normalized)

The effective step size taken when normalizing the field is orders of 
magnitude larger. Note that using an adaptive method doesn't prevent the 
overstep, and it's not clear how normalizing the field changes the 
behavior of the error estimator.
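
The overstep is easy to reproduce in plain Python (a sketch; the 
euler_step helper is mine, not vtkStreamTracer code):

    import math

    def euler_step(x, v, dt, normalized=False):
        # One forward Euler update x + v*dt, optionally normalizing v first.
        if normalized:
            mag = math.sqrt(v[0] * v[0] + v[1] * v[1])
            v = (v[0] / mag, v[1] / mag)
        return (x[0] + v[0] * dt, x[1] + v[1] * dt)

    dt = 0.1
    x0 = (0.0, 0.0)
    v = (0.001, 0.002)  # |V| << 1

    print(euler_step(x0, v, dt))                   # (0.0001, 0.0002)
    print(euler_step(x0, v, dt, normalized=True))  # (0.0447..., 0.0894...)

The normalized update moves the particle 1/|V| ~ 447 times farther in a 
single step.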
