Hi,

I'm skeptical of the following claim made in the vtkStreamTracer
documentation about increased numerical accuracy when using a
normalized velocity field. It's not true when |V| << 1, even when
using adaptive step size methods. It's also not immediately clear how
the normalization impacts the error estimation algorithm employed in
the adaptive step methods. The same field normalization technique is
used in ParaView's surface LIC implementation.

I can see how normalizing the field makes the stream tracer/surface
LIC easy to use: it narrows the range of input parameters one needs
to adjust to make them work on a wide variety of datasets, and it
makes it easy to get streamlines that travel consistently through the
dataset. Unfortunately, normalizing the vector field changes the
relationships between flow features, which are defined by variations
in the flow. These changes give one a false sense of the flow
behavior during visualization, feature detection, and analysis. I
think that field normalization during integration should be optional,
and should probably be disabled by default.

A few illustrative examples follow.

Burlen

From the vtkStreamTracer documentation:

> Note that normalized vectors are adopted in streamline integration,
> which achieves high numerical accuracy/smoothness of flow lines that is
> particularly guaranteed for Runge-Kutta45 with adaptive step size and
> error control).

What happens when you normalize the velocity field during streamline
integration? For the sake of discussion, consider a concrete example
of a shear flow (like a strong wind over the surface of the ocean),
with:

> V(y>0) = (1000.0, 2000.0)
> V(y<0) = (0.001, 0.002)
>
> let W = V/|V|, then
>
> W(y>0) = (0.4472, 0.8944)
> W(y<0) = (0.4472, 0.8944)

The defining characteristic of the flow, namely the sharp velocity
differential across an interface, is lost.

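Here's the same thing as a quick sketch (NumPy is used purely for
illustration; the layer velocities are the made-up values from
above):

    import numpy as np

    # The two layers of the hypothetical shear flow.
    v_fast = np.array([1000.0, 2000.0])   # V(y>0)
    v_slow = np.array([0.001, 0.002])     # V(y<0)

    # Normalize each sample: W = V/|V|
    w_fast = v_fast / np.linalg.norm(v_fast)
    w_slow = v_slow / np.linalg.norm(v_slow)

    print(w_fast)   # ~[0.4472 0.8944]
    print(w_slow)   # ~[0.4472 0.8944] -- identical; the six orders of
                    # magnitude of shear across the interface are gone.
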
Here's an example from a real dataset where normalizing the vector
field "amplifies" an insignificant feature (|V| ~ 1e-5) in the flow.
The relative differences between features in the visualization are
completely lost.
http://www.hpcvis.com/downloads/lic-ui-mag-200-anno.png

Glyphing reveals the true relationships:
http://www.hpcvis.com/downloads/glyph-ui-scale-by-2000-200.png

The claim that normalizing the field during integration increases
numerical accuracy is false. When |V| << 1, exactly the opposite
occurs: the effect of normalization is similar to computing the
streamline with an increased step size, although the "growth factor"
is dependent on the direction. The amount of harm done by this
increase depends on how much smaller |V| is than 1. Because of
variations in the field across a dataset, the accuracy of the
computed streamlines varies in a way that's not easily identified and
accounted for. For the sake of discussion, consider the following
simple example using the Euler method which, although it exaggerates
the error, shares properties with the other RK methods.

> dt = 0.1
> x_0 = (0, 0)
> V(x_0) = (0.001, 0.002)
> W(x_0) = V(x_0)/|V(x_0)| = (0.4472, 0.8944)
> x_1 = x_0 + V(x_0)*dt = (0.0001, 0.0002)
> x_1 = x_0 + W(x_0)*dt = (0.0447, 0.0894)

The effective step size taken when normalizing the field is orders of
magnitude larger. Note that using an adaptive method doesn't prevent
the overstep, and it's not clear how normalizing the field changes
the behavior of the error estimator.

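And here's that single Euler step sketched the same way, which makes
the effective step size explicit (same numbers as above):

    import numpy as np

    dt = 0.1
    x0 = np.array([0.0, 0.0])
    v = np.array([0.001, 0.002])   # V(x_0)
    w = v / np.linalg.norm(v)      # W(x_0) = V(x_0)/|V(x_0)|

    x1_v = x0 + v * dt             # step with the raw field
    x1_w = x0 + w * dt             # step with the normalized field

    print(x1_v)   # ~[0.0001 0.0002]
    print(x1_w)   # ~[0.0447 0.0894]

    # The normalized step is 1/|V(x_0)| times longer, here ~447x,
    # i.e. the effective step size is dt/|V|, not dt.
    print(np.linalg.norm(x1_w - x0) / np.linalg.norm(x1_v - x0))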