[Insight-developers] Gaussian filter performance: Ver 2.0
Miller, James V (Research)
millerjv@crd.ge.com
Tue, 13 May 2003 08:40:20 -0400
I missed the conversation on the tcon about this topic, but
there is something else to keep in mind when choosing
between these two methods.
The discrete gaussian uses a convolution kernel specifically
designed such that smoothing and derivative computations that
commute before discretization also commute after discretization.
If you smooth an image with a standard gaussian kernel, and then
try to compute derivatives using finite differences, the computations
are polluted with a large amount of numerical imprecision. This
is why people normally convolve with a derivative of a gaussian
instead of taking finite differences on a gaussian smoothed image.
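For concreteness, a derivative-of-gaussian pass in ITK is usually built by
chaining itk::RecursiveGaussianImageFilter instances, one per axis, with
first-order differentiation only on the axis of interest. The sketch below
is mine, not code from this thread; the function name, image type, and
sigma handling are placeholders.

#include "itkImage.h"
#include "itkRecursiveGaussianImageFilter.h"

using ImageType    = itk::Image<float, 3>;
using GaussianType = itk::RecursiveGaussianImageFilter<ImageType, ImageType>;

// First derivative in X of a gaussian-smoothed volume: one IIR pass per
// axis, first-order differentiation on X, plain smoothing on Y and Z.
ImageType::Pointer DerivativeOfGaussianX(ImageType::Pointer input, double sigma)
{
  auto gx = GaussianType::New();
  gx->SetInput(input);
  gx->SetDirection(0);   // X
  gx->SetSigma(sigma);
  gx->SetFirstOrder();   // convolve with the derivative of the gaussian

  auto gy = GaussianType::New();
  gy->SetInput(gx->GetOutput());
  gy->SetDirection(1);   // Y
  gy->SetSigma(sigma);
  gy->SetZeroOrder();    // smoothing only

  auto gz = GaussianType::New();
  gz->SetInput(gy->GetOutput());
  gz->SetDirection(2);   // Z
  gz->SetSigma(sigma);
  gz->SetZeroOrder();

  gz->Update();
  return gz->GetOutput();
}

Every additional derivative image needs its own three-pass chain like that one.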
The discrete gaussian approach smooths with a modified kernel
such that you can use finite differences for the derivative
calculations without introducing that numerical noise.
This is important when you need to calculate a large number
of derivatives. Consider a volume where you need the smoothed
image, the 3 first derivatives, and the various second and
cross derivatives. If you convolve with a derivative of a gaussian,
you wind up doing
Smooth X : Smooth Y : Smooth Z
Smooth Y : Smooth Z : Smooth first derivative X
Smooth X : Smooth Z : Smooth first derivative Y
Smooth X : Smooth Y : Smooth first derivative Z
Smooth Z : Smooth first derivative X : Smooth first derivative Y
Smooth Y : Smooth first derivative X : Smooth first derivative Z
Smooth X : Smooth first derivative Y : Smooth first derivative Z
Smooth Y : Smooth Z : Smooth second derivative X
Smooth X : Smooth Z : Smooth second derivative Y
Smooth X : Smooth Y : Smooth second derivative Z
Some of these smoothing stages can be reused. But it is still
a lot of convolutions with a large kernel.
In the discrete gaussian case, you can just smooth in X, Y, and Z
and then take your standard finite difference derivatives,
which use much smaller kernels.
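A rough sketch of that route, with wiring of my own rather than anything
from the thread: smooth once with itk::DiscreteGaussianImageFilter, then
pull each derivative out of the smoothed volume with the small
finite-difference kernels of itk::DerivativeImageFilter.

#include "itkImage.h"
#include "itkDiscreteGaussianImageFilter.h"
#include "itkDerivativeImageFilter.h"

using ImageType = itk::Image<float, 3>;

// Smooth once; reuse the smoothed volume for every derivative.
ImageType::Pointer SmoothDiscrete(ImageType::Pointer input, double sigma)
{
  auto smoother =
    itk::DiscreteGaussianImageFilter<ImageType, ImageType>::New();
  smoother->SetInput(input);
  smoother->SetVariance(sigma * sigma);  // this filter takes a variance, not a sigma
  smoother->SetMaximumError(0.01);
  smoother->Update();
  return smoother->GetOutput();
}

// Each first or second derivative is then a cheap small-kernel pass.
ImageType::Pointer FiniteDifference(ImageType::Pointer smoothed,
                                    unsigned int direction, unsigned int order)
{
  auto deriv = itk::DerivativeImageFilter<ImageType, ImageType>::New();
  deriv->SetInput(smoothed);
  deriv->SetDirection(direction);  // 0 = X, 1 = Y, 2 = Z
  deriv->SetOrder(order);          // 1 = first derivative, 2 = second
  deriv->Update();
  return deriv->GetOutput();
}

Cross derivatives are just two such finite-difference passes along different
axes, and the one smoothed volume is reused for all of them.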
Depending on how many derivatives you need (and the dimension
of your problem), you may want to use the DiscreteGaussian.
But for "large" kernels or just a few derivatives, the recursive
gaussian may yield better performance.
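For anyone who wants to reproduce the kind of timing/precision comparison
Paul describes in the message below, the skeleton could look roughly like
this. The probes and error metrics are my guess at a setup, not his actual
test code; the DGIF runs at different MaximumError values would be wired
the same way as in the earlier sketch.

#include "itkImage.h"
#include "itkSmoothingRecursiveGaussianImageFilter.h"
#include "itkSubtractImageFilter.h"
#include "itkStatisticsImageFilter.h"
#include "itkTimeProbe.h"
#include <algorithm>
#include <cmath>
#include <iostream>

using ImageType = itk::Image<float, 3>;

// Time one recursive-gaussian smoothing run at a given sigma.
ImageType::Pointer TimedRecursiveSmooth(ImageType::Pointer input, double sigma)
{
  auto rgif =
    itk::SmoothingRecursiveGaussianImageFilter<ImageType, ImageType>::New();
  rgif->SetInput(input);
  rgif->SetSigma(sigma);

  itk::TimeProbe probe;
  probe.Start();
  rgif->Update();
  probe.Stop();
  std::cout << "RGIF, sigma " << sigma << ": "
            << probe.GetTotal() << " s" << std::endl;
  return rgif->GetOutput();
}

// Report bias and worst-case error of a test image against a reference.
void CompareToReference(ImageType::Pointer test, ImageType::Pointer reference)
{
  auto diff =
    itk::SubtractImageFilter<ImageType, ImageType, ImageType>::New();
  diff->SetInput1(test);
  diff->SetInput2(reference);

  auto stats = itk::StatisticsImageFilter<ImageType>::New();
  stats->SetInput(diff->GetOutput());
  stats->Update();

  const double worst = std::max(std::abs(double(stats->GetMinimum())),
                                std::abs(double(stats->GetMaximum())));
  std::cout << "bias " << stats->GetMean()
            << "  max abs error " << worst << std::endl;
}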
> -----Original Message-----
> From: Paul Yushkevich [mailto:pauly@cognitica.com]
> Sent: Monday, May 12, 2003 5:47 PM
> Cc: insight-developers@public.kitware.com
> Subject: [Insight-developers] Gaussian filter performance: Ver 2.0
>
>
> I ran a more comprehensive test of Gaussian filter performance.
> Here is the experiment that I ran:
>
> For a range of values of scale (meaning standard deviation), I compared
> the computation and output of DiscreteGaussianImageFilter (DGIF) and
> RecursiveGaussianImageFilter (RGIF), as applied to a 3D volume. A range
> of DGIF's MaximumError values was used.
>
> For each scale, I computed what I considered the 'ground truth' Gaussian
> convolution of the image. For this ground truth I used DGIF with the
> lowest MaximumError value that did not generate exceptions, i.e., whose
> kernel still fit into a 32x32x32 pixel box. I then computed convolutions
> using DGIF with larger error rates and using the RGIF.
>
> I compared the result of each such convolution to the 'ground truth'
> image for its scale. I recorded the mean squared error, the maximum
> absolute error, and the bias of the result w.r.t. the ground truth.
> The attached file contains a table of results.
>
> The punchline (at least for the image that I used) is this:
>
> *** For sigma < 1 pixel, the RGIF should not be used
>
> *** For sigma >= 3 pixels, the RGIF gives errors on the same order as
> the DGIF with MaximumError = 0.01, and RGIF is an order of magnitude
> faster, so unless great precision is required, the RGIF should be used.
>
> *** For sigma between 1 and 3 pixels, there is a tradeoff between RGIF
> and DGIF in terms of speed and accuracy.
>
> Sorry, I did not use the VTK Gaussian in this experiment. My intuition
> from the last round of experiments is that it performs more or less as
> well as the DGIF for the same error rates.
>
> Paul.
>