[Insight-developers] Changes in Optimizers

Luis Ibanez luis.ibanez@kitware.com
Sun, 24 Feb 2002 04:44:08 -0500


Hi,


The changes in optimizers are almost ready.
An experimental Build is on its way to the dashboard.

These changes have not been checked in yet.

---

Optimizers will no longer be templated over the
CostFunction (which is the Metric for registration).

A classical hierarchy of base classes and virtual
functions has been added for both Optimizers and
CostFunctions (Metrics will derive from
SingleValuedCostFunction).


The hierarchy will look as follows:

CostFunction
\ SingleValuedCostFunction
\ MultipleValuedCostFunction
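
The idea of this branch of the hierarchy can be sketched as below.
This is a minimal, simplified stand-in (not the actual ITK classes
or signatures): a non-templated CostFunction base with virtual
methods, so optimizers can hold any cost function through a
base-class pointer instead of a template parameter.

```cpp
#include <cassert>
#include <vector>

// Simplified stand-in for the parameters type used by the real classes.
using ParametersType = std::vector<double>;

// Common base: non-templated, queried through virtual functions.
class CostFunction {
public:
  virtual ~CostFunction() = default;
  virtual unsigned int GetNumberOfParameters() const = 0;
};

// Single-valued cost functions (e.g. registration metrics) return one scalar.
class SingleValuedCostFunction : public CostFunction {
public:
  virtual double GetValue(const ParametersType & parameters) const = 0;
};

// Multiple-valued cost functions return a vector of residuals.
class MultipleValuedCostFunction : public CostFunction {
public:
  virtual std::vector<double> GetValue(const ParametersType & parameters) const = 0;
};

// Example metric for illustration: sum of squared parameters.
class SumOfSquares : public SingleValuedCostFunction {
public:
  unsigned int GetNumberOfParameters() const override { return 2; }
  double GetValue(const ParametersType & p) const override {
    return p[0] * p[0] + p[1] * p[1];
  }
};
```

An optimizer can then accept a `CostFunction *` at run time, which
is what removes the template parameter.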



Optimizer
\NonLinearOptimizer
\\SingleValuedNonLinearOptimizer
\\\GradientDescentOptimizer
\\\\QuaternionTransformGradientDescentOptimizer
\\\RegularStepGradientDescentBaseOptimizer
\\\\RegularStepGradientDescentOptimizer
\\\\VersorTransformOptimizer
\\\SingleValuedNonLinearVnlOptimizer
\\\\AmoebaOptimizer
\\\\ConjugateGradientOptimizer
\\\\LBFGSOptimizer
\\MultipleValuedNonLinearOptimizer
\\\MultipleValuedNonLinearVnlOptimizer
\\\\LevenbergMarquardtOptimizer
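

With virtual dispatch in place, connecting a cost function to an
optimizer becomes a run-time call rather than a template
instantiation. The sketch below uses simplified, hypothetical names
(not the real ITK signatures) to show a plain gradient-descent
optimizer storing its cost function as a base-class pointer set via
SetCostFunction():

```cpp
#include <cassert>
#include <vector>

using ParametersType = std::vector<double>;

// Simplified stand-in for the single-valued cost function interface.
class SingleValuedCostFunction {
public:
  virtual ~SingleValuedCostFunction() = default;
  virtual double GetValue(const ParametersType &) const = 0;
  virtual ParametersType GetDerivative(const ParametersType &) const = 0;
};

// Non-templated optimizer: the cost function is plugged in at run time.
class GradientDescentOptimizer {
public:
  void SetCostFunction(SingleValuedCostFunction * f) { m_CostFunction = f; }
  void SetLearningRate(double rate) { m_Rate = rate; }

  // Take a fixed number of plain gradient-descent steps.
  ParametersType Optimize(ParametersType p, int iterations) const {
    for (int i = 0; i < iterations; ++i) {
      const ParametersType g = m_CostFunction->GetDerivative(p);
      for (std::size_t j = 0; j < p.size(); ++j) {
        p[j] -= m_Rate * g[j];
      }
    }
    return p;
  }

private:
  SingleValuedCostFunction * m_CostFunction = nullptr;
  double m_Rate = 0.1;
};

// Example cost function: quadratic bowl with its minimum at (1, 2).
class Quadratic : public SingleValuedCostFunction {
public:
  double GetValue(const ParametersType & p) const override {
    return (p[0] - 1) * (p[0] - 1) + (p[1] - 2) * (p[1] - 2);
  }
  ParametersType GetDerivative(const ParametersType & p) const override {
    return { 2 * (p[0] - 1), 2 * (p[1] - 2) };
  }
};
```

Any metric derived from the single-valued base can be handed to any
single-valued optimizer this way, without recompiling the optimizer
for each metric type.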


A set of adaptors will still be used to wrap
the vnl optimizers.
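

The adaptor idea can be sketched as follows. All names here are
simplified, hypothetical stand-ins (not the actual vnl or ITK
classes): the vnl optimizers expect a vnl_cost_function-style
interface, so an internal adaptor presents that interface and
forwards each evaluation to the ITK-style cost function.

```cpp
#include <cassert>
#include <vector>

using ParametersType = std::vector<double>;

// ITK-style interface (simplified).
class SingleValuedCostFunction {
public:
  virtual ~SingleValuedCostFunction() = default;
  virtual double GetValue(const ParametersType &) const = 0;
};

// Stand-in for the interface a vnl optimizer expects
// (roughly the role of vnl_cost_function).
class VnlCostFunctionInterface {
public:
  virtual ~VnlCostFunctionInterface() = default;
  virtual double f(const ParametersType & x) = 0;
};

// Adaptor: exposes the vnl-style interface, delegates to the ITK-style one.
class CostFunctionAdaptor : public VnlCostFunctionInterface {
public:
  explicit CostFunctionAdaptor(SingleValuedCostFunction * cf)
    : m_CostFunction(cf) {}
  double f(const ParametersType & x) override {
    return m_CostFunction->GetValue(x);
  }
private:
  SingleValuedCostFunction * m_CostFunction;
};

// Example cost function for demonstration: squared Euclidean norm.
class Norm2 : public SingleValuedCostFunction {
public:
  double GetValue(const ParametersType & p) const override {
    double s = 0.0;
    for (double v : p) { s += v * v; }
    return s;
  }
};
```

This keeps the vnl-specific plumbing hidden inside the
*VnlOptimizer classes while the user only sees the ITK interface.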


The only optimizer still resisting assimilation is
the evolutionary OnePlusOne, which happens to
be templated over a statistical class as well. It is used
in the MRIBias correction filters, so those will
show errors in the experimental build.


Luis