[Insight-users] Doubt about scales parameters in registration
Luis Ibanez
luis.ibanez at kitware.com
Tue May 2 18:54:13 EDT 2006
Hi Jose,
The parameter scaling may have to be changed from one pair
of Fixed/Moving images to another.
The image characteristics that will influence your choice
of the parameter scaling are:
1) How much translation (in mm) do you expect to need
2) How much rotation (in radians) do you expect to need
Notice that the estimation of the translation is not trivial
when rotations are involved, in particular if you don't
set the center of rotation at the center of the image.
The reason is that a rotation may be perceived as producing
a translation of the image when the center of rotation is
at one of the image corners.
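If it helps, here is a minimal sketch of centering the rotation with
itk::CenteredTransformInitializer. The image types and reader variables
are just placeholders for whatever you use in your own pipeline:

  typedef itk::VersorRigid3DTransform< double >        TransformType;
  typedef itk::CenteredTransformInitializer<
                   TransformType,
                   FixedImageType,
                   MovingImageType >                    InitializerType;

  TransformType::Pointer   transform   = TransformType::New();
  InitializerType::Pointer initializer = InitializerType::New();

  initializer->SetTransform(   transform );
  initializer->SetFixedImage(  fixedImageReader->GetOutput() );
  initializer->SetMovingImage( movingImageReader->GetOutput() );
  initializer->GeometryOn();   // use the geometric centers of the images
  initializer->InitializeTransform();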
The bottom line is that the scaling parameters are designed
to give you a mechanism for telling the optimizer that
rotations should move in steps of 0.01 while translations
should move in steps of 10, just to cite an example.
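As a rough sketch, and assuming a 3D rigid transform with three rotation
parameters followed by three translations, you could express that example
by taking the inverse of the step you expect along each parameter
(remember that a higher scale produces a shorter step along that parameter):

  OptimizerType::ScalesType scales( transform->GetNumberOfParameters() );
  // rotations (radians): steps of about 0.01
  scales[0] = 1.0 / 0.01;
  scales[1] = 1.0 / 0.01;
  scales[2] = 1.0 / 0.01;
  // translations (mm): steps of about 10
  scales[3] = 1.0 / 10.0;
  scales[4] = 1.0 / 10.0;
  scales[5] = 1.0 / 10.0;
  optimizer->SetScales( scales );

The ratio between the scales is what matters most; the overall step size
is then controlled by the optimizer's own step length or radius, as
discussed further below.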
Regards,
Luis
==============================
José Santamaría López wrote:
> Hi Luis,
>
>
> Luis Ibanez
>
>>Hi Jerome,
>>
>>
>>The way the parameter scaling is used by every optimizer depends
>>on the strategy of the specific optimizer.
>>
>
>
> Does the parameter scaling also depend on the particular
> pair of images being registered? Should I change it every
> time in order to obtain good results?
>
> Thanks,
>
> Jose.
>
>
>>In all cases, however, the goal is to make the dynamic ranges of the
>>dimensions of the parametric space uniform.
>>
>>
>>
>>You will see that in the case of the GradientDescent optimizers,
>>the scale parameters are used for "dividing" the Gradient values,
>>
>>
>>   e.g.: itkRegularStepGradientDescentBaseOptimizer.cxx, line 205
>>
>>
>>which results in shorter steps being taken along the directions of
>>parameters that have high scaling values, and longer steps along
>>the parameters that have low scaling values. In the typical case of
>>a rigid transform, this means that you want to put high values on the
>>rotation scaling-parameters and low values on the translation
>>scaling-parameters.
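>>
>>As a toy illustration (this is not the actual ITK code, just the idea
>>of what happens around that line):
>>
>>  for ( unsigned int i = 0; i < numberOfParameters; i++ )
>>    {
>>    transformedGradient[i] = gradient[i] / scales[i];
>>    // large scales[i]  -> small component -> short step along parameter i
>>    // small scales[i]  -> large component -> long  step along parameter i
>>    }
>>  // the optimizer then steps along transformedGradient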
>>
>>
>>
>>In the case of the OnePlusOne optimizer, the scale parameters are used
>>for dividing the Radius of the region over which random samples will be
>>drawn for the next generation of the population.
>>
>>
>>In this case, small scaling-parameters will result in a large radius,
>>which gives the samples the opportunity to "walk far" from the
>>current position along that particular direction.
>>
>>
>>   e.g.: itkOnePlusOneEvolutionaryOptimizer.cxx, line 123
>>
>>
>>Note that the scaling is used to regulate the radius, so you will
>>get similar results if you use the following pairs of parameters
>>(a small setup sketch follows the list):
>>
>>
>> Radius 1000.0 with Scaling 1000.0
>> Radius 1.0 with Scaling 1.0
>> Radius 0.0001 with Scaling 0.0001
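>>
>>For instance, a minimal setup sketch (the numeric values are only
>>placeholders, and the transform is assumed to be a 6-parameter rigid one):
>>
>>  typedef itk::OnePlusOneEvolutionaryOptimizer     OptimizerType;
>>  typedef itk::Statistics::NormalVariateGenerator  GeneratorType;
>>
>>  OptimizerType::Pointer optimizer = OptimizerType::New();
>>  GeneratorType::Pointer generator = GeneratorType::New();
>>
>>  generator->Initialize( 12345 );                  // RNG seed
>>  optimizer->SetNormalVariateGenerator( generator );
>>  optimizer->Initialize( 1.0 );                    // initial search radius
>>  optimizer->SetEpsilon( 1e-6 );
>>  optimizer->SetMaximumIteration( 2000 );
>>
>>  OptimizerType::ScalesType scales( transform->GetNumberOfParameters() );
>>  scales.Fill( 1.0 );                              // rotations
>>  scales[3] = scales[4] = scales[5] = 1.0 / 1000.0;   // translations walk farther
>>  optimizer->SetScales( scales );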
>>
>>
>>A similar situation happens with the GradientDescent optimizers:
>>you could compensate for the scaling-parameters with changes in the
>>StepLength (in the regular-step variant) or with changes in the learning
>>rate (in the standard gradient descent).
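>>
>>For example, with the regular-step variant (the values are only
>>illustrative, and the step is taken in the scaled parameter space):
>>
>>  optimizer->SetMaximumStepLength( 4.00 );   // length of the first step
>>  optimizer->SetMinimumStepLength( 0.01 );   // stop when steps get this small
>>  optimizer->SetNumberOfIterations( 200 );
>>  optimizer->SetScales( scales );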
>>
>>
>>In any of these conditions it is important, as a sanity check,
>>to add Command/Observers to the optimizers and to monitor how they
>>evolve at every iteration.
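>>
>>A typical observer, along the lines of the ImageRegistration examples
>>(shown here for the RegularStep optimizer; adapt the optimizer typedef
>>to whichever optimizer you are using):
>>
>>  #include <iostream>
>>  #include "itkCommand.h"
>>  #include "itkRegularStepGradientDescentOptimizer.h"
>>
>>  class CommandIterationUpdate : public itk::Command
>>  {
>>  public:
>>    typedef CommandIterationUpdate   Self;
>>    typedef itk::Command             Superclass;
>>    typedef itk::SmartPointer<Self>  Pointer;
>>    itkNewMacro( Self );
>>
>>    typedef itk::RegularStepGradientDescentOptimizer OptimizerType;
>>    typedef const OptimizerType *                    OptimizerPointer;
>>
>>    void Execute( itk::Object * caller, const itk::EventObject & event )
>>      {
>>      Execute( (const itk::Object *) caller, event );
>>      }
>>
>>    void Execute( const itk::Object * object, const itk::EventObject & event )
>>      {
>>      OptimizerPointer optimizer = dynamic_cast< OptimizerPointer >( object );
>>      if( !itk::IterationEvent().CheckEvent( &event ) )
>>        {
>>        return;
>>        }
>>      // iteration number, metric value, current parameters
>>      std::cout << optimizer->GetCurrentIteration() << "  "
>>                << optimizer->GetValue() << "  "
>>                << optimizer->GetCurrentPosition() << std::endl;
>>      }
>>
>>  protected:
>>    CommandIterationUpdate() {}
>>  };
>>
>>  // attach it to the optimizer before starting the registration
>>  CommandIterationUpdate::Pointer observer = CommandIterationUpdate::New();
>>  optimizer->AddObserver( itk::IterationEvent(), observer );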
>>
>>
>>Please let us know if you find any suspicious behavior in the
>>optimizers.
>>
>>
>> Thanks
>>
>>
>>
>>
>> Luis
>>
>>
>>
>>=======================
>>SCHMID, Jerome wrote:
>>
>>>Hi,
>>>
>>>I understand the need for scaling the registration parameters
>>>of the transformation, but I have a doubt concerning the correct way
>>>of passing the arguments.
>>>
>>>As suggested by many examples and the wiki, one has to pass scale
>>>parameters that will be *multiplied* with the internal data in order to put
>>>all the parameters into the same dynamic range. E.g.:
>>>
>>>// Scale the translation components of the Transform in the Optimizer
>>>OptimizerType::ScalesType scales( transform->GetNumberOfParameters() );
>>>const double translationScale = 1000.0; // dynamic range of translations
>>>const double rotationScale = 1.0; // dynamic range of rotations
>>>scales[0] = 1.0 / rotationScale;
>>>scales[1] = 1.0 / rotationScale;
>>>scales[2] = 1.0 / rotationScale;
>>>scales[3] = 1.0 / translationScale;
>>>scales[4] = 1.0 / translationScale;
>>>scales[5] = 1.0 / translationScale;
>>>
>>>This is a typical example for 3D rigid registration.
>>>
>>>But if I look at, for instance, the OnePlusOne optimizer or the
>>>Powell one, the scales are used to *divide*, e.g. from the Powell code:
>>>
>>>for(unsigned int i=0; i<m_SpaceDimension; i++)
>>> {
>>> m_LineDirection[i] = m_LineDirection[i] / this->GetScales()[i];
>>> }
>>>
>>>A great Insight Journal paper on shape-to-image registration
>>>("Model-Image Registration of Parametric Shape Models: Fitting a Shell to
>>>the Cochlea"), based on the OnePlusOne optimizer, sets the scales
>>>to 1000.0 instead of 1/1000.0...
>>>
>>>Is this done on purpose, i.e. do these optimizers require such a choice,
>>>or is it simply a misunderstanding of how the scales must be used, i.e.
>>>divided or multiplied?
>>>
>>>Thanks.
>>>
>>>Best Regards,
>>>
>>>Jerome Schmid
>>>
>>>
>>>
>>>
>>
>>
>>