[Insight-users] problem with MI registration

Luis Ibanez luis.ibanez at kitware.com
Fri Nov 26 19:25:33 EST 2004


Hi P.J.H. de Koning,

The random iterator used in the MutualInformation metric
is internally using a *Uniform* distribution.


You will find this iterator in

    Insight/Code/Common/
       itkImageRandomConstIteratorWithIndex.h
       itkImageRandomConstIteratorWithIndex.txx


and the implementation of the uniform statistical
distribution is the one provided by VXL/VNL in

              vnl_sample_uniform
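To illustrate what "uniform" means here: each sample the metric uses
is a pixel position drawn uniformly over the image region. Below is a
minimal standard-library sketch of that sampling scheme, for
illustration only (std::uniform_int_distribution stands in for
vnl_sample_uniform; this is not the ITK implementation):

```cpp
#include <array>
#include <random>

// Draw one pixel position uniformly over a sizeX-by-sizeY region,
// i.e. every pixel is equally likely to be selected -- the same
// scheme the random iterator uses via vnl_sample_uniform.
std::array<long, 2> random_index(std::mt19937 & gen,
                                 long sizeX, long sizeY)
{
  std::uniform_int_distribution<long> dx(0, sizeX - 1);
  std::uniform_int_distribution<long> dy(0, sizeY - 1);
  return { dx(gen), dy(gen) };
}
```

Because the positions are redrawn on every metric evaluation, two runs
with different generator states sample different pixel sets, which is
why the metric values differ between runs.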


For examples of how to reinitialize the seed of
the random number generator, please look at *any*
of the following examples:

       Insight/Examples/Registration/
              RegistrationExamples2.cxx
              RegistrationExamples3.cxx
              RegistrationExamples4.cxx
              RegistrationExamples5.cxx
              RegistrationExamples6.cxx
              RegistrationExamples7.cxx
              RegistrationExamples8.cxx
              DeformableRegistration8.cxx


You will find in all of them a line similar to
the following:

          vnl_sample_reseed(8775070);
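The reason a fixed seed reproduces a run is that the vnl_sample_*
functions share one global generator state. The sketch below
illustrates that principle with the C++ standard library only
(std::mt19937 stands in for the VXL generator; this is an
illustration, not ITK code):

```cpp
#include <random>
#include <vector>

// Two runs seeded identically draw the identical sample sequence --
// the same reason calling vnl_sample_reseed() before registration
// makes a Mutual Information run reproducible.
std::vector<double> sample_run(unsigned seed, int n)
{
  std::mt19937 gen(seed);  // fixed seed, analogous to vnl_sample_reseed(8775070)
  std::uniform_real_distribution<double> u(0.0, 1.0);
  std::vector<double> samples;
  samples.reserve(n);
  for (int i = 0; i < n; ++i)
    samples.push_back(u(gen));
  return samples;
}
```

Without an explicit seed each run starts from a different generator
state, which is why every Mutual Information plot differs between runs
even with identical input parameters.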



For details on the Random iterator please read
the Chapter on Image Iterators by Josh Cates,
in the ITK Software Guide.

       http://www.itk.org/ItkSoftwareGuide.pdf

In particular, you should read Section 11.3.5,
starting on PDF page 497.



   Regards,


     Luis




-------------------------------
Koning, P.J.H. de (LKEB) wrote:

> On Wed, 10 Nov 2004 18:09:18 -0500, Luis Ibanez 
> <luis.ibanez at kitware.com> wrote:
> 
> You mentioned that you need to initialize the seed of the random number 
> generator in order to reproduce a run. How do I do that? What kind of 
> random number generator is used?
> 
>>
>> Hi Yannick,
>>
>> Mutual Information is *very* noisy. The fact that you are seeing a
>> Metric plot that is *not monotonically decreasing* doesn't necessarily
>> mean that the optimizer is not *trying* to minimize the cost function.
>>
>> Note also that every run of the Mutual Information metric will give
>> you different values because the points used for computing the metrics
>> are randomly selected. If you want to reproduce a run you have to make
>> sure that you initialize the seed of the random number generator.
>> Otherwise, every run will give you a different metric plot, even if you
>> set the exact same input parameters.
>>
>> One way of getting around this is to use Evolutionary Optimizers that
>> are better suited for noisy cost functions. You will find a use of the
>> OnePlusOneEvolutionary optimizer along with Mutual Information in the
>> files
>>
>>         Insight/Examples/Registration/
>>                     ImageRegistration11.cxx
>>                     ImageRegistration14.cxx
>>
>>
>> Regards,
>>
>>
>>     Luis
>>
>>
>> ----------------------
>> Yannick Allard wrote:
>>
>>> Hi Luis,
>>>
>>> It is just that the optimizer was not "maximizing" the MI when I 
>>> called the
>>> MaximizeOn() function of the optimizer, but it did maximize it when I 
>>> called
>>> MinimizeOff(). I looked at the source code and MinimizeOff() calls
>>> MaximizeOn()... I'm currently playing with the learning rate of the 
>>> optimizer
>>> to see if that is the actual problem... maybe it was set a bit too 
>>> high and
>>> therefore the registration was not converging properly. In fact I'm 
>>> still
>>> observing some strange jumps of the MI during optimization...
>>>
>>> #iter  MI
>>>
>>> 131   0.166983
>>> 132   0.174018
>>> 133   0.174946
>>> 134   0.183752
>>> 135   0.175778
>>> 136   0.178464
>>> 137   0.133462
>>> 138   0.18057
>>> 139   0.207989
>>> 140   0.206494
>>> 141   0.20218
>>> 142   0.184866
>>> 143   0.161572
>>> 144   0.182577
>>> 145   0.195277
>>>
>>>
>>> I'll let you know.
>>>
>>> Thank you
>>>
>>> Yannick
>>>
>>> Selon Luis Ibanez <luis.ibanez at kitware.com>:
>>>
>>>
>>>> Hi Yannick
>>>>
>>>> Thanks for letting us know that you solved the problem.
>>>>
>>>>
>>>> Just to double-check:
>>>>
>>>> What you found is that calling
>>>>
>>>>               optimizer->MaximizeOn();
>>>>
>>>> is not equivalent to calling
>>>>
>>>>              optimizer->MinimizeOff();    ??
>>>>
>>>>
>>>>
>>>> If that's the case, that sounds like a bug in the optimizer...
>>>>
>>>>
>>>> Please let us know so we can trace this in case it is a real bug.
>>>>
>>>>
>>>>
>>>>    Thanks
>>>>
>>>>
>>>>       Luis
>>>>
>>>>





