[Insight-users] Re: Is there a way to validate results of 3D
registrations?
Luis Ibanez
luis.ibanez at kitware.com
Tue Aug 15 12:38:00 EDT 2006
Hi Eve,
Thanks for posting the details of your registration.
1) You should avoid using the origin of an image as a
center of rotation. At least for medical images
(CT/MR) it is safe to assume that the object of
interest is in the middle of the image, and therefore
rotation should be performed around the center of the
image.
You will find examples in the ITK Software Guide
http://www.itk.org/ItkSoftwareGuide.pdf
on how to use the image center as center of rotation
for the transform.
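The geometric computation behind "center of the image" can be sketched numerically. Below is a minimal Python/numpy sketch (illustrative only; ITK itself is C++, and this ignores the image's direction cosines) of the physical-space center you would pass as the center of rotation:

```python
import numpy as np

def image_center(origin, spacing, size):
    """Physical-space center of an image: the origin plus half the
    physical extent along each axis ((size - 1) voxel steps)."""
    origin = np.asarray(origin, dtype=float)
    spacing = np.asarray(spacing, dtype=float)
    size = np.asarray(size, dtype=float)
    return origin + spacing * (size - 1.0) / 2.0

# e.g. a hypothetical 512 x 512 x 200 CT volume
center = image_center(origin=[0.0, 0.0, 0.0],
                      spacing=[0.5, 0.5, 1.0],
                      size=[512, 512, 200])
print(center)  # [127.75 127.75  99.5 ]
```

This is essentially what the geometry-based initialization does; an initialization based on the center of mass of the intensities is another common choice.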
2) When comparing multiple transforms, it is probably worth
being more pragmatic.
For example:
Q: What happens when a transform has a perturbation ?
A: Some points in the image will be mapped to wrong locations.
Q: How bad is it for each of those points to be at a
distance "d" from its ideal target ?
A: It depends on the application: whether you are planning
a surgery, delivering radiation treatment, or
computing statistics for diagnosis. I would suggest that
you collect a set of points from the image (as large as
you think is representative: 1,000? 100,000?) and
for each point compute how much positioning error results
from a wrong Transform. You can then add all those
errors squared and get an RMS value indicating the
quality of the registration. You could also report the
worst distance (e.g. a Hausdorff-like criterion) for
the entire set. Make sure that the points are actually
significant for your application (e.g. if you are
delivering radiation treatment for brain tumors, then
failing to register the lips is not quite as
significant as failing to register the ventricles).
Remember that most image registration metrics are
"blind" in the sense that they just know about pixels:
they treat a pixel on a tumor the same as a pixel in
the pillow supporting the head in the scanner. That means
that the metric invests as much effort in matching the
pixels in the pillow as it does in matching the pixels
in the tumor.
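The point-based evaluation described above can be sketched as follows. This is a Python/numpy illustration only; the transforms, sample points, and perturbation are all made up for the example:

```python
import numpy as np

def rotation_z(deg):
    """Rotation matrix about the z axis."""
    a = np.deg2rad(deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def apply(R, C, T, pts):
    """P' = R (P - C) + C + T applied to an (N, 3) array of points."""
    return (pts - C) @ R.T + C + T

rng = np.random.default_rng(0)
# sample points (in mm) drawn from the region that matters clinically
pts = rng.uniform(0.0, 100.0, size=(10_000, 3))

C = np.array([50.0, 50.0, 50.0])
# an "ideal" transform vs. a slightly perturbed one
ideal = apply(rotation_z(10.0), C, np.array([5.0, -3.0, 2.0]), pts)
found = apply(rotation_z(10.5), C, np.array([5.2, -3.1, 2.0]), pts)

d = np.linalg.norm(ideal - found, axis=1)  # per-point error in mm
rms = np.sqrt(np.mean(d ** 2))             # RMS error over the set
worst = d.max()                            # worst-case (Hausdorff-like)
print(f"RMS = {rms:.3f} mm, worst = {worst:.3f} mm")
```

The same two numbers can be computed for each of the ~200 registrations, turning the comparison into a batch computation instead of a visual inspection.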
3) Comparing transforms directly is not quite relevant, because
it doesn't tell you whether you failed to register the
anatomically relevant pieces of the image. Such a comparison
would only be useful for supporting pedantic claims about
the "precision" of the registration. In a real situation,
it doesn't really matter how numerically close the transforms
are. What matters is how much error is introduced by mapping
points of the anatomy to incorrect locations, and this is
dependent on the clinical application.
4) For details on the difference between Offset and Translation
please read the ITK Software Guide
http://www.itk.org/ItkSoftwareGuide.pdf
in particular the Transform section of the Image Registration
chapter.
An affine transform maps a point P to the point P'
by the following expression:
P' = R x ( P - C ) + C + T
where R is the rotation matrix, C is the center of rotation
and T is the Translation.
What we call the offset is the expression:
Offset = - R x C + C + T
so that
P' = R x P + Offset
Note that the center of rotation and the translation can be balanced
in many combinations and still produce the same Offset.
The reason we make the center of rotation explicit
is that it accelerates the convergence of the registration for
most typical medical images (e.g. CT / MR).
The inverse of a transform does not have to use the same
center of rotation. But... it is probably easier to understand
if they keep the same center of rotation.
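The equivalence between the centered form and the offset form is easy to check numerically. A small Python/numpy sketch (the random-rotation construction is just for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Build a random proper rotation matrix (orthonormal, det = +1) via QR.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
R = Q * np.sign(np.linalg.det(Q))

C = rng.uniform(-50.0, 50.0, 3)   # center of rotation
T = rng.uniform(-10.0, 10.0, 3)   # translation
P = rng.uniform(0.0, 100.0, 3)    # an arbitrary point

offset = -R @ C + C + T                # Offset = -R x C + C + T

p_center_form = R @ (P - C) + C + T    # P' = R x (P - C) + C + T
p_offset_form = R @ P + offset         # P' = R x P + Offset

print(np.allclose(p_center_form, p_offset_form))  # True
```

Expanding R x (P - C) + C + T and grouping the constant terms gives exactly R x P + Offset, which is why many (C, T) pairs can produce the same Offset.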
5) When validating segmentations and registrations it helps a lot
to think that what you are doing is processing an image for
a patient. An actual human that is going to receive medical
care based on the results of your code. With that context in
mind you will find that a pragmatic approach makes more sense
than the mathematical appearance of the evaluations that are so
popular as papers in our decadent medical image journals.
Regards,
Luis
-----------------
Eve Heyes wrote:
> Hello Luis,
>
> Thank you very much for your reply.
>
> I am using the MattesMutualInformationMetric and the VersorRigid3DOptimizer. I am indeed trying to quantify the registration's reproducibility ("limits of acceptable initial misalignments between the images" [1]) so I am concerned about the first question:
>
> 1) How much precision do you require for your clinical problem ?
> 1mm ?, 0.1 microns ? in translation; 0.1 degrees of rotation ?
>
>
> What I am trying to do is to evaluate how different parameters (# of bins, sample size, optimizer's step size, relaxation factor, etc) used affect the precision/ "accuracy" of the registration. To do so, I first applied a transform T to an ideally **registered** moving image to produce M' and ran registration on M' and F to produce the result T'. By calculating the "error" as sum(T+T'), I hope to quantify the accuracy of the method as was done in the following publications:
>
> http://jnm.snmjournals.org/cgi/reprint/44/7/1156 (Under the section "Evaluation of Algorithm") [1]
> http://www.springerlink.com/(wobrrr55ebynbfij0gcubifo)/app/home/contribution.asp?referrer=parent&backto=issue,10,24;journal,122,350;linkingpublicationresults,1:100414,1
>
> Below are 4 registration results performed on M' and F with varying # of bins:
>
>      Metric value   Offset (x,y,z)         Translation (x,y,z)    Vx    Vy    Vz
> 1)   -0.0552        9.19   -3.14  -12.53   -5.36  10.88  -11.09   0.01  0.01  0.04
> 2)   -0.0566        13.54  -6.89  -13.29   -4.76  11.39  -12.85   0.00  0.00  0.04
> 3)   -0.0530        5.99   -2.66  -10.17   -6.05  11.46  -12.52   0.01  0.01  0.03
> 4)   -0.0573        9.36   -4.03  -12.30   -5.16  11.48  -13.13   0.01  0.01  0.04
>
> (the transform T was applied with the center of rotation = origin of M; during registration between M' and F, the center of rotation = center of gravity of F)
>
> If I want to compare the numeric values as was done in Example #8, but I didn't use the centered initializer, should I be comparing the translation values rather than the offset values (in contrast to the example)? [The reason I'm comparing the numeric values is that I need to validate numerous registration results (~200), so it is not possible to visualize each result individually.]
>
> Another related question is: how can I relate a transform T' that has center of rotation = 0 and its inverse T" that has center of rotation = some point in space? That is, is there any relationship between the following transforms:
>
> 1/ FinalParameters:[0.00540255, 0.00608688, 0.0391753, -4.9541, 17.0141, -2.08383]
> Matrix
> 0.996856 -0.0782221 0.0125873
> 0.0783537 0.996872 -0.0103195
> -0.0117407 0.0112733 0.999868
>
> Offset =
> [11.5463, -0.110014, -1.69868]
>
> 2/ FinalParameters:[0.0143756, 0.00588364, 0.0542648, -4.32434, 19.0096, -1.87543]
>
> Matrix =
> 0.994041 -0.108187 0.0133087
> 0.108526 0.993697 -0.0280668
> -0.0101883 0.0293439 0.999517
>
> Offset =
> [19.5658, -2.01871, -5.92431]
>
> Finally, is my approach to defining precision incorrect? If so, could you please suggest a more suitable approach?
>
> I'm terribly sorry for the many questions asked. Thanks so much again for your help.
>
> Eve
>
>
> --- luis.ibanez at kitware.com wrote:
>
> From: Luis Ibanez <luis.ibanez at kitware.com>
> To: evelyn_heyes at homeworking.org
> CC: insight-users at public.kitware.com
> Subject: Re: Is there a way to validate results of 3D registrations?
> Date: Wed, 09 Aug 2006 19:37:45 -0400
>
> Hi Eve,
>
>
>
> 1) What image Metric are you using ?
>
> 2) What optimizer are you using ?
>
>
>
> It is normal for some metrics and some optimizers to report
> different registration results at different runs. This is
> mainly because part of their processing uses
> random number generators.
>
>
>
> This is the case for:
>
> a) MattesMutualInformationImageToImageMetric
> b) MutualInformationImageToImageMetric (Viola-Wells)
> c) OnePlusOneOptimizer
>
>
>
> The variations in the results are probably not related to the
> Transform but to the use of one of the metrics or optimizers
> listed above.
>
>
> Image Registration is a *"satisfaction"* kind of problem, meaning
> that there is *NO* correct answer for a registration problem. What
> you can hope to get, are answers that are *"Good Enough"* for solving
> the problem at hand at a particular cost (in computation time and
> time invested in tuning parameters).
>
>
> The idea of "Validating" a registration is unrealistic and
> impractical. It is only good for publishing papers in Decadent
> Reputation-Based Journals.
>
>
> In the real world, what makes sense is to define your requirements:
>
>
> 1) How much precision do you require for your clinical problem ?
> 1mm ?, 0.1 microns ? in translation; 0.1 degrees of rotation ?
>
> 2) How much time can you allow for your registration to compute ?
>
> 3) How much of your time can you allow for fine tuning the
> parameters of your registration ?
>
>
> By defining the answers to these questions you will be able to
> determine when a registration is "good enough" for your application.
>
>
> A visual comparison of the numerical values of the Rigid3DTransform
> is far from ideal for comparing the registration results. The best
> way is to resample the moving image using these transforms and to
> visualize the fixed image and resampled moving image together. You
> can do this by using VTK filters such as blending, or contouring,
> as well as by using the checker board 3D widget and the Rectilinear
> Wipe widget.
>
>
>
> Regards,
>
>
> Luis
>
>
> ----------------
> Eve Heyes wrote:
>
>>Hello ITK-users,
>>
>>I'm currently performing 3D registrations using the
>>Versor3DRigidTransform. The following results are generated from 5
>>separate registrations. While the output image of each registration
>>appears to be registered, the transforms used are very different. Can
>>anyone suggest the reason for this? Or, is there a more appropriate way
>>to validate registration results?
>>
>>Center of rotation used: [249.057, 227.131, 162.8]
>>
>>1/
>>Total iterations: 161
>>finalParameters:[0.00540255, 0.00608688, 0.0391753, -4.9541, 17.0141,
>>-2.08383]
>>Result =
>>Matrix =
>>0.996856 -0.0782221 0.0125873
>>0.0783537 0.996872 -0.0103195
>>-0.0117407 0.0112733 0.999868
>>Offset =
>>[11.5463, -0.110014, -1.69868]
>>
>>2/
>>finalParameters:[0.0143756, 0.00588364, 0.0542648, -4.32434, 19.0096,
>>-1.87543]
>>Result =
>>Matrix =
>>0.994041 -0.108187 0.0133087
>>0.108526 0.993697 -0.0280668
>>-0.0101883 0.0293439 0.999517
>>Offset =
>>[19.5658, -2.01871, -5.92431]
>>
>>3/
>>finalParameters:[0.00947486, 0.00226582, 0.0538679, -4.69318, 17.5486,
>>-2.53202]
>>Result =
>>Matrix =
>>0.994186 -0.107531 0.00554563
>>0.107617 0.994017 -0.0186772
>>-0.00350407 0.0191654 0.99981
>>Offset =
>>[20.2757, -4.85468, -5.98147]
>>
>>4/
>>Result =
>> versor X = 0.0173336
>> versor Y = 0.00266175
>> versor Z = 0.0355604
>> Translation X = -5.71732
>> Translation Y = 10.6958
>> Translation Z = -12.211
>> Iterations = 168
>> Metric value = -0.0675146
>>Matrix =
>>0.997457 -0.0709726 0.0065521
>>0.0711572 0.99687 -0.0344506
>>-0.00408654 0.0348292 0.999385
>>Offset =
>>[9.96952, -0.706962, -19.0038]
>>
>>5/
>>Result =
>> versor X = -0.0481644
>> versor Y = -0.00675409
>> versor Z = 0.00144871
>> Translation X = 4.41412
>> Translation Y = 10.6494
>> Translation Z = -33.1111
>> Iterations = 62
>> Metric value = -0.0728881
>>Matrix =
>>0.999905 -0.00224338 -0.0136317
>>0.0035446 0.995356 0.0961951
>>0.0133526 -0.0962342 0.995269
>>Offset =
>>[7.16668, -4.83927, -13.8086]
>>
>>
>>Thanks in advance for your help.
>>
>>Eve
>>
>>
>>------------------------------------------------------------------------
>>Visit http://www.homeworking.com ~ To help you work at home
>
>
>
>
>
>