[Insight-users] Fitting a 2D mesh model to a 2D image?
Luis Ibanez
luis.ibanez@kitware.com
Wed May 5 03:37:51 EDT 2004
Hi Zach,
Your observation is correct: the BSpline transform doesn't have a way
of returning a 'mathematical' inverse. There are, however, several
methods for getting a 'reasonable' approximate inverse.
Probably the best method at this point is the filter contributed
by Corinne Mattmann: IterativeInverseDeformationField.
http://www.itk.org/Insight/Doxygen/html/classitk_1_1IterativeInverseDeformationFieldImageFilter.html
You can generate a deformation field (an image of vectors) by sampling
the BSpline transform, then pass this field to the IterativeInverse
filter in order to obtain the inverse field. This filter is
computationally demanding, but it is effective.
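The fixed-point idea behind that filter can be sketched in a few lines. Here is a toy 1D illustration (plain Python, not ITK; `forward` is a made-up smooth displacement standing in for the sampled BSpline field): the inverse displacement v at a point y satisfies v = -D(y + v), and iterating that relation converges when the field is a contraction.

```python
import math

def forward(x):
    # Made-up smooth forward displacement D(x); a sampled (and
    # interpolated) BSpline field would play this role in ITK.
    return 0.2 * math.sin(x)

def invert_displacement(y, n_iters=50):
    """Fixed-point iteration v <- -D(y + v): find the displacement v
    such that the point x = y + v is mapped back onto y by D."""
    v = 0.0
    for _ in range(n_iters):
        v = -forward(y + v)
    return v

y = 1.0
v = invert_displacement(y)
x = y + v
# x + D(x) should land back on y (here |D'| <= 0.2, so iteration converges)
assert abs(x + forward(x) - y) < 1e-9
```

The real filter performs the same iteration per voxel on a vector image, with interpolation between grid samples, which is where the computation time goes.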
An example of how to save the deformation field from a BSpline transform
can be found in the file:
Insight/Examples/Registration/DeformableRegistration4.cxx
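Sampling a transform into a deformation field amounts to evaluating it at every pixel and storing the mapped point minus the pixel position. A minimal sketch (plain Python; `warp` is a hypothetical stand-in for calling TransformPoint() on the BSpline transform):

```python
def warp(x, y):
    # Hypothetical smooth warp standing in for evaluating the
    # BSpline transform at a physical point in the ITK example.
    return (x + 0.1 * y, y + 0.05 * x)

width, height = 4, 3
field = []            # field[j][i] = displacement vector at pixel (i, j)
for j in range(height):
    row = []
    for i in range(width):
        tx, ty = warp(float(i), float(j))
        row.append((tx - i, ty - j))  # displacement = mapped - original
    field.append(row)

# e.g. pixel (i=2, j=1) is displaced by (0.1*1, 0.05*2) = (0.1, 0.1)
```

The resulting image of vectors is exactly what the IterativeInverse filter takes as input.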
Note that in process (C) you don't need to deal with the inverse.
The deformation field will allow you to map an arbitrary bacterium on
top of the canonical bacterium. It will be the equivalent of performing
atlas-based segmentation.
About your questions:
1) Yes, you were right: in your case the DeformableMesh would have
been used as a 2D contour, not taking into account the
information from inside the figure (the bacterium in this case).
In theory, yes, you can follow the movement of bacteria using a
deformable model. In practice you will find that the exercise of
finding stable parameters for this algorithm may be challenging.
It doesn't hurt to try... but don't expect an easy win here.
Do you need to do this in real time,
or can you simply feed video frames one after another?
2) Well,
How about the irony of using artificial bacteria in order
to track your real bacteria?
You may want to take a look at the
Morphogenesis application:
http://www.itk.org/HTML/Morphogenesis.htm
In this method, an aggregate of artificial cells is used
to perform segmentation. However, the cells could
also be programmed to perform tracking simply by making
them move to places where they will find pixel intensities
similar to those where they were before. In your case you could
generate a large number of small artificial cells (let's say
100 to 1000 cells) and spread them inside the region of your
real bacterium. This is easily done by placing an egg cell
and letting it reproduce as long as the intensity under the cells
is inside a range that you specify. The process looks like this
video:
http://www.itk.org/Art/MorphogenesisSegmentation.mpg
Once the cellular colony has stabilized inside the first
frame of your image sequence, you can replace that frame with
the second image of the sequence, and let the colony move in
order to fit again inside the new position of the real bacterium.
The cellular aggregate is held in an itk::Mesh, whose cell
pixel type is an artificial cell: itk::bio::Cell.
You will find the code of this example under:
InsightApplications/Morphogenesis
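In its simplest form, the egg-cell seeding step reduces to region growing: reproduce into neighboring pixels while the intensity stays inside the specified range. A toy sketch of just that step (plain Python with a made-up `grow_colony` helper; the actual application uses itk::bio::Cell agents with much richer behavior):

```python
from collections import deque

def grow_colony(image, seed, lo, hi):
    """Return the set of pixels colonized from `seed` by 4-connected
    reproduction, restricted to intensities in [lo, hi]."""
    h, w = len(image), len(image[0])
    colony = set()
    frontier = deque([seed])
    while frontier:
        r, c = frontier.popleft()
        if (r, c) in colony or not (0 <= r < h and 0 <= c < w):
            continue
        if not (lo <= image[r][c] <= hi):
            continue  # intensity outside the range: the cell does not survive
        colony.add((r, c))
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            frontier.append((r + dr, c + dc))
    return colony

# A small synthetic frame: a bright "bacterium" (200) on dark background (10)
frame = [
    [10, 10, 10, 10, 10],
    [10, 200, 200, 200, 10],
    [10, 200, 200, 10, 10],
    [10, 10, 10, 10, 10],
]
colony = grow_colony(frame, seed=(1, 1), lo=150, hi=255)
assert len(colony) == 5
```

In the tracking variant, the colony from frame N would serve as the seed set for frame N+1, and the cells would migrate toward pixels in the same intensity range.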
3) Yet another option is to use the paradigm of Deformable
Organisms, proposed by Hamarneh, McInerney and Terzopoulos
in MICCAI 2001:
http://mrl.nyu.edu/~dt/papers/miccai01/miccai01.pdf
Ahh, the beauty of PDF papers online... !!! :-)
This approach would be ideal for tracking the bacteria
in a video sequence. However, you would have to write
a significant amount of code in order to implement it...
This is copying biology in order to do computation
that can in turn help to understand biology.
Please let us know if you have further questions.
Thanks
Luis
---------------------
Zachary Pincus wrote:
> Luis,
>
> Thanks for your (no pun) insight into the problem of putting a
> coordinate system on bacteria. Your option (c) is most clever: since
> one eventual endpoint of my mesh plan was to use that mapping to
> "canonicalize" bacteria, skipping the middleman is a great idea. Out of
> curiosity, from perusing the API docs it doesn't look like the BSpline
> methods can return inverse transforms. (Presumably because the mapping
> is not bijective so there is no well-defined inverse.) Am I mistaken on
> this point?
>
> If I might ask a couple further questions, that I might try your
> patience further:
>
> (1) Was I correct in my assumption about why the DeformableMesh3D
> filter wouldn't work (that it takes an edge mesh, not a dense
> area/volume mesh)? I ask because a related problem that I'm working on
> involves tracking moving, wiggling bacteria. In this case, having a
> deformable edge mesh that can follow the bacteria would be optimal. Can
> the DeformableMesh3D methods handle this case? It seems very analogous
> to the 3D case...
>
> (2) Regarding your option (a), I was initially quite tempted by the
> spatial object registration methods. Nevertheless, going this route
> would still leave me with the need to implement/use some kind of
> image-based mesh deformation mechanism. Are there currently any good
> options here beside rolling my own FEM methods, as in your option (b)?
>
> Thanks again,
>
> Zach Pincus
>
> On May 4, 2004, at 8:14 AM, Luis Ibanez wrote:
>
>>
>> Hi Zach,
>>
>>
>> Here are several options:
>>
>> (Option (C) is probably the easiest to use at this point)
>>
>>
>>
>> A) You may want to use the SpatialObjects and their
>> registration support: SpatialObjectToImage registration.
>>
>> You will find the hierarchy of spatial objects under
>>
>> Insight/Code/SpatialObject
>>
>> There are a number of predefined SpatialObjects in this directory.
>> Given that you are targeting a very specific problem, you will
>> probably gain some advantage by creating your own variant of
>> SpatialObject.
>>
>> An example about this type of registration is presented in the
>> SoftwareGuide.
>>
>> This option is the most elegant but it will require some
>> programming effort on your part. :-)
>>
>>
>>
>> B) You could use the FEM registration mechanism, but not through
>> the FEM registration filter. The reason is that this filter
>> takes control of most of the process. In your case you probably
>> want to create your own mesh, and control the iterations of
>> the registration. Note that the FEM framework *does not* use the
>> itkMesh. There is a native FEM mesh that you should use in order
>> to create your object representation.
>>
>> This is a very formal option, but it will require a lot of
>> programming effort on your part, plus some familiarization
>> with the intricacies of FEM solvers.
>>
>>
>>
>> C) You could create a "canonical" bacterial image and use the
>> BSpline transform for doing deformable registration against
>> your actual image. This is probably the most straightforward
>> option at this point. You could "fabricate" the canonical
>> bacterial image by straightening up the image of a real bacterium.
>> This can be done with the example:
>>
>> Insight/Examples/Registration/LandmarkWarping2.cxx
>>
>> You will place source landmarks on the original bacterial image,
>> and the target landmarks will be in a rectangular grid. Pretty
>> much like the one you attached to your email.
>>
>> Once you get the canonical image, you simply use the Deformable
>> registration mechanisms illustrated in
>>
>> Insight/Examples/Registration/DeformableRegistration4.cxx
>>
>> This is the basic ITK registration framework with a BSpline
>> deformable transform. You use the canonical bacterial image as the
>> fixed image, and the real image as the moving image. Note that
>> the BSplineTransform accepts a generic transform in order to
>> allow composition. You can therefore use a Rigid2D transform
>> to take care of global translation and rotation, while
>> leaving only the deformations to the BSpline transform itself.
>>
>>
>> With this approach you will map all the bacteria into the
>> reference system of the canonical bacterium and will probably
>> be able to analyze the differences between specific patterns
>> of gene expression.
>>
>> A great advantage here is that you will only have to deal
>> with the region of pixels that covers your canonical bacterium.
>> Computation time will be well spent in that case.
>>
>>
>>
>> In General:
>>
>> You may have to deal with the axial polarity of the cells,
>> and with the fact that due to the microscopy projection, you
>> lose any 3D depth information as well as rotations of the
>> bacterium about its axis, parallel to the microscope plate.
>> But those issues affect any of the image processing methods
>> that you may envisage....
>>
>>
>>
>> Please let us know if you have further questions.
>>
>>
>> Thanks
>>
>>
>>
>> Luis
>>
>>
>> ---------------------
>> Zachary Pincus wrote:
>>
>>> Hello,
>>> I've run across the need to place some sort of coordinate system on
>>> images of bent bacteria for the purposes of making measurements of
>>> protein localizations that can be compared across populations.
>>> My first-pass idea was to make a 2D grid from an ITK mesh, and then
>>> use some of the FEM model-based segmentation/registration methods to
>>> fit the grid to the bacterial image. (See attached image for a vague
>>> idea of what I'm talking about.)
>>> However, from reading the documentation for the DeformableMesh3D
>>> filter, I'm not sure if it will work off the shelf. It seems (and I
>>> am likely to be wrong) that this filter is more designed to work
>>> with the output of something like a marching cubes algorithm (or in
>>> my case, marching squares) that defines only the edge of the
>>> structure (that is, some manifold surface embedded in a
>>> higher-dimension space, like a 2D surface of a 3D object, or in my
>>> case, a 1D perimeter of a 2D object). Do the DeformableMesh3D
>>> methods work with dense meshes of the sort I'm proposing in the
>>> image below?
>>> It almost seems like I need some hybrid between the FEM deformable
>>> image registration methods and the model based segmentation methods.
>>> If the DeformableMesh3D methods won't work, are there other things
>>> that I could try off-the-shelf from ITK? If not, does anyone have
>>> any suggestions as to which classes I might try to build off of?
>>> Thanks for any input at all,
>>> Zach Pincus
>>> Department of Biochemistry and Program in Biomedical Informatics
>>> Stanford University School of Medicine
>>> Attached: Figure 0, in which my imagined inputs and outputs are
>>> illustrated in a most mediocre manner.
>>
>>
>>
>>
>> _______________________________________________
>> Insight-users mailing list
>> Insight-users@itk.org
>> http://www.itk.org/mailman/listinfo/insight-users
>>
>