[Insight-users] Registering US -> CT
N.E. Mackenzie Mackay
9nem at qlink.queensu.ca
Fri Sep 24 12:22:38 EDT 2004
Hi Luis,
I think I will try that.
I was wondering how you would go about making that 3D image with the
middle slice (at z=0, with its normal along the z axis) being the
ultrasound image.
Thanks,
Neilson
On Sep 19, 2004, at 7:51 PM, Luis Ibanez wrote:
>
> Hi Neilson,
>
> No,
> if you read a 2D image and apply a 3D blurring filter, what you
> will get is an Exception. A 2D image is considered to be a
> degenerate 3D image, and is therefore not suitable as the input
> of a 3D filter.
>
> What you can do, however, is read the 2D image, declare a 3D
> image with a thickness of 5 slices (or 2N+1 in general), and copy
> the slice that you read into the middle slice of that 2N+1
> image. Then, apply the blurring filter.
>
> You can set the pixel spacing Dz in this new 3D image in such
> a way that Dz * (2N+1) equals the physical thickness of the
> sensitive plane of your ultrasound acquisition probe.
>
> In that way this new 3D image will be a better representation
> of the physical reality that your 2D ultrasound image captured.
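The slice-embedding step described above can be sketched in plain C++ (a std-only sketch; the function names are illustrative, and a real ITK program would use itk::Image with SetSpacing() and image iterators instead):

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Hypothetical sketch: embed a single 2D slice (rows x cols) into the
// middle slice of a (2N+1)-slice 3D volume, leaving all other voxels zero.
std::vector<float> embedSlice(const std::vector<float>& slice,
                              int rows, int cols, int N) {
    const int slices = 2 * N + 1;
    const std::size_t sliceSize = static_cast<std::size_t>(rows) * cols;
    std::vector<float> volume(slices * sliceSize, 0.0f);
    // Slice index N is the middle of the 2N+1 stack.
    std::copy(slice.begin(), slice.end(), volume.begin() + N * sliceSize);
    return volume;
}

// Choose the z spacing so that Dz * (2N+1) equals the physical
// thickness of the ultrasound probe's sensitive plane.
double sliceSpacing(double physicalThickness, int N) {
    return physicalThickness / (2 * N + 1);
}
```

For example, a 5 mm sensitive plane represented by 5 slices (N=2) gives Dz = 1 mm per slice.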
>
>
> Regards,
>
>
> Luis
>
>
> ---------------------------------
> N.E. Mackenzie Mackay wrote:
>
>> That sounds like a good idea.
>> If I read in the 2D image as 3D and applied a blurring filter, would
>> that cause a 3D blurring (blur the pixels outside of the plane)?
>> If so, I could set the "thickness" by setting the radius of the
>> blurring mask.
>> Just a thought.
>> Neilson
>> On Sep 16, 2004, at 6:47 PM, Luis Ibanez wrote:
>>>
>>> Hi Neilson,
>>>
>>> That's a good point. You are right:
>>> in this case, since you have scattered slices
>>> from ultrasound, it is not possible to make sure
>>> that every point will be inside an image.
>>>
>>> One option could be to associate a "thickness"
>>> with every US slice; you can probably figure out
>>> one that makes sense from the point of view of
>>> the physical acquisition process.
>>>
>>> That thickness could be used for defining
>>> regions of space where a point will be considered
>>> to be "inside" one of the US images.
>>>
>>> The more US images you have, the better the chances
>>> that this could lead to a reasonable
>>> registration.
>>>
>>> Note that this requires you to make more modifications
>>> to the ImageMetric class.
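The "thickness" idea amounts to a point-in-slab test against each slice plane. A minimal std-only sketch, with illustrative names (in ITK this logic would live inside a modified image metric):

```cpp
#include <array>
#include <cmath>

// Hypothetical sketch: treat a US slice as a slab of the given
// thickness around its plane. A physical point counts as "inside"
// the slice if its distance to the plane is at most thickness/2.
// 'origin' is any point on the slice plane; 'normal' is its unit normal.
bool insideSlice(const std::array<double, 3>& point,
                 const std::array<double, 3>& origin,
                 const std::array<double, 3>& normal,
                 double thickness) {
    double d = 0.0;
    for (int i = 0; i < 3; ++i)
        d += (point[i] - origin[i]) * normal[i];  // signed distance to plane
    return std::fabs(d) <= thickness / 2.0;
}
```

A metric modified along these lines would only accumulate contributions from sample points for which insideSlice() is true for at least one US slice.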
>>>
>>>
>>>
>>>
>>> Regards,
>>>
>>>
>>> Luis
>>>
>>>
>>>
>>> --------------------------------
>>> N.E. Mackenzie Mackay wrote:
>>>
>>>> I was thinking the same thing.
>>>> The only thing I was worried about is using the method in 3D. If
>>>> some of the points don't map onto the US image, will the
>>>> registration method ignore those points or will it throw an error?
>>>> On Sep 14, 2004, at 10:33 PM, Luis Ibanez wrote:
>>>>
>>>>>
>>>>> Hi Neilson,
>>>>>
>>>>> MutualInformation is mostly a region-based image metric.
>>>>> This means that its value gets better when the overlap
>>>>> between matching regions of the images is large. Mutual
>>>>> Information is not particularly well suited for matching
>>>>> thin structures, since its random sampling is unlikely
>>>>> to select many pixels belonging to those structures.
>>>>>
>>>>> In that sense you probably shouldn't expect much from Mutual
>>>>> Information for registering bone, since bone structures are
>>>>> mostly shell-like and don't fill large regions of space.
>>>>> E.g. large bones have layers of highly calcified cortical
>>>>> bone, but their width usually covers just a couple of
>>>>> pixels in a CT scan.
>>>>>
>>>>> Given that you seem to have segmented the bone from the CT scan,
>>>>> it is probably worth trying a Model-to-Image registration approach.
>>>>> This can be done by taking points on the surface of your bone
>>>>> segmentation, and/or from a band around that surface, and using
>>>>> them to match the intensities (and structure) of the same bone as
>>>>> seen in the ultrasound images.
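One way to picture such a PointSet-to-Image metric: transform the model's surface points into the image and accumulate the image intensity at each mapped point. A toy std-only 2D sketch, with illustrative names and a nearest-neighbour lookup standing in for ITK's PointSet metrics and interpolators:

```cpp
#include <cmath>
#include <utility>
#include <vector>

// Hypothetical sketch of a point-set-to-image metric: sample the
// moving image at each translated model point and sum the intensities.
// Higher totals mean the bone-surface points land on bright
// (bone-like) pixels. Points that map outside the image are skipped,
// mirroring the "ignore out-of-bounds points" idea from this thread.
double pointSetMetric(const std::vector<std::pair<double, double>>& points,
                      const std::vector<std::vector<float>>& image,
                      double tx, double ty) {
    double total = 0.0;
    for (const auto& p : points) {
        int x = static_cast<int>(std::lround(p.first + tx));
        int y = static_cast<int>(std::lround(p.second + ty));
        if (y < 0 || y >= static_cast<int>(image.size())) continue;
        if (x < 0 || x >= static_cast<int>(image[y].size())) continue;
        total += image[y][x];  // nearest-neighbour intensity lookup
    }
    return total;
}
```

An optimizer would then search over the transform parameters (here just tx, ty) for the pose that maximizes this total.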
>>>>>
>>>>> Could you post a couple of the US images?
>>>>>
>>>>> (e.g. you could put them in www.mypacs.net and let us know their
>>>>> image ID).
>>>>>
>>>>>
>>>>> Depending on how the bone structures look in the US images,
>>>>> there may be different metrics worth trying in a PointSet-to-Image
>>>>> registration.
>>>>>
>>>>>
>>>>> BTW, when you start working in 3D, don't attempt to use rigid
>>>>> transforms until you have managed to tune all the other parameters
>>>>> of the registration to work with simple translation transforms.
>>>>> It is more effective to deal with a single issue at a time.
>>>>>
>>>>>
>>>>>
>>>>> Regards,
>>>>>
>>>>>
>>>>> Luis
>>>>>
>>>>>
>>>>>
>>>>> -------------------------------
>>>>> N.E. Mackenzie Mackay wrote:
>>>>>
>>>>>> Hi,
>>>>>> I have been trying for a while to get a single ultrasound
>>>>>> image to register to a CT volume; specifically, to get the
>>>>>> bone of an ultrasound image and the bone of the CT to register
>>>>>> together. So far I have had quite some trouble.
>>>>>> I have been able to segment the bone from the CT and give an
>>>>>> estimate of where the bone is in the ultrasound. I am now trying
>>>>>> to register those two images.
>>>>>> This is what I am using:
>>>>>> MattesMutualInformationImageToImageMetric - chosen because the
>>>>>> registration involves two different modalities; I couldn't use
>>>>>> feature registration because it is too hard to segment the
>>>>>> ultrasound correctly.
>>>>>> - using 50 bins and 20%-100% of the samples still doesn't give
>>>>>> adequate results.
>>>>>> LinearInterpolateImageFunction
>>>>>> RegularStepGradientDescentOptimizer
>>>>>> Euler3DTransform
>>>>>> Both images ( 3D CT and US ) are normalized before the
>>>>>> registration.
>>>>>> I have used a maximum step ranging from 0.5-6, and a minimum step
>>>>>> of 0.005-1.
>>>>>> I have a good initial guess (at most 2 cm off, with about 0-30
>>>>>> degrees of rotation). I tested out the registration method in 2D
>>>>>> and have had success. When I use the exact same variables in 3D,
>>>>>> the registration is poor.
>>>>>> Does anyone have any suggestions? I would be happy to provide a
>>>>>> couple of images or actual code to show you what I am dealing
>>>>>> with.
>>>>>> Neilson
>>>>>> _______________________________________________
>>>>>> Insight-users mailing list
>>>>>> Insight-users at itk.org
>>>>>> http://www.itk.org/mailman/listinfo/insight-users
>>>>>
>>>>>
>
More information about the Insight-users mailing list