Proposals:IteratorTraits


Iterator Traits

There was a question on the ITK user list relating to a sparse image representation. The user wanted to create a sparse image representation and pass this image representation to existing ITK filters. This does not currently work with ITK because some early design decisions, made for performance and flexibility, preclude it.

Image Iterators

ITK has a rich set of iterators for an image: RegionIterators, ReverseRegionIterators, FloodFillIterators, NeighborhoodIterators, ReflectiveIterators, etc. The iterators are not "part" of the Image class but exist outside of it. This allows great flexibility and skirts some compiler issues (related to having all these iterators in a single class). Note that N-dimensional iteration over an image is much more complicated than iterating over an STL container. Each of the ITK iterators has to maintain quite a bit of state and cached data.
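
For illustration, a typical traversal with one of these external iterators (standard ITK usage; the function name Scale is just for the example):

 #include "itkImage.h"
 #include "itkImageRegionIterator.h"
 
 typedef itk::Image<float, 3> ImageType;
 
 // The iterator is constructed on an image and a region; it is not a
 // member or nested type of the Image class itself.
 void Scale(ImageType * image)
 {
   itk::ImageRegionIterator<ImageType> it(image, image->GetBufferedRegion());
   for (it.GoToBegin(); !it.IsAtEnd(); ++it)
   {
     it.Set(it.Get() * 2.0f);
   }
 }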

Iterator use in filters

The impact of separating the iterators from the Image container classes is that each FILTER decides for itself what type of iterator it wants to use. For instance, the DiscreteGaussianImageFilter uses a standard NeighborhoodIterator, the ConnectedComponentImageFilter uses a ShapedNeighborhoodIterator, etc. If you passed a SparseImage to an ITK filter, the iterators used by that filter would have to support accessing a SparseImage.
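
Schematically, the choice is hard-coded in the filter body. A rough sketch of the pattern (not actual ITK filter source; ProcessNeighborhoods is a stand-in for a filter's GenerateData()):

 #include "itkConstNeighborhoodIterator.h"
 
 // Sketch only: the iterator type is chosen by the filter author,
 // not by the image being processed.  If TImage were a SparseImage,
 // this ConstNeighborhoodIterator would have to know how to walk it.
 template <class TImage>
 void ProcessNeighborhoods(const TImage * input)
 {
   typedef itk::ConstNeighborhoodIterator<TImage> NeighborhoodIteratorType;
 
   typename NeighborhoodIteratorType::RadiusType radius;
   radius.Fill(1);
 
   NeighborhoodIteratorType nit(radius, input, input->GetBufferedRegion());
   for (nit.GoToBegin(); !nit.IsAtEnd(); ++nit)
   {
     // ... examine nit.GetPixel(i) for each neighbor ...
   }
 }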

Performance

For performance reasons, most of the image iterators use pointer arithmetic to navigate the N-dimensional image. Pointer arithmetic is also used to convert an iterator to an index position.

ImageRegionIterator caches a pointer to the "start" of the image. It then keeps track of a 1D offset for the current pixel position. Pixels are accessed as *(start+offset). The 1D offset is only converted to an N-D itk::Index when the user specifically asks for an index.
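
Schematically (this is just the arithmetic involved, not ITK's actual implementation):

 #include <cstddef>
 
 // Pixel access itself is just:  PixelType value = *(start + offset);
 // Expanding the 1D offset into an N-D index costs a division and a
 // modulo per dimension (x-fastest, contiguous buffer assumed), which
 // is why it is only done when an index is actually requested.
 template <unsigned int VDimension>
 void OffsetToIndex(std::size_t offset,
                    const std::size_t size[VDimension],
                    long index[VDimension])
 {
   for (unsigned int d = 0; d < VDimension; ++d)
   {
     index[d] = static_cast<long>(offset % size[d]);
     offset /= size[d];
   }
 }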

ImageRegionIteratorWithIndex always maintains a valid itk::Index internally. This index is updated with every call to operator++. Internally, a pointer to the current pixel being examined is also kept.

NeighborhoodIterators maintain a collection of pointers to pixels within an image. As the iterator moves, all of these pointers are incremented.

Proposals

A) Remove pointer math from iterators

A few years ago, I realized that if we had made a slight change to the design (and not exposed a pointer to the image buffer), all the iterators could have been written in terms of a 1D offset. The iterators would access a pixel through an interface on the PixelContainer using this 1D offset. This would allow different image organizations to use the same set of iterators. ITK images are currently contiguous blocks of memory. If we designed the iterators to use 1D offsets, we could preserve the illusion of contiguous memory while supporting SparseImages, images that are only slice contiguous, etc.

Option A is a very general solution, but it is very invasive, and we would have to determine whether there are any performance issues due to using one extra function call to access a pixel in place of a direct memory access.
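
A minimal sketch of what such an iterator might look like. Neither OffsetBasedRegionIterator nor GetPixelByOffset() exists in ITK; they are placeholders for the interface Option A would require:

 // Hypothetical sketch of Option A: the iterator never sees a raw buffer
 // pointer; it asks the pixel container for the pixel at a 1D offset.
 template <class TImage>
 class OffsetBasedRegionIterator
 {
 public:
   typedef typename TImage::PixelType PixelType;
 
   OffsetBasedRegionIterator(TImage * image, unsigned long numberOfPixels)
     : m_Image(image), m_Offset(0), m_End(numberOfPixels) {}
 
   bool IsAtEnd() const { return m_Offset >= m_End; }
   OffsetBasedRegionIterator & operator++() { ++m_Offset; return *this; }
 
   PixelType Get() const
   {
     // A dense image would return *(buffer + offset); a SparseImage could
     // look the offset up in a map and return a background value elsewhere.
     // This extra call is the potential performance cost mentioned above.
     return m_Image->GetPixelContainer()->GetPixelByOffset(m_Offset);
   }
 
 private:
   TImage *      m_Image;
   unsigned long m_Offset;
   unsigned long m_End;
 };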

B) Separate iterators and filters for SparseImage

Create separate iterators for a SparseImage and separate filters to process a SparseImage. For instance, DiscreteGaussianSparseImageFilter, etc. This requires duplicating filter code.

Option B is probably the better short-term option if there are only a few filters that need to run on a SparseImage. If we need the full breadth of ITK filters to work with SparseImages, then something like Option A would have to be explored.

C) Iterator traits (or ImageTraits)

Create a mechanism for a filter to request a "class" of iterator from some trait class. IteratorTraits<ImageType::Layout>::ImageRegionIterator would be a typedef for an image region iterator suitable for the particular type of image (dense or sparse). IteratorTraits<ImageType::Layout>::ShapedNeighborhoodIterator would be a typedef for a shaped neighborhood iterator for the particular type of image. Etc.

To avoid having to specialize IteratorTraits for every type of Image class AND every instantiation (2D, 3D, 4D, short, float, Vector<float>, etc.), the trait should be keyed on some nested type of the image, such as a layout tag. The trait would be used as

 IteratorTraits<ImageType::Layout>::ImageRegionIterator it;

The trait class would look something like

 template <class T>
 class IteratorTraits
 {
 public:
 };
 
 template <>
 class IteratorTraits<ImageBase::DenseLayout>
 {
 public:
    typedef ImageRegionIterator<ImageType> ImageRegionIterator;
    typedef ImageRegionIteratorWithIndex<ImageType> ImageRegionIteratorWithIndex;
 };
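
As written, the specialization above does not say where ImageType comes from. One possible arrangement, a sketch only and not part of the proposal as written, is to parameterize the trait on the image type and dispatch on a nested layout tag; DenseLayoutTag, SparseLayoutTag, and the nested LayoutType are hypothetical names that do not exist in ITK today:

 #include "itkImageRegionIterator.h"
 #include "itkImageRegionIteratorWithIndex.h"
 
 // Hypothetical layout tags that each image class would expose as a
 // nested LayoutType; they are placeholders, not existing ITK types.
 struct DenseLayoutTag {};
 struct SparseLayoutTag {};
 
 // Primary template: no typedefs, so an unsupported layout fails to compile.
 template <class TImage, class TLayout = typename TImage::LayoutType>
 class IteratorTraits
 {
 };
 
 // Partial specialization for images with a dense (contiguous) layout.
 template <class TImage>
 class IteratorTraits<TImage, DenseLayoutTag>
 {
 public:
    typedef itk::ImageRegionIterator<TImage>          ImageRegionIterator;
    typedef itk::ImageRegionIteratorWithIndex<TImage> ImageRegionIteratorWithIndex;
 };
 
 // A specialization on SparseLayoutTag would typedef sparse-aware iterators.

With this parameterization a filter would write IteratorTraits<ImageType>::ImageRegionIterator; the IteratorTraits<ImageType::Layout> form shown earlier could be kept instead by passing the layout tag and the image type separately.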

This approach would require a second set of iterator classes that operate on SparseImages. However, separate filters for dense and sparse images would NOT be required. Instead, we would just have to edit EVERY filter to use IteratorTraits<> to create its iterators instead of creating them directly. We'd have to retrain the developers, but this is doable. We could even write a script to change the existing code.
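
Schematically, the per-filter edit would look like the following (a sketch using the hypothetical parameterization above; actual declarations vary from filter to filter):

 // Before: the filter hard-codes a dense-image iterator.
 itk::ImageRegionIterator<TOutputImage> it(output, outputRegion);
 
 // After: the filter asks the trait for an iterator suited to the image's layout.
 typename IteratorTraits<TOutputImage>::ImageRegionIterator it(output, outputRegion);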