[Insight-users] Coupling level set segmentation filters

Miller, James V (Research) millerjv at crd.ge.com
Fri May 13 08:31:49 EDT 2005


I am not sure I can answer the specific question.  I think I need a few 
more details.

The thing to remember with the pipeline is that requests propagate
from downstream to upstream and data flows from upstream to downstream.

So the pipeline places a request for a certain region size on the output
of a filter. The filter translates that request to its input in the 
method GenerateInputRequestedRegion. 

So in your code, where you set the output requested region to match
the input requested region, you are violating the pipeline mechanism.
The output requested region should already have been established by
the pipeline.
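
For reference, here is a minimal sketch of such an override (this is not
your code; MyFilter, a single input, and the usual Superclass typedef are
assumptions):

template <class TInputImage, class TOutputImage>
void
MyFilter<TInputImage, TOutputImage>
::GenerateInputRequestedRegion()
{
  // The superclass already copies the output requested region to each
  // input; a filter only overrides this when it needs something more.
  Superclass::GenerateInputRequestedRegion();

  typename TInputImage::Pointer input =
    const_cast<TInputImage *>(this->GetInput());
  if (input.IsNull())
    {
    return;
    }

  // The request travels downstream -> upstream: the input is asked for
  // (at least) the region needed to produce the output requested region.
  input->SetRequestedRegion(this->GetOutput()->GetRequestedRegion());
}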

You should be able to do what you want with the pipeline. The trick
is that the pipeline cannot have any active "loops".  Your two internal filters
each have an input which is the other's output.  At the time these filters
calculate, those inputs cannot actually be connected to the other's
output.

Like you said, a number of filters in ITK iterate internally using a 
DisconnectPipeline() approach: FiniteDifferenceImageFilter, 
VotingBinaryIterativeHoleFillingImageFilter, GrayscaleGeodesicDilateImageFilter, etc.
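
A rough sketch of that pattern for your two coupled filters might look
like the following (the m_Filter members and the assumption that the
coupling image is each filter's third input come from your description,
and the Halt() loop from my earlier example; treat this as an
illustration, not tested code):

typename OutputImageType::Pointer coupled[2];
while (!this->Halt())
  {
  for (unsigned int idx = 0; idx < 2; ++idx)
    {
    // Run one internal filter on its current inputs.
    m_Filter[idx]->Update();

    // Detach its output so it becomes a free-standing image instead of a
    // live pipeline connection -- this is what breaks the cycle.
    coupled[idx] = m_Filter[idx]->GetOutput();
    coupled[idx]->DisconnectPipeline();
    }

  // Cross-feed the detached results: each filter sees the other's output
  // from this iteration as an ordinary (third) input on its next Update().
  m_Filter[0]->SetInput(2, coupled[1]);
  m_Filter[1]->SetInput(2, coupled[0]);
  }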

Jim




-----Original Message-----
From: insight-users-bounces+millerjv=crd.ge.com at itk.org
[mailto:insight-users-bounces+millerjv=crd.ge.com at itk.org]On Behalf Of
Nils H. Busch
Sent: Thursday, May 12, 2005 8:52 AM
To: insight-users at itk.org
Subject: RE: [Insight-users] Coupling level set segmentation filters


Hi Jim,

thanks a lot for the help on connecting the filters.
There seems to be (at least) one problem left to get my filter to work.

Before I can run Update() on the component filters, each filter's output
(which serves as the input to the other filter) needs its region set and
its buffer allocated. Otherwise, I get segmentation faults. This makes some
sense to me and I verified this behaviour by writing a small test program
where each filter's output was the input of the respective other filter. To get
this to work, I had to set the outputs' regions and allocate the buffers
manually prior to running Update(). Therefore, I have added a
PreProcessOutput method which does the following:

// numOutputs matches this->GetNumberOfOutputs() for the composite filter.
typename OutputImageType::Pointer output[numOutputs];
for (unsigned int idx = 0; idx < this->GetNumberOfOutputs(); ++idx) {
  output[idx] = m_Filter[idx]->GetOutput();
  typename Superclass::InputImagePointer input =
    const_cast<TInputImage *>(this->GetInput(idx));
  if (input.IsNull()) {
    itkExceptionMacro(<< "Input to filter is NULL.");
  }
  // Size each internal filter's output like the corresponding input and
  // allocate its buffer before the internal Update() calls.
  typename TInputImage::RegionType region = input->GetRequestedRegion();
  output[idx]->SetRegions(region);
  output[idx]->Allocate();
}

The regions are set correctly now. However, in my finite difference
filter's difference function the other filter's output image still has a
zero buffered region, and my GenerateInputRequestedRegion raises an
exception for an illegal requested region.

Is my approach plausible at all, or does it conflict with the whole
PropagateRequestedRegion mechanism? Can and/or should setting the
inputs' regions and allocating the buffers be moved to
GenerateInputRequestedRegion or GenerateOutputRequestedRegion? I am still
not completely clear about when in the pipeline update process these are
called.

To recap, my composite filter contains two instances of subclasses of
finite difference image filters with three inputs each, of which the last
is the output of the other filter. My composite filter has four inputs and
two outputs. I have overridden GenerateInputRequestedRegion for the
composite filter. It calls the superclass implementation, setting regions
for all four inputs. Then, for just the first two inputs to the finite
difference image filters, I follow the approach of
FiniteDifferenceImageFilter::GenerateInputRequestedRegion and pad by the
finite difference function's radius and then crop the resulting region.
This gives an exception. I can understand why enlarging the region by the
radius gives a larger region than the largest possible region, but why
does this normally work for a finite difference filter then?
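
For comparison, the pad-and-crop idiom I am trying to follow looks
roughly like this (a sketch modelled on FiniteDifferenceImageFilter's
GenerateInputRequestedRegion; the variable names and the access through
m_Filter[idx] are only illustrative):

typename TInputImage::Pointer input =
  const_cast<TInputImage *>(this->GetInput(idx));
typename TInputImage::RegionType region = input->GetRequestedRegion();

// Pad the request by the difference function's stencil radius...
region.PadByRadius(m_Filter[idx]->GetDifferenceFunction()->GetRadius());

// ...then crop so the request does not exceed the largest possible region.
if (region.Crop(input->GetLargestPossibleRegion()))
  {
  input->SetRequestedRegion(region);
  }
else
  {
  // The padded request does not even overlap the largest possible region;
  // this is the case that raises the invalid requested region exception.
  input->SetRequestedRegion(region);
  itkExceptionMacro(<< "Requested region is outside the largest possible region.");
  }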

Any hints/suggestions what I am doing wrong or a relevant example ?

Thanks
>
> To create "cycles" in ITK, where an output becomes a filter's input, you
> need to jump through a few extra hoops.
>
>
> OutputImageType::Pointer iterationOutput;
> while (!this->Halt()) {
>   for(unsigned int idx = 0; idx < this->GetNumberOfOutputs(); ++idx) {
>     m_LevelSetSegmentationFilter[idx]->Update();
>
>     iterationOutput = m_LevelSetSegmentationFilter[idx]->GetOutput();
>     iterationOutput->DisconnectPipeline();
>     m_LevelSetSegmentationFilter[idx]->SetInput(iterationOutput);
>   }
> }
>
> The call to DisconnectPipeline() disconnects the image from the pipeline
> and forces the pipeline to generate a new valid output data object.
>
> Jim
>
>

-- 
  Nils H. Busch
_______________________________________________
Insight-users mailing list
Insight-users at itk.org
http://www.itk.org/mailman/listinfo/insight-users

