[Insight-users] Lack of memory for segmentation

Atwood, Robert C r.atwood at imperial.ac.uk
Fri Nov 18 08:49:14 EST 2005


 I have experienced similar problems, in fact I was thinking of asking
for some more help on the list...

First, some basic causes need to be ruled out.

What data type are you using to store the image? A 600 x 600 x 600
volume of 4-byte floats is 600 x 600 x 600 x 4 bytes = 864,000,000
bytes, so roughly 864 MB for a single copy.

Are you on a Windows or *nix system? I find Windows doesn't like big
programs: by default it uses a 2 GB/2 GB split of the 32-bit address
space, reserving 2 GB for the kernel and giving your program at most
2 GB, even though you have 4 GB available.

Is it a 32-bit processor? Then a single process cannot (easily) address
more than 4 GB in total (data, code, and stacks), so the amount
available for image data will be less than 4 GB.

Filtering a volume allocates a new buffer about the same size as the
original, so as soon as you have a few connections in a pipeline you
will exceed the available memory, since each stage could hold its own
864 MB copy.

My pipeline just does these operations: import, extract a region of
interest, extract and print a slice, 3D median with a selected kernel,
extract and print a slice, cast to float, apply a nonlinear diffusion
filter, cast back to the original data type, extract and print a slice,
write the volume file. It would exceed the available memory with an
image of about 300 MB (the region of interest being nearly the whole
thing) if I did not apply the method mentioned below.

Hopefully this brings you to the same point I have reached with this
problem. I think we both need to find out the best way to release the
memory used by previous filters in a pipeline. Can the filters be
directed to do so automatically when connected in a pipeline? I saw
something about 'streaming', but as I recall that is not fully
implemented so far; is that what we want?
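One candidate answer, offered only as an untested sketch: ITK filters inherit a ReleaseDataFlag from itk::ProcessObject, which tells a filter to release its output's bulk data once the downstream consumer has used it, and there is a global switch on itk::DataObject. I have not verified this against my own pipeline, so check the names against the ITK documentation; `myprefilter` and `myfilter` below just stand in for the pipeline's own filter objects:

```cpp
/* Sketch (untested here): ask intermediate filters to release their
 * output buffers once the downstream filter has consumed them. */
myprefilter->ReleaseDataFlagOn();
myfilter->ReleaseDataFlagOn();

/* Or flip the switch globally for every filter in the process: */
itk::DataObject::SetGlobalReleaseDataFlag(true);
```

If this works as advertised, it would replace the brace-scope trick below without demolishing the pipeline syntax.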


Currently I use brace-delimited scopes, which seems like a bit of a
kludge since it demolishes the nice syntax of the pipeline.

 Below is abridged code showing what I have done.
 
[set up data types etc.]
MyImage::Pointer filteredImage;
[read the image using importFilter, similar to the example that comes
with ITK]

{
    MyPreFilter::Pointer prefilter = MyPreFilter::New();
    prefilter->SetInput(importFilter->GetOutput());
    prefilter->Update();     /* actually in a try/catch */
    filteredImage = prefilter->GetOutput();
    filteredImage->Update(); /* actually in a try/catch */
} /* end of scope of existence for prefilter */

   free(imagedata); /* my raw data not handled by ITK smart pointer */

To see what's going on, I inserted numerous snippets like the following
and compiled with -DVERBOSE:
  #ifdef VERBOSE
  system("free");
  #endif /*VERBOSE*/




-----Original Message-----
From: insight-users-bounces+r.atwood=imperial.ac.uk at itk.org
[mailto:insight-users-bounces+r.atwood=imperial.ac.uk at itk.org] On Behalf
Of Olivier Rousseau
Sent: 17 November 2005 19:57
To: insight-users at itk.org
Subject: [Insight-users] Lack of memory for segmentation

Hi,

I am trying to segment a 3D volume of 600x600x600 voxels. I ran
ShapeDetectionLevelSetFilter,
but an exception is thrown during the segmentation saying that I'm
lacking memory.
I am surprised, since the computer I am using has 4 GB of RAM.

Is it possible that I am doing something wrong?
Otherwise, what size of 3D volume can I expect to be able to segment?
Or, is it possible to run this segmentation algorithm on a cluster?

Thanks
Olivier
