[Insight-users] Processing Large Datasets

Luis Ibanez luis.ibanez at kitware.com
Wed Aug 11 18:40:37 EDT 2004


Hi Invisible Human,


If you are in a hurry to get this resampling done,
here is a poor man's streaming strategy that may get
your problem solved in a couple of days.


1)  Compute the size of chunks of your data
     set that can easily fit in memory.
     For example, blocks of

            200 x 200 x 150

     in RGB at 8 bits per component, that is

            200 x 200 x 150 x 3 bytes = 18 Mbytes

     per chunk.


2) Since you anticipate that an output chunk will
    require an extra border of about 15 pixels, you
    can cut your input dataset into chunks of

           250 x 250 x 150

    that is, 25 pixels of border (a comfortable margin
    over the 15 you expect) in each direction except Z.
    These blocks should be cut with mutual overlaps of
    50 pixels (25 x 2) with their neighbors.


3) If you load one input chunk at a time, that should
    be enough to create the corresponding output chunk
    through resampling (see the sketch after this list).
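
For concreteness, here is a minimal sketch of the region
bookkeeping that steps 1-3 imply, written with itk::ImageRegion.
The whole-volume size of 3000 x 3000 x 150 is only an assumed
example, and the actual reading, resampling and writing of each
chunk are left as comments, since they depend on the IO you
already have in place:

#include "itkImageRegion.h"
#include <iostream>

int main()
{
  const unsigned int Dim = 3;

  typedef itk::ImageRegion< Dim >  RegionType;
  typedef RegionType::SizeType     SizeType;
  typedef RegionType::IndexType    IndexType;

  // Whole input volume. 3000 x 3000 x 150 is an assumed
  // example: replace it with your real dimensions.
  SizeType   wholeSize  = {{ 3000, 3000, 150 }};
  IndexType  wholeStart = {{ 0, 0, 0 }};
  RegionType wholeRegion( wholeStart, wholeSize );

  const unsigned long chunk[ Dim ] = { 200, 200, 150 };
  const unsigned long border = 25;   // pad in X and Y only

  for ( unsigned long z = 0; z < wholeSize[2]; z += chunk[2] )
    {
    for ( unsigned long y = 0; y < wholeSize[1]; y += chunk[1] )
      {
      for ( unsigned long x = 0; x < wholeSize[0]; x += chunk[0] )
        {
        // Output chunk: the region we actually want to produce.
        IndexType start = {{ static_cast< long >( x ),
                             static_cast< long >( y ),
                             static_cast< long >( z ) }};
        SizeType  size  = {{ chunk[0], chunk[1], chunk[2] }};
        RegionType outputChunk( start, size );
        outputChunk.Crop( wholeRegion ); // clip partial chunks at edges

        // Input chunk: the same region plus the safety border in
        // X and Y.  Adjacent input chunks overlap by 2 x 25 = 50.
        RegionType inputChunk = outputChunk;
        SizeType   pad = {{ border, border, 0 }};
        inputChunk.PadByRadius( pad );
        inputChunk.Crop( wholeRegion );  // never pad past the volume

        std::cout << "output " << outputChunk
                  << " needs input " << inputChunk << std::endl;

        // Here: read inputChunk from disk, resample it, and write
        // the output chunk to its own file for reassembly later.
        }
      }
    }
  return 0;
}

Writing each output chunk to its own file keeps the memory
footprint at one input chunk plus one output chunk at a time.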



This is not elegant, and it is not publishable,
but it will solve your immediate problem.
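
The per-chunk resampling itself is the standard
itk::ResampleImageFilter pipeline, so nothing new is needed
there.  A sketch, assuming a grayscale volume, an identity
affine transform as a placeholder, and chunk file names taken
from the command line (for your RGB data you would use an RGB
pixel type with a nearest-neighbor or vector-capable
interpolator instead):

#include "itkImage.h"
#include "itkImageFileReader.h"
#include "itkImageFileWriter.h"
#include "itkResampleImageFilter.h"
#include "itkAffineTransform.h"
#include "itkLinearInterpolateImageFunction.h"

int main( int argc, char * argv[] )
{
  typedef itk::Image< unsigned char, 3 >     ImageType;

  // Read one input chunk (argv[1]) produced by the cutting step.
  typedef itk::ImageFileReader< ImageType >  ReaderType;
  ReaderType::Pointer reader = ReaderType::New();
  reader->SetFileName( argv[1] );
  reader->Update();

  // Placeholder transform: substitute your real one.
  typedef itk::AffineTransform< double, 3 >  TransformType;
  TransformType::Pointer transform = TransformType::New();

  typedef itk::LinearInterpolateImageFunction< ImageType, double >
    InterpolatorType;
  InterpolatorType::Pointer interpolator = InterpolatorType::New();

  typedef itk::ResampleImageFilter< ImageType, ImageType >
    ResampleType;
  ResampleType::Pointer resampler = ResampleType::New();
  resampler->SetInput( reader->GetOutput() );
  resampler->SetTransform( transform );
  resampler->SetInterpolator( interpolator );
  resampler->SetDefaultPixelValue( 0 );

  // Output grid: one 200 x 200 x 150 chunk on the input's grid.
  ImageType::SizeType outputSize = {{ 200, 200, 150 }};
  resampler->SetSize( outputSize );
  resampler->SetOutputSpacing( reader->GetOutput()->GetSpacing() );
  resampler->SetOutputOrigin( reader->GetOutput()->GetOrigin() );

  // Write the resampled chunk to its own file (argv[2]).
  typedef itk::ImageFileWriter< ImageType > WriterType;
  WriterType::Pointer writer = WriterType::New();
  writer->SetFileName( argv[2] );
  writer->SetInput( resampler->GetOutput() );
  writer->Update();

  return 0;
}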



   Regards,


      Luis



------------------------
Invisible Human wrote:

> Hi,
>  
> I am interested in knowing how people are dealing with large datasets 
> over 4 GB in size.  Has anyone implemented streaming for the IO classes? 
>  
> I am interested in resampling a dataset of about 150 slices even though 
> only 15 slices can fit in my physical memory at any time.  Each slice of 
> the output image depends on a requested region of x <= 15 slices of the 
> input image.
>  
> Regards.
> 





