[vtkusers] Optimizing memory usage with MemoryLimitImageDataStreamer
Kalle Pahajoki
kalle.pahajoki at gmail.com
Tue Nov 7 13:15:23 EST 2006
Hi
I'm developing software with VTK that sometimes needs to process very
large datasets. It is quite easy to run into a situation where the software
eats up all available memory and has to be killed. I've had some success with
streaming (especially vtkMemoryLimitImageDataStreamer), but I'm not sure
whether I'm making use of its full potential. Therefore, I have a couple of
questions.
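
For reference, here is a minimal sketch of the kind of usage I mean
(Python bindings, VTK 5-era API; the file name and memory limit are just
placeholders):

    import vtk

    reader = vtk.vtkXMLImageDataReader()
    reader.SetFileName("large_volume.vti")   # placeholder file name

    # Cap how much data a single upstream request may produce.
    streamer = vtk.vtkMemoryLimitImageDataStreamer()
    streamer.SetInputConnection(reader.GetOutputPort())
    streamer.SetMemoryLimit(50 * 1024)       # limit is in kilobytes (~50 MB)

    # The streamer splits the request into sub-extents and updates them one
    # at a time, but the assembled output below is still the full image.
    streamer.Update()
    image = streamer.GetOutput()

As far as I can tell, the limit only bounds what each upstream request
produces; the streamer still assembles the whole output image in memory.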
Our software is structured so that there can be a variable number of
processing filters between the data source and the rendering. Currently,
the software doesn't build a pipeline in the traditional sense, where you
connect all the filters together and then update the last one; instead,
most of the steps call Update on their part of the pipeline and pass the
processed data forward.
A typical "pipeline" inside the program might be:
    XMLImageDataReader -> Reslice -\
                                    Merging
    XMLImageDataReader -> Reslice -/
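
If it helps to be concrete, here is a hedged sketch of what that same
two-branch pipeline would look like built as connected filters with the
streamer at the tail and a single Update call (Python, VTK 5-era API;
vtkImageBlend only stands in for our actual merging filter, and the file
names are made up):

    import vtk

    def read_and_reslice(filename):
        reader = vtk.vtkXMLImageDataReader()
        reader.SetFileName(filename)
        reslice = vtk.vtkImageReslice()
        reslice.SetInputConnection(reader.GetOutputPort())
        return reslice

    branch1 = read_and_reslice("channel1.vti")   # placeholder file names
    branch2 = read_and_reslice("channel2.vti")

    merge = vtk.vtkImageBlend()                  # stand-in for "Merging"
    merge.AddInputConnection(branch1.GetOutputPort())
    merge.AddInputConnection(branch2.GetOutputPort())

    streamer = vtk.vtkMemoryLimitImageDataStreamer()
    streamer.SetInputConnection(merge.GetOutputPort())
    streamer.SetMemoryLimit(100 * 1024)          # kilobytes

    # A single Update at the tail; each sub-extent request should propagate
    # back through the merge, the reslices and the readers.
    streamer.Update()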
2) Can the readers utilize streaming? That is, can vtkXMLImageDataReader,
for example, return its data in pieces?
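
One way I thought I could test this myself (a sketch, assuming the VTK 5-era
update-extent calls and a made-up file name) is to request only part of the
volume from the reader's output and see what extent actually ends up in
memory:

    import vtk

    reader = vtk.vtkXMLImageDataReader()
    reader.SetFileName("large_volume.vti")       # placeholder file name
    reader.UpdateInformation()                   # read metadata only

    x0, x1, y0, y1, z0, z1 = reader.GetOutput().GetWholeExtent()

    # Ask for only the lower half of the volume along z.
    reader.GetOutput().SetUpdateExtent(x0, x1, y0, y1, z0, (z0 + z1) // 2)
    reader.GetOutput().Update()

    # If the reader honors sub-extent requests, this prints the half extent
    # rather than the whole extent.
    print(reader.GetOutput().GetExtent())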