[vtkusers] large data sets and memory
thomas jetzfellner
thomas.jetzfellner at gsf.de
Wed Nov 21 10:17:07 EST 2007
hi,
This is the first time I am writing to the list, and I think my question may sound a little bit "stupid". My problem is that memory allocation fails when I load large data sets: my program crashes on the Update() call of the vtkImageData. I have tried it in several different ways and none of them works on large data sets; small data is processed correctly. Attached you find some code snippets; my program dies on the Update() call.
// Attempt 1: pull the filter output directly (VTK objects must be
// obtained via the pipeline or ::New(), never plain operator new,
// and vtkImageData itself has no SetInput method)
vtkImageData *volume = centerImage->GetOutput();
volume->Update();   // crashes here on large data
// Attempt 2: vtkImageDataStreamer with block-wise extent splitting
vtkImageDataStreamer *ids = vtkImageDataStreamer::New();
ids->SetInputConnection(centerImage->GetOutputPort());
ids->SetNumberOfStreamDivisions(200);
ids->UpdateInformation();
ids->GetExtentTranslator()->SetSplitModeToBlock();
ids->Update();
// Attempt 3: vtkMemoryLimitImageDataStreamer (the limit is in kilobytes)
vtkMemoryLimitImageDataStreamer *mlds = vtkMemoryLimitImageDataStreamer::New();
mlds->SetInputConnection(centerImage->GetOutputPort());
mlds->SetMemoryLimit(10000);   // 10000 KB, i.e. roughly 10 MB
mlds->Update();
Maybe it is a problem in my concept. I want to load a data set with a resolution of 512 x 512 x 1024; are there other ways to handle that much data?
My development environment is Visual Studio 2005 on Windows XP SP2.
Thanks for any information and suggestions.