[vtkusers] large data sets and memory

L.J. van Ruijven L.J.vanRuijven at amc.uva.nl
Thu Nov 22 03:50:56 EST 2007


Hello Thomas,

Segmentation may indeed solve your problem. However, it is difficult 
to estimate how many segments you will need. When I ran into this 
problem on my machine, I found that datasets of 100 MB generally 
worked fine and datasets of 200 MB generally crashed. But I also 
found that between these limits a program often reported no error at 
all, while its output contained one or more corrupt numbers. So I 
advise you to stay well below your memory limit, because it is not 
always obvious that your output is corrupt.
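If you do decide to process the data in pieces, VTK can do the 
splitting for you instead of a hand-written loader. Here is a minimal 
sketch, assuming VTK 5 and an image dataset in the XML format; the 
file name and the memory limit are placeholders, not taken from your 
setup:

#include "vtkXMLImageDataReader.h"
#include "vtkMemoryLimitImageDataStreamer.h"

int main()
{
  // Reader for the large dataset; the file name is a placeholder.
  vtkXMLImageDataReader *reader = vtkXMLImageDataReader::New();
  reader->SetFileName("largevolume.vti");

  // The streamer splits the update into pieces so that each upstream
  // update stays below the given limit (in kilobytes).
  vtkMemoryLimitImageDataStreamer *streamer =
    vtkMemoryLimitImageDataStreamer::New();
  streamer->SetInputConnection(reader->GetOutputPort());
  streamer->SetMemoryLimit(50000); // ~50 MB, well below the danger zone

  streamer->Update(); // the output is assembled piece by piece

  streamer->Delete();
  reader->Delete();
  return 0;
}

Note that the streamer still allocates its complete output, so the 
saving comes from any filters placed between the reader and the 
streamer, which are updated one piece at a time.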

Although it took a lot of time, I am glad that I switched to Windows 
x64. No more memory problems.
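On the segmented-loading idea in your message below: instead of 
loading half the dataset yourself, you can ask the pipeline for a 
subvolume and let a streaming-capable reader (the XML readers are) 
fetch only that extent. Again a sketch, with placeholder file name 
and extents:

#include "vtkXMLImageDataReader.h"
#include "vtkExtractVOI.h"

int main()
{
  vtkXMLImageDataReader *reader = vtkXMLImageDataReader::New();
  reader->SetFileName("largevolume.vti");

  // Request only the lower half of the volume; the numbers are
  // placeholders for your dataset's actual dimensions.
  vtkExtractVOI *voi = vtkExtractVOI::New();
  voi->SetInputConnection(reader->GetOutputPort());
  voi->SetVOI(0, 511, 0, 511, 0, 255); // xmin,xmax, ymin,ymax, zmin,zmax
  voi->Update();

  // ... process voi->GetOutput(), then move the VOI to the next slab ...

  voi->Delete();
  reader->Delete();
  return 0;
}

Whether only the requested slab is actually read from disk depends on 
the file format; with formats that cannot stream, the whole file is 
read anyway.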

Leo


> ok,
>
> I thought the problem was something like that. I used VolView to 
> take a look at the data set and it did not crash, so I thought it 
> was my fault.
> Do you think a possible solution would be a segmented loading 
> algorithm, so that I load only half the dataset and the rest only 
> when needed? Has anyone tried something like this?
>
> thanks

