[vtkusers] Large Memory Datasets with VTK, & Problems with the ReleaseDataFlag

Randall Hand randall.hand at gmail.com
Thu Mar 24 11:20:01 EST 2005


I saw on the VTK Wiki FAQ a few suggestions for working with large
datasets that I've tried, namely two changes (sketched in code just below):
   1) Using "ReleaseDataFlagOn" (GlobalReleaseDataFlagOn in my case)
   2) Using ImmediateModeRenderingOn
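
For reference, here is roughly how I'm applying those two flags.  This
is only a sketch; "mapper" stands in for my actual vtkPolyDataMapper:

    #include "vtkDataObject.h"
    #include "vtkPolyDataMapper.h"

    // (1) Ask the pipeline to free each filter's output once the
    //     downstream filter has consumed it, globally for all data:
    vtkDataObject::SetGlobalReleaseDataFlag(1);

    // (2) Render geometry directly instead of building OpenGL display
    //     lists, so geometry isn't duplicated on the GL side
    //     ("mapper" = my vtkPolyDataMapper):
    mapper->ImmediateModeRenderingOn();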

I have a dataset of 1 billion points (1k x 1k x 1k, StructuredPoints,
1 scalar per point) that I've been trying to visualize via flow glyphs
(a simple sphere glyph at each point, mostly just to test the memory
limits).  Of course, it crashes with a core dump, presumably from
memory overflow.  I've been using the MaskPoints filter to get it down
to a manageable size, around 1 million points, but I'd like to raise
that significantly higher.
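
The pipeline looks roughly like this (a sketch using the old-style
SetInput/SetSource connections; the file name and mask ratio are
placeholders for my real values):

    #include "vtkStructuredPointsReader.h"
    #include "vtkMaskPoints.h"
    #include "vtkSphereSource.h"
    #include "vtkGlyph3D.h"
    #include "vtkPolyDataMapper.h"

    // Read the 1k x 1k x 1k StructuredPoints volume:
    vtkStructuredPointsReader *reader = vtkStructuredPointsReader::New();
    reader->SetFileName("volume1k.vtk");

    // Subsample down to roughly 1 million points before glyphing:
    vtkMaskPoints *mask = vtkMaskPoints::New();
    mask->SetInput(reader->GetOutput());
    mask->SetOnRatio(1000);        // keep about 1 in every 1000 points
    mask->RandomModeOn();

    // Place a simple sphere glyph at every surviving point:
    vtkSphereSource *sphere = vtkSphereSource::New();
    vtkGlyph3D *glyph = vtkGlyph3D::New();
    glyph->SetInput(mask->GetOutput());
    glyph->SetSource(sphere->GetOutput());

    vtkPolyDataMapper *mapper = vtkPolyDataMapper::New();
    mapper->SetInput(glyph->GetOutput());
    mapper->ImmediateModeRenderingOn();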

Well, enabling ImmediateModeRendering and ReleaseDataFlag didn't make
any difference at all, as far as I can tell.  The memory usage seems
smaller, but it still core dumps in the same place.  The interesting
part is when I try to render with the (previously working) 1 million
points: I get no result.

I'm using MangledMesa, and writing the result to a PNG image.  The
resulting images are blank, except for the contextual text (timestamp,
title, etc.) that I add around the edges.  Removing the ReleaseDataFlag
lines (but leaving the ImmediateModeRendering lines) got my resulting
image back.  Is there a known compatibility issue with the
ReleaseDataFlag option?  Perhaps only with offscreen rendering or Mesa
rendering?
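
For completeness, the capture end looks roughly like this (a sketch
assuming the usual vtkWindowToImageFilter -> vtkPNGWriter route; the
offscreen rendering goes through my Mesa-linked VTK build, and the
output file name is a placeholder):

    #include "vtkActor.h"
    #include "vtkRenderer.h"
    #include "vtkRenderWindow.h"
    #include "vtkWindowToImageFilter.h"
    #include "vtkPNGWriter.h"

    // "mapper" is the glyph mapper from the pipeline sketch above.
    vtkActor *actor = vtkActor::New();
    actor->SetMapper(mapper);

    vtkRenderer *ren = vtkRenderer::New();
    ren->AddActor(actor);

    vtkRenderWindow *renWin = vtkRenderWindow::New();
    renWin->OffScreenRenderingOn();   // offscreen via the Mesa-linked build
    renWin->AddRenderer(ren);
    renWin->Render();

    // Capture the framebuffer and write it out as a PNG:
    vtkWindowToImageFilter *w2i = vtkWindowToImageFilter::New();
    w2i->SetInput(renWin);

    vtkPNGWriter *writer = vtkPNGWriter::New();
    writer->SetInput(w2i->GetOutput());
    writer->SetFileName("frame.png");
    writer->Write();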

Also, is there some reason why enabling it didn't let me use a higher
number of points?  Even going from 1 million to 2 million would be a
very welcome improvement, but I'd really love to get to the full 1
billion.

-- 
Randall Hand
http://www.yeraze.com


