[vtkusers] Large data files
Kevin Wright
Kevin.Wright at ansys.com
Fri Jul 25 14:57:52 EDT 2003
> Hi Marcio,
>
> I also have large data sets to load and render, and ran across this
> web site earlier:
>
> http://public.kitware.com/cgi-bin/vtkfaq?req=show&file=faq03.007.htp
>
> It has a lot of tips on handling large data sets.
>
> - Dennis
> > I wrote an application to read a huge data file (1 to 2 GBytes)
> > where the information is a collection of structured data. Each
> > block is identified by its id ("scalarsN", where N is the block
> > indicator). The structured data is 200 x 100 x 50 points, and it is
> > taking many minutes to retrieve the proper information to be
> > rendered. I am using Tcl/Tk scripts and would like to know what can
> > be done to speed up the reading process. All suggestions are
> > welcome!
Setting ImmediateModeRendering on will save you a lot of time rendering data that size, since it keeps the mapper from building display lists that can swamp memory (see the link above).
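In Tcl it's a one-line change on each mapper. A minimal, untested sketch (the mapper name here is just a placeholder, and you'd hook it into your own pipeline):

    # hypothetical mapper name; apply to every mapper in your application
    vtkPolyDataMapper blockMapper
    blockMapper ImmediateModeRenderingOn    ;# skip display lists for huge data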
One more thing not mentioned on that page: are you using a built-in reader for a standard data format, or have you written your own reader in Tcl? If you've written your own, keep in mind that Tcl treats all data as strings. Reading the data into Tcl, converting it to strings, and then sending it back down to C++ (as happens when you create vtk objects) to be converted back to numbers takes a great deal of time.

There are two ways around that. One is to code the reader itself in C and keep the rest of the application in Tcl. The other, if you don't want to mix Tcl and C, is to write a one-time converter in Tcl that takes your data format and writes out a vtk file format, which the standard VTK readers can then load. The conversion would still take a long time, but at least it would only happen once per dataset instead of once per run.
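If you go the converter route, a rough, untested sketch of the idea is below. The array name, file name, and dimensions are just placeholders for your data; the point is only that a vtkStructuredPoints holding your scalars, written out in binary by vtkStructuredPointsWriter, can be re-read quickly by vtkStructuredPointsReader on every later run.

    package require vtk

    # --- one-time conversion (names and dimensions are placeholders) ---
    vtkFloatArray scalars
    scalars SetNumberOfTuples [expr 200*100*50]
    # ... fill "scalars" here from your own file parsing (slow, but done once) ...

    vtkStructuredPoints block
    block SetDimensions 200 100 50
    [block GetPointData] SetScalars scalars

    vtkStructuredPointsWriter writer
    writer SetInput block
    writer SetFileName "blockN.vtk"
    writer SetFileTypeToBinary      ;# binary is much faster to re-read than ASCII
    writer Write

    # --- every subsequent run only needs this ---
    vtkStructuredPointsReader reader
    reader SetFileName "blockN.vtk"
    reader Update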
Hope that's some help,
Kevin.