[vtkusers] slow imagedata rendering

Louis Desjardins lost_bits1110 at hotmail.com
Tue Apr 27 15:49:52 EDT 2004


Hello

I'm on Linux with a 2.4 GHz processor and an Nvidia GeForce4 graphics card.

I have 3 imagedata actors, each of size 300 by 100. One of the actors is a
spectrogram, and the other 2 are curves of the mean and peak values taken
from the spectrogram (a simplified sketch of the setup is below).
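
In case it helps, here is a rough sketch of the kind of pipeline I mean for
one of the images (illustrative only, VTK 4.x-style API, using vtkImageActor
with placeholder scalars; the real data comes from my spectrogram processing):

  #include "vtkImageActor.h"
  #include "vtkImageData.h"
  #include "vtkRenderWindow.h"
  #include "vtkRenderer.h"

  // One 300x100 unsigned-char image shown with a vtkImageActor
  // (placeholder scalars; the real values come from the spectrogram).
  vtkImageData *img = vtkImageData::New();
  img->SetDimensions(300, 100, 1);
  img->SetScalarTypeToUnsignedChar();
  img->SetNumberOfScalarComponents(1);
  img->AllocateScalars();

  vtkImageActor *imgActor = vtkImageActor::New();
  imgActor->SetInput(img);

  vtkRenderer *ren = vtkRenderer::New();
  ren->AddActor(imgActor);

  vtkRenderWindow *renWin = vtkRenderWindow::New();
  renWin->AddRenderer(ren);
  renWin->Render();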

The time it takes to actually render the image on screen (measured with
GetLastRenderTimeInSeconds()) averages 0.08 seconds, which is only 12.5 Hz!
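
This is how I read the timing, right after the Render() call in the sketch
above (continuing that fragment):

  #include <iostream>

  // Ask the renderer how long the last render took.
  renWin->Render();
  double t = ren->GetLastRenderTimeInSeconds();
  std::cout << "last render: " << t << " s (" << 1.0 / t << " Hz)" << std::endl;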

Why does it take so long just to render on such a good machine? It's only
simple image data, and I need it to run at at least 40 Hz!

Is it because of the extra layer of abstraction that VTK adds? Are there any
OpenGL commands I should call to resolve this? I've rendered much more
complex data, such as an UnstructuredGrid, in only 0.01 seconds per render!

Thanks in advance for your help!
