[Paraview] PV 3.6.2 and VirtualGL
Paul Melis
paul.melis at sara.nl
Wed Mar 3 07:27:36 EST 2010
Hi,
I'm experimenting with running ParaView on a remote visualization
server using VirtualGL. This package basically lets you run an OpenGL
application as you normally would, but it intercepts the swapbuffer
events to read back the framebuffer, which is then JPEG-compressed and
sent to a client machine (see [1] for an excerpt from the manual). On
the client side you get a normal X11 application (via X forwarding),
but the 3D rendering is done on the server side and the images are
transported over a dedicated connection.
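For context, this is roughly how I start things (the hostname below is
just a placeholder; vglconnect and vglrun are the standard VirtualGL
launchers):

```shell
# On the client: open an SSH session that also sets up VirtualGL's
# dedicated image-transport connection to the server
vglconnect user@visserver.example.org

# On the server, inside that session: run ParaView through the
# VirtualGL interposer, so its OpenGL rendering lands on the
# server's GPU instead of being forwarded as GLX over X11
vglrun paraview

# Optionally lower the JPEG quality to trade fidelity for bandwidth
vglrun -q 60 paraview
```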
When using PV 3.6.2 in this setup I get strange results. I have a
dataset to which I apply the Glyph filter. When running PV locally
(without any remote rendering) the glyphing works fine, but when I run
ParaView remotely it refuses to show glyphs for the exact same dataset
and pipeline. The strange thing is that none of the glyph types show
anything, except for 2D Glyph + Vertex; for that mode I get what I
expect.
I've disabled depth peeling to rule out one possible source of
interference. VirtualGL itself should leave the 3D rendering
untouched, except during context creation and on swapbuffer events. Is
there anything special in how ParaView draws its glyphs?
Thanks,
Paul
[1] "Whenever a window is created by the application, VirtualGL creates
a corresponding 3D pixel buffer (“Pbuffer”) on a 3D graphics card in the
application server. Whenever the application requests that an OpenGL
rendering context be created for the window, VirtualGL intercepts the
request and creates the context on the corresponding Pbuffer instead.
Whenever the application swaps or flushes the drawing buffer to indicate
that it has finished rendering a frame, VirtualGL reads back the Pbuffer
and sends the rendered 3D image to the client."