[Paraview] Problems with distributed (volume) rendering

Moreland, Kenneth kmorel at sandia.gov
Wed Sep 19 10:30:25 EDT 2007


From your description, it sounds like the server is not opening
rendering contexts correctly.  More specifically, it sounds like two or
more processes on the server are opening rendering windows on top of
each other.  Check that the DISPLAY environment variable for each
process is set to localhost:0 and that each process has its own GPU.
If you don't have GPUs in your cluster, you can just use OSMesa to get
around the problem.
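A minimal sketch of such a per-rank setup is a wrapper script that sets
DISPLAY before exec'ing pvserver. The script name (gpu-wrapper.sh) is
made up, and the local-rank variable is an assumption: OpenMPI exports
OMPI_COMM_WORLD_LOCAL_RANK, but other MPI implementations use different
names, so check what your launcher provides.

```shell
#!/bin/sh
# Hypothetical wrapper: give each server process its own X screen so that
# no two ranks share a rendering context.  Assumes the MPI launcher exports
# a per-node local rank (OpenMPI's OMPI_COMM_WORLD_LOCAL_RANK here; other
# MPIs expose it under a different variable name).
RANK=${OMPI_COMM_WORLD_LOCAL_RANK:-0}

# One X screen per GPU: rank 0 -> localhost:0.0, rank 1 -> localhost:0.1, ...
export DISPLAY=localhost:0.$RANK

# Run the real command (e.g. pvserver) with the adjusted environment.
exec "$@"
```

You would then launch the server through the wrapper, e.g.
mpirun -np 8 ./gpu-wrapper.sh pvserver -rc -ch=<desktop-machine>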

See the Wiki for more details.

http://www.paraview.org/Wiki/Setting_up_a_ParaView_Server#X_Connections

Also, if you plan to do volume rendering, I recommend sticking with the
CVS version.  There are some known issues with parallel volume rendering
in 3.0.

-Ken

> -----Original Message-----
> From: paraview-bounces+kmorel=sandia.gov at paraview.org
> [mailto:paraview-bounces+kmorel=sandia.gov at paraview.org] On Behalf
> Of Paul Melis
> Sent: Wednesday, September 19, 2007 2:13 AM
> To: paraview at paraview.org
> Subject: [Paraview] Problems with distributed (volume) rendering
> 
> Hello,
> (warning, long post :-/)
> 
> I'm trying to get distributed volume rendering to work with paraview
> and I'm having some problems. I'm pretty new to paraview (although
> I've been using VTK for some time now) and could use a pointer or two.
> 
> Is it at all possible to use the client-server feature to do volume
> rendering with paraview on a compute cluster? I seem to find some
> conflicting statements about this on the web.
> I'm using the CVS version of paraview (mostly because that's what I do
> with VTK and that works fine). Would that hurt in this case, or should
> I use the latest stable 3.0.x?
> 
> I've got the following setup:
> - one desktop machine (running Fedora Core 4), on which I run paraview
> as the client
> - a cluster of machines (running some other Linux distribution), on
> which I run pvserver
> - the connection is reversed, from server to client, as there is a
> firewall in between. I've verified the connection can be established.
> 
> The following steps work as expected:
> 1. start paraview on my desktop machine
> 2. use File -> Connect to let paraview wait for a connection from the
> cluster
> 3. on the first cluster node run
> mpirun -np 8 pvserver -rc -ch=<desktop-machine>
> 4. the connection is successfully made
> 5. load in data with File -> Open, in this case a 128^3 structured
> points set stored in a .vtk file on the cluster nodes (from shared
> homedir available on all cluster nodes)
> 
> I'm actually testing with a contour filter at the moment, as volume
> rendering gives an even greater mess than described below.
> 
> The following does not work correctly:
> 6. add a contour filter with some appropriate iso-value (Filter ->
> Common -> Contour, Apply). I'm now seeing a black square, roughly
> where the dataset would be (perhaps the screen-space bounding box?).
> Sometimes the view is updated and I then see a mess: 8 axes that are
> not aligned, and parts of the isosurface. It looks like one part of
> the whole isosurface is repeatedly rendered at differing locations,
> rather than the whole isosurface being broken into pieces that are
> rendered at the wrong positions.
> 
> I've set the Remote Render Threshold to 0 to always force remote
> rendering (the option is checked as well).
> When I use only 1 cluster node (mpirun -np 1 ...) everything seems to
> work fine, both isosurface and volume rendering.
> 
> One thing I suspect is that there's a problem with my MPI
> installation. As I was having problems compiling paraview with MPI
> support (extra libraries not linked in), I set CMAKE_C_COMPILER and
> CMAKE_CXX_COMPILER to mpicc and mpic++, respectively. Can this cause
> problems like this?
> A second thing is that I'm having trouble confirming which
> implementations of MPI are on my workstation and cluster machines. A
> mismatch here will probably also cause trouble. A final thing I was
> wondering about is whether the client <-> server connection also uses
> MPI, or whether it uses some proprietary form of communication?
> 
> Thanks for any help,
> Paul
> 
> _______________________________________________
> ParaView mailing list
> ParaView at paraview.org
> http://www.paraview.org/mailman/listinfo/paraview



