[Paraview] Problems with distributed (volume) rendering

Paul Melis paul at science.uva.nl
Fri Sep 21 07:21:08 EDT 2007


Moreland, Kenneth wrote:

>From your description, it sounds like the server is not opening
>rendering contexts correctly.  More specifically, it sounds like two or
>more processes on the server are opening rendering windows that are on
>top of each other.
That was a golden insight! It turned out that X11 forwarding was turned 
on for the connections from the render nodes, so they all effectively 
rendered on the first node.
It works fine now.
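
For anyone hitting the same symptom: the cure amounts to making sure the
ssh connections used to reach the render nodes do not forward X11, so
that each pvserver process opens its rendering context on its own node's
display. A minimal sketch, assuming the nodes match a hypothetical
`node*` host pattern:

```
# ~/.ssh/config on the launch node (host pattern is hypothetical)
Host node*
    ForwardX11 no
    ForwardX11Trusted no
```

Equivalently, pass `-x` to ssh per connection; each node's DISPLAY
should then point at its own local X server (e.g. `:0`) when pvserver
starts.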

>Also, if you plan to do volume rendering, I recommend sticking with the
>CVS version.  There are some known issues with parallel volume rendering
>in 3.0.
Thanks for the info.

Paul

>-Ken
>>-----Original Message-----
>>From: paraview-bounces+kmorel=sandia.gov at paraview.org
>>[mailto:paraview-bounces+kmorel=sandia.gov at paraview.org] On Behalf
>>Of Paul Melis
>>Sent: Wednesday, September 19, 2007 2:13 AM
>>To: paraview at paraview.org
>>Subject: [Paraview] Problems with distributed (volume) rendering
>>
>>Hello,
>>(warning, long post :-/)
>>
>>I'm trying to get distributed volume rendering to work with paraview
>>and I'm having some problems. I'm pretty new to paraview (although
>>I've been using VTK for some time now) and could use a pointer or two.
>>
>>Is it at all possible to use the client-server feature to do volume
>>rendering with paraview on a compute cluster? I seem to find some
>>conflicting statements about this on the web.
>>I'm using the CVS version of paraview (mostly because that's what I do
>>with VTK, and that works fine). Would that hurt in this case; should I
>>use the latest stable 3.0.x?
>>
>>I've got the following setup:
>>- one desktop machine (running Fedora Core 4), on which I run paraview
>>as the client
>>- a cluster of machines (running some other Linux distribution), on
>>which I run pvserver
>>- the connection is reversed, from server to client, as there is a
>>firewall in between. I've verified the connection can be established.
>>
>>The following steps work as expected:
>>1. start paraview on my desktop machine
>>2. use File -> Connect to let paraview wait for a connection from the
>>cluster
>>3. on the first cluster node run
>>mpirun -np 8 pvserver -rc -ch=<desktop-machine>
>>4. the connection is successfully made
>>5. load in data with File -> Open, in this case a 128^3 structured
>>points dataset stored in a .vtk file on the cluster nodes (in a shared
>>home directory available on all cluster nodes)
>>
>>I actually test with a contour filter at the moment, as volume
>>rendering gives an even greater mess than described below.
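
The launch sequence in steps 1-5 above boils down to two commands; a
sketch, with <desktop-machine> left as a placeholder (exact option
spellings can differ between ParaView versions, so check
`pvserver --help`):

```
# On the desktop: start the client, then use File -> Connect and wait
# for the reverse connection from the server.
paraview

# On the first cluster node: start 8 server processes that connect
# back to the client.
mpirun -np 8 pvserver -rc -ch=<desktop-machine>
```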
>>
>>The following does not work correctly:
>>6. add a contour filter with some appropriate iso-value (Filter ->
>>Common -> Contour, Apply). I'm now seeing a black square, roughly
>>where the dataset would be (perhaps the screen-space bounding box?).
>>Sometimes the view is updated and I then see a mess: 8 axes that are
>>not aligned, and parts of the isosurface. It looks like one part of
>>the whole isosurface is repeatedly rendered at differing locations,
>>rather than the whole isosurface being broken into pieces that are
>>rendered at the wrong positions.
>>
>>I've set the Remote Render Threshold to 0 to always force remote
>>rendering (the option is checked as well).
>>When I use only 1 cluster node (mpirun -np 1 ...) everything seems to
>>work fine, both isosurface and volume rendering.
>>
>>One thing I suspect is that there's a problem with my MPI
>>installation. As I was having problems compiling paraview with MPI
>>support (extra libraries not linked in), I set CMAKE_C_COMPILER and
>>CMAKE_CXX_COMPILER to mpicc and mpic++, respectively. Can this cause
>>problems like this?
>>A second thing is that I'm having trouble confirming which
>>implementations of MPI are on my workstation and the cluster machines.
>>A mismatch here will probably also cause trouble.
>>A final thing I was wondering about is whether the client <-> server
>>connection also uses MPI, or some proprietary form of communication?
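
On the compile question: rather than forcing CMAKE_C_COMPILER to the
MPI wrapper scripts, an alternative is to let CMake's MPI detection
supply the headers and libraries itself. A sketch under the assumption
of an MPICH-style install under /usr (option names as in the ParaView
3.x build system; the paths are hypothetical, so verify them in ccmake):

```
cmake ../ParaView \
  -DPARAVIEW_USE_MPI=ON \
  -DMPI_INCLUDE_PATH=/usr/include/mpi \
  -DMPI_LIBRARY=/usr/lib/libmpich.so
```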
>>
>>Thanks for any help,
>>Paul
>>
>>_______________________________________________
>>ParaView mailing list
>>ParaView at paraview.org
>>http://www.paraview.org/mailman/listinfo/paraview


