[Paraview] parallel paraview: display not updated

Stefan Friedel stefan.friedel at iwr.uni-heidelberg.de
Thu Feb 9 11:30:30 EST 2006


Hi Randall et al.,
we solved it... it is probably a bug? Anyway, here is the workaround:

- start the server with: mpirun -np <n> pvserver --render-module=MPIRenderModule --use-offscreen-rendering
- start the client with: mpirun -np 1 pvclient
- do a File->Save view image (why this is needed I don't know, but now:)
- load your data and view it - after saving the first view image, everything
  works fine!? (The full launch sequence is sketched below.)
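
For reference, a minimal launch sequence for the workaround (the process count
and the hostname node256 are just examples from our test setup; adjust them to
your cluster):

  # server side: MPI compositing with offscreen rendering
  mpirun -np 2 pvserver --render-module=MPIRenderModule --use-offscreen-rendering

  # client side: a single client process, pointed at the server node
  mpirun -np 1 pvclient --server-host=node256

  # in the client GUI: do File->Save view image once, then load and view data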

Thanks for your help,
Best Regards, Stefan

> Hi Randall et al.,
> > In running the basic client-server model, the following seems to work:
> > 
> > 1) On the server:  mpirun -np <number> pvserver --use-offscreen-rendering
> > --render-module=MPIRenderModule
> > 2) On the client:   pvclient --server-host=<servername>
> No, that does not work: the server starts... and the client connects:
> 
> ~/TMP/mpichgm_gccwssh/bin/mpirun.ch_gm --gm-tree-spawn -np 2 -machinefile ./mach256 /export/system/paraview/bin/pvserver --render-module=MPIRenderModule --use-offscreen-rendering
> Listen on port: 11111
> Waiting for client...
> connected to port 11111
> Client connected.
> 
> ~/TMP/mpichgm_gccwssh/bin/mpirun.ch_gm -np 1 -machinefile ./mach256  /export/system/paraview/bin/pvclient --server-host=node256
> Connect to node256:11111
> 
> (and I can see gm ports in use, so the MPI communication between the two
> pvserver processes is also open); but the view panel on the right still
> shows some old buffer content (moving another window over this part of the
> ParaView client window leaves "traces" of that window, etc.)
> > 
> > If you have to use reverse-connection (the Server connects to the client),
> > then use the following:
> > 1) On the client: pvclient -rc
> > 2) On the server: mpirun -np <number> pvserver --use-offscreen-rendering
> > --render-module=MPIRenderModule -rc -ch=<clientname>
> Same behaviour...
> ~/TMP/mpichgm_gccwssh/bin/mpirun.ch_gm -np 1 -machinefile ./mach256  /export/system/paraview/bin/pvclient -rc
> Listen on port: 11111
> Waiting for server...
> connected to port 11111
> Server connected.
> 
> 
>  ~/TMP/mpichgm_gccwssh/bin/mpirun.ch_gm --gm-tree-spawn -np 2 -machinefile ./mach256 /export/system/paraview/bin/pvserver --render-module=MPIRenderModule --use-offscreen-rendering -rc -ch=node256
> Connect to node256:11111
> 
> > 
> > the PVX file only seems to be necessary when working with a "fully
> > distributed" system (e.g., separate data servers, render servers, and a
> > client).  We've cracked some of the dark mysteries of the PVX file here, and
> > if necessary I'm happy to share what little we know.
> I tried different entries in the pvx, with no change/success. As I understand
> it, the server definitions there are not needed for MPI communication -
> the MPI server is just one server (regardless of the number of MPI processes),
> while the different servers in the pvx (declared by Machine Name="..." entries)
> are servers which communicate over sockets, right?  Maybe it's a more general
> problem, e.g. some issue with the mpichgm/gm layer from Myricom - I'm waiting
> for an answer from Myricom support regarding ParaView on Myri/gm...
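> 
> For reference, a rough sketch of the kind of pvx file I tried (the
> Process/Machine layout is how I understood the format; the Machine Name and
> Environment values below are only examples, not a verified recipe):
> 
>   <?xml version="1.0" ?>
>   <pvx>
>     <Process Type="server">
>       <Machine Name="node256" Environment="DISPLAY=:0" />
>     </Process>
>   </pvx>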
> 
> The reason for using offscreen rendering is laziness (no X is installed on
> the nodes), but the next nodes will probably not even have a graphics card...
> Thank you!, Best Regards, Stefan

-- 
Zentrale Dienste - Interdisziplinäres Zentrum für Wissenschaftliches
Rechnen der Universität Heidelberg - IWR - INF 368, 69120 Heidelberg
stefan.friedel at iwr.uni-heidelberg.de  Tel +49 6221 54-8240 Fax -5224
IWR: www.iwr.uni-heidelberg.de          HELICS: www.helics.uni-hd.de