[Paraview] parallel paraview: display not updated

Randall Hand randall.hand at gmail.com
Thu Feb 9 12:47:22 EST 2006


Wow, that's pretty weird.

Glad you got it working, though; I'll keep that trick with Save View Image in
mind. We haven't run into that here, but we're working with SGI servers and
Windows clients, and that opens a whole warehouse full of cans of worms that
you don't want to deal with.
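
For the archive, here is roughly the sequence that ended up working, as I read
it from the thread below (the process count <n> and the server name are
placeholders; paths will differ per site):

    # on the server: MPI render module plus offscreen rendering
    mpirun -np <n> pvserver --render-module=MPIRenderModule --use-offscreen-rendering

    # on the client
    pvclient --server-host=<servername>

    # then, once, in the client GUI: File -> Save View Image
    # after that, load the data; the view panel updates normally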

Now, one thing we did run into: sometimes we had to turn off the Squirt
subsample rate on the client. We had a strange situation where the data would
load just fine, and interacting with it was fine too, but when you released
the mouse button and the view should have returned to full resolution, one
quarter of the screen kept showing a copy of the resolution-reduced image.
Simply unchecking that option on the client fixed it (of course this means
full-resolution images have to be streamed back to the client, but in our
case that's not much of a problem).
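
And on the PVX file, since it comes up below: for what it's worth, the
skeleton we've been using looks roughly like this. Treat it as a sketch from
memory - the exact tags and process type names may differ between versions,
and the host names are made up:

    <?xml version="1.0" ?>
    <pvx>
      <Process Type="client" />
      <Process Type="render-server">
        <Machine Name="rs-node1" />
        <Machine Name="rs-node2" />
      </Process>
      <Process Type="data-server">
        <Machine Name="ds-node1" />
      </Process>
    </pvx>

In our experience it only matters once you split off separate data and render
servers; for a plain pvserver run like yours we never needed one.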

On 2/9/06, Stefan Friedel <stefan.friedel at iwr.uni-heidelberg.de> wrote:
>
> Hi Randall et al.,
> we solved it... probably it is a bug? Anyway:
>
> - start the server with mpirun -np <n> pvserver --render-module=MPIRenderModule --use-offscreen-rendering
> - start the client with mpirun -np 1 pvclient
> - do a File -> Save View Image (why? I don't know... but now:)
> - load your data, view your data - after saving the first view image,
> everything works fine!?
>
> Thanks for your help,
> Best Regards, Stefan
>
> > Hi Randall et al.,
> > > In running the basic client-server model, the following seems to work:
> > >
> > > 1) On the server: mpirun -np <number> pvserver --use-offscreen-rendering --render-module=MPIRenderModule
> > > 2) On the client: pvclient --server-host=<servername>
> > No, that does not work: the server is starting... and the client is connecting:
> >
> > ~/TMP/mpichgm_gccwssh/bin/mpirun.ch_gm --gm-tree-spawn -np 2 -machinefile ./mach256 /export/system/paraview/bin/pvserver --render-module=MPIRenderModule --use-offscreen-rendering
> > Listen on port: 11111
> > Waiting for client...
> > connected to port 11111
> > Client connected.
> >
> > ~/TMP/mpichgm_gccwssh/bin/mpirun.ch_gm -np 1 -machinefile ./mach256 /export/system/paraview/bin/pvclient --server-host=node256
> > Connect to node256:11111
> >
> > (and I can see gm ports in use, so the MPI communication between the two
> > pvserver processes is also open); but the view panel on the right is still
> > showing some old buffer content (moving another window over this part of
> > the ParaView client window produces "traces" of the window, etc.)
> > >
> > > If you have to use reverse connection (the server connects to the client),
> > > then use the following:
> > > 1) On the client: pvclient -rc
> > > 2) On the server: mpirun -np <number> pvserver --use-offscreen-rendering --render-module=MPIRenderModule -rc -ch=<clientname>
> > Same behaviour...
> > ~/TMP/mpichgm_gccwssh/bin/mpirun.ch_gm -np 1 -machinefile ./mach256 /export/system/paraview/bin/pvclient -rc
> > Listen on port: 11111
> > Waiting for server...
> > connected to port 11111
> > Server connected.
> >
> > ~/TMP/mpichgm_gccwssh/bin/mpirun.ch_gm --gm-tree-spawn -np 2 -machinefile ./mach256 /export/system/paraview/bin/pvserver --render-module=MPIRenderModule --use-offscreen-rendering -rc -ch=node256
> > Connect to node256:11111
> >
> > >
> > > The PVX file only seems to be necessary when working with a "fully
> > > distributed" system (e.g., separate data servers, render servers, and
> > > client). We've cracked some of the dark mysteries of the PVX file here,
> > > and if necessary I'm happy to share what little we know.
> > I tried different entries in the PVX, with no change/success. As I
> > understand it, the definitions for the different servers there are not
> > needed for MPI communication - the MPI server is just one server
> > (regardless of the number of MPI processes), while the different servers
> > in the PVX (declared by Machine Name="..." entries) are servers that
> > communicate over sockets, right? Maybe it's a more general problem, e.g.
> > some issue with the mpichgm/GM layer for Myrinet - I'm waiting for an
> > answer from Myricom support regarding ParaView on Myrinet/GM...
> >
> > The reason for using offscreen rendering is laziness (no X is installed
> > on the nodes), but the next nodes will probably not even have a graphics
> > card...
> > Thank you! Best regards, Stefan
> > --
> > Zentrale Dienste - Interdisziplinäres Zentrum für Wissenschaftliches
> > Rechnen der Universität Heidelberg - IWR - INF 368, 69120 Heidelberg
> > stefan.friedel at iwr.uni-heidelberg.de  Tel +49 6221 54-8240 Fax -5224
> > IWR: www.iwr.uni-heidelberg.de          HELICS: www.helics.uni-hd.de
>
> --
> Zentrale Dienste - Interdisziplinäres Zentrum für Wissenschaftliches
> Rechnen der Universität Heidelberg - IWR - INF 368, 69120 Heidelberg
> stefan.friedel at iwr.uni-heidelberg.de  Tel +49 6221 54-8240 Fax -5224
> IWR: www.iwr.uni-heidelberg.de          HELICS: www.helics.uni-hd.de


--
Randall Hand
Visualization Scientist,
ERDC-MSRC Vicksburg, MS
Homepage: http://www.yeraze.com