Wow, that's pretty friggin' weird... lol.<br><br>Glad you got it working, though; I'll keep that trick with Save View Image in mind. We haven't run into that here, but we're working with SGI servers and Windows clients, and that opens a whole warehouse full of cans of worms that you don't want to deal with.
<br><br>Now, one thing we did run into was that sometimes we had to turn off the Squirt Subsample Rate on the client. We had a strange situation where the data would initially load just fine, and interacting with it was fine, but when you let go of the mouse button so the view returned to the full-resolution display, one quarter of the screen would keep showing the reduced-resolution image. Unchecking that option on the client fixed it (of course, that means full-resolution images are streamed back to the client; in our case that's not a big problem).
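<br><br>For the archives, the working sequence Stefan describes below boils down to roughly this (a sketch only; the process counts, machine file, and host names are specific to his cluster):<br><br># on the server: MPI render module with offscreen (software) rendering<br>mpirun -np <n> pvserver --render-module=MPIRenderModule --use-offscreen-rendering<br><br># on the client (or use -rc on the client and -ch=<clientname> on the server for a reverse connection)<br>mpirun -np 1 pvclient --server-host=<servername><br><br># then in the client, do a File -> Save View Image once; after that, loading and viewing data renders correctly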
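<br><br>And since the pvx file comes up again below: it only seems to be needed for a fully distributed setup (separate data/render servers), not for a plain MPI pvserver. For what it's worth, the little we've pieced together here suggests a machine file roughly along these lines (host names made up, and the exact elements may differ by version, so treat this as a guess rather than gospel):<br><br><?xml version="1.0" ?><br><pvx><br> <Process Type="server"><br>  <Machine Name="node001"/><br>  <Machine Name="node002"/><br> </Process><br></pvx><br><br>That's about the extent of the dark mysteries we've cracked; whether the Machine Name entries matter for anything beyond the socket-connected pieces (as Stefan asks below) is still one of them, but an MPI-only pvserver shouldn't need a pvx at all.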
<br><br><div><span class="gmail_quote">On 2/9/06, <b class="gmail_sendername">Stefan Friedel</b> <<a href="mailto:stefan.friedel@iwr.uni-heidelberg.de">stefan.friedel@iwr.uni-heidelberg.de</a>> wrote:</span><blockquote class="gmail_quote" style="border-left: 1px solid rgb(204, 204, 204); margin: 0pt 0pt 0pt 0.8ex; padding-left: 1ex;">
Hi Randall et al.,<br>we solved it... probably it is a bug? Anyway:<br><br>- start the server with: mpirun -np <n> pvserver --render-module=MPIRenderModule --use-offscreen-rendering<br>- start the client with: mpirun -np 1 pvclient<br>- do a File -> Save View Image (why? I don't know... but now:)<br>- load your data, view your data: after saving the first view image, everything works fine!?<br><br>Thanks for your help,<br>Best Regards, Stefan<br>
<br>> Hi Randall et al.,<br>> > In running a basic Client-Server model, the following seems to work:<br>> ><br>> > 1) On the server: mpirun -np <number> pvserver --use-offscreen-rendering<br>> > --render-module=MPIRenderModule
<br>> > 2) On the client: pvclient --server-host=<servername><br>> no, does not work: server is starting...and the client is connecting:<br>><br>> ~/TMP/mpichgm_gccwssh/bin/mpirun.ch_gm --gm-tree-spawn -np 2 -machinefile ./mach256 /export/system/paraview/bin/pvserver --render-module=MPIRenderModule --use-offscreen-rendering
<br>> Listen on port: 11111<br>> Waiting for client...<br>> connected to port 11111<br>> Client connected.<br>><br>> ~/TMP/mpichgm_gccwssh/bin/mpirun.ch_gm -np 1 -machinefile ./mach256 /export/system/paraview/bin/pvclient --server-host=node256
<br>> Connect to node256:11111<br>><br>> (and I can see gm ports in use, so the MPI communication between the two<br>> pvserver processes is also opened); but the view panel on the right is still<br>> showing some old buffer content (moving another window over this part of the
<br>> ParaView client window will produce "traces" of the window, etc.)<br>> ><br>> > If you have to use a reverse connection (the Server connects to the client),<br>> > then use the following:
<br>> > 1) On the client: pvclient -rc<br>> > 2) On the server: mpirun -np <number> pvserver --use-offscreen-rendering<br>> > --render-module=MPIRenderModule -rc -ch=<clientname><br>> Same behaviour...
<br>> ~/TMP/mpichgm_gccwssh/bin/mpirun.ch_gm -np 1 -machinefile ./mach256 /export/system/paraview/bin/pvclient -rc<br>> Listen on port: 11111<br>> Waiting for server...<br>> connected to port 11111<br>> Server connected.
<br>><br>><br>> ~/TMP/mpichgm_gccwssh/bin/mpirun.ch_gm --gm-tree-spawn -np 2 -machinefile ./mach256 /export/system/paraview/bin/pvserver --render-module=MPIRenderModule --use-offscreen-rendering -rc -ch=node256<br>
> Connect to node256:11111<br>><br>> ><br>> > the PVX file only seems to be necessary when working with a "fully<br>> > distributed" system (e.g., separate Data Servers, Render Servers, and
<br>> > Client). We've cracked some of the dark mysteries of the PVX file here, and<br>> > if necessary I'm happy to share what little we know.<br>> I tried different entries in the pvx, no change/success. As I understand the
<br>> definitions for different servers here are not needed for MPI communication -<br>> the MPI server is just one server (regardless of the number of MPI processes),<br>> but different servers in the pvx (declared by Machine Name="..." entries) are
<br>> servers which communicate over sockets, right? Maybe it's a more general<br>> problem, e.g. some issue with the mpichgm/gm layer of Myri - I'm waiting for an<br>> answer from Myri support regarding ParaView on Myri/gm...
<br>><br>> The reason for using offscreen rendering is: laziness (no X is installed on the<br>> nodes) but probably the next nodes will not even have any graphics card...<br>> Thank you!, Best Regards, Stefan<br>
<br><br>--<br>Zentrale Dienste - Interdisziplinäres Zentrum für Wissenschaftliches<br>Rechnen der Universität Heidelberg - IWR - INF 368, 69120 Heidelberg<br><a href="mailto:stefan.friedel@iwr.uni-heidelberg.de">
stefan.friedel@iwr.uni-heidelberg.de</a> Tel +49 6221 54-8240 Fax -5224<br>IWR: <a href="http://www.iwr.uni-heidelberg.de">www.iwr.uni-heidelberg.de</a> HELICS: <a href="http://www.helics.uni-hd.de">www.helics.uni-hd.de
</a>
<br><br><br></blockquote></div><br><br clear="all"><br>-- <br>Randall Hand<br>Visualization Scientist, <br>ERDC-MSRC Vicksburg, MS<br>Homepage: <a href="http://www.yeraze.com">http://www.yeraze.com</a>