[Paraview] Client server problems in running 3.2.1 on SVA

Moreland, Kenneth kmorel at sandia.gov
Mon Jan 7 10:05:51 EST 2008


What happens when you run glxgears in an mpi job?

 

  mpirun -np 2 -hostlist "sva11 sva12" /bin/env DISPLAY=:0 glxgears

 

-Ken

 

________________________________

From: paraview-bounces+kmorel=sandia.gov at paraview.org [mailto:paraview-bounces+kmorel=sandia.gov at paraview.org] On Behalf Of Manjunath Sripadarao
Sent: Thursday, January 03, 2008 10:13 PM
To: paraview at paraview.org
Subject: Re: [Paraview] Client server problems in running 3.2.1 on SVA

 

Hi Ken,
 Yeah, I am using -np 2; that was a typo. I am using HP-MPI, HP's
implementation of MPI.

Thanks,
Manju

	 

	Shouldn't you be using -np 2 instead of -n2?  I don't see where you
	specified your MPI implementation, but I have never used -n# in lieu
	of -np #.

	 

	-Ken

	 

	
________________________________


	From: paraview-bounces+kmorel=sandia.gov at paraview.org
	[mailto:paraview-bounces+kmorel=sandia.gov at paraview.org] On Behalf Of
	Manjunath Sripadarao
	Sent: Thursday, January 03, 2008 8:08 AM
	To: paraview at paraview.org
	Subject: Re: [Paraview] Client server problems in running 3.2.1 on SVA

	 

	Hi Ken,
	 Thanks for the clarification. I think a little more information
	is needed, as I am running into a problem that I can't seem to
	resolve.

	I am using 3 machines of my cluster; all three are connected to
	3 displays. I want to use 2 of them as a multi-tile display with
	2 tiles, and I have the Paraview GUI on the third machine. All
	three have X servers running and are MPI enabled. So here is how
	I launch the jobs on the machines:

	$ mpirun -n2 -hostlist "sva11 sva12" /bin/env DISPLAY=:0 pvserver -tdx=2 -tdy=1

	and I launch the client

	$ paraview

	and then connect to sva11.
	
	My servers.pvsc looks like
	
	<Servers>
	  <Server name="builtin" resource="builtin:">
	    <ManualStartup/>
	  </Server>
	  <Server name="fw-sva10" resource="cs://sva10:11111">
	    <ManualStartup/>
	  </Server>
	  <Server name="fw-sva11" resource="cs://sva11:11111"> 
	    <ManualStartup/>
	  </Server>
	  <Server name="fw-sva12" resource="cs://sva12:11111">
	    <ManualStartup/>
	  </Server>
	</Servers>

	
	But the multi-tile window appears only on sva11, and it doesn't
	span the whole screen (which it used to do previously). It also
	doesn't seem like sva12 is doing anything; I suspect this
	because sva11 flickers when the client connects, but sva12
	remains as it is.

	If I do the reverse and connect the client to sva12, then the
	model window appears on sva12 and sva11 seems to do nothing.

	Is there anything I am missing?
	I am using Paraview 3.2.1.
	
	Thanks,
	Manju

	What you are trying to do does not work because ParaView is not
	supposed to run like that. When running in client/server mode,
	ParaView connects to process 0 of the server, and nothing else.
	All communication with the client goes through process 0, and
	process 0 of the server uses the MPI interconnect to pass data
	to and from the other nodes in the server.

	ParaView is implemented in this way for convenience and
	scalability. It is not scalable to have every process in the
	server connect to the client, because all communication would
	eventually have to go through the same network interface on the
	client side. Also, the MPI interconnect on the server is almost
	always faster than the socket communication between client and
	server.
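
	As a sketch of how this is meant to be driven (the hostnames and
	the -hostlist/DISPLAY usage are taken from the commands earlier
	in this thread; the exact mpirun options depend on your HP-MPI
	setup): launch the whole tiled server with a single mpirun, then
	connect one client to the host running process 0, e.g.

	$ mpirun -np 2 -hostlist "sva11 sva12" /bin/env DISPLAY=:0 pvserver -tdx=2 -tdy=1
	$ paraview        (then File -> Connect to sva11)

	Process 0 would normally land on sva11 here, so the client talks
	only to sva11; sva12 still renders its tile over MPI.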
	
	-Ken

	
	> -----Original Message-----
	> From: Manjunath Sripadarao [mailto: manjunaths at gmail.com]
	> Sent: Tuesday, December 18, 2007 11:05 PM
	> To: Moreland, Kenneth
	> Subject: Re: [Paraview] Client server problems in running 3.2.1 on SVA
	>
	> Hi Kenneth,
	>  Thanks for the reply.
	>
	> > Problem 1: Do not use mpirun with the paraview executable. The
	> > ParaView client is a serial application and should not be
	> > launched with mpirun.
	>
	> Ah...ok. I did launch paraview using mpirun.
	>
	> > Problem 2: Only launch the server with one mpirun command. Use
	> > the -np argument to mpirun to specify how many processes you
	> > want the server to run in (in your case, probably 2).  If your
	> > MPI environment is set up right, mpirun will automatically
	> > launch it on n2 and n3.  If not, either change your MPI
	> > configuration or provide a machines file to mpirun to specify
	> > which hosts to run on.  The instructions for doing that vary
	> > based on MPI implementations and installations.  Talk to your
	> > system administrators for more information.
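	> >
	> > For illustration, assuming HP-MPI's "mpirun -f <appfile>"
	> > style (each appfile line gives a host, a process count, and
	> > the command; the exact format differs between MPI
	> > implementations, so check your own documentation), a
	> > machines/app file launch could look roughly like this:
	> >
	> >   $ cat pvserver.appfile
	> >   -h n2 -np 1 /bin/env DISPLAY=:0 pvserver -tdx=2 -tdy=1
	> >   -h n3 -np 1 /bin/env DISPLAY=:0 pvserver -tdx=2 -tdy=1
	> >   $ mpirun -f pvserver.appfile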
	>
	> I did this too; I am using a machines file for this (which we
	> call an app file, but it is all the same). But how do you get
	> the client to "listen" to 2 or more servers?
	> Even though I start 2 or 3 servers, if I can't get the client
	> to listen to multiple servers then it won't connect to all of
	> them. If I use the -s option multiple times it only connects
	> to the last one, i.e., if I give
	> $ paraview -s=n2 -s=n3
	> only the n3 pvserver gets connected and the n2 pvserver quits.
	>
	> If I write a servers.pvsc file with a list of servers, how do
	> I get paraview to pick the file automatically and connect to
	> all the servers?
	>
	> Thanks in advance,
	> Manju
	>
	> > -Ken
	> >
	> >
	> > > -----Original Message-----
	> > > From: paraview-bounces+kmorel=sandia.gov at paraview.org
	> > > [mailto:paraview-bounces+kmorel=sandia.gov at paraview.org]
	> > > On Behalf Of Manjunath Sripadarao
	> > > Sent: Monday, December 17, 2007 11:00 PM
	> > > To: paraview at paraview.org
	> > > Subject: [Paraview] Client server problems in running 3.2.1 on SVA
	> > >
	> > > Hello,
	> > >  I am trying to run Paraview 3.2.1 in client/server mode.
	> > > Here is my setup: I have a cluster of workstations n[1-3].
	> > > I launch the client on n1 and reverse connect to my servers
	> > > on n2 and n3.
	> > > I am running the client like this
	> > > $[n1] mpirun <...other MPI options...> paraview -s=n2 -s=n3
	> > > and the servers like this
	> > > $[n2] mpirun <...other MPI options...> pvserver -rc --client-host=n1 -tdx=2 -tdy=1
	> > > and
	> > > $[n3] mpirun <...other MPI options...> pvserver -rc --client-host=n1 -tdx=2 -tdy=1
	> > > but paraview connects only to n3, not to n2.
	> > > I figured out from the list, and from corresponding with
	> > > Weiguang, that I should create a servers.pvsc file in
	> > > .config/ParaView/servers.pvsc and add
	> > > <Servers>
	> > >   <Server name="builtin" resource="builtin:">
	> > >     <ManualStartup/>
	> > >   </Server>
	> > >   <Server name="n2" resource="csrc://n2:11111">
	> > >     <ManualStartup/>
	> > >   </Server>
	> > >   <Server name="n3" resource="csrc://n3:11111">
	> > >     <ManualStartup/>
	> > >   </Server>
	> > > </Servers>
	> > >
	> > > I can't get the client to connect. When I remove the
	> > > --server=<nodename> option on the client, it does not read
	> > > the above file and does not connect. I have to manually say
	> > > file->connect and connect to a selected server, but once I
	> > > connect to one of them, it drops the connection to the other
	> > > server. Has the ability to connect to multiple servers been
	> > > disabled in 3.2.1?
	> > > I want to run one client and multiple servers to have a
	> > > multi-tiled display.
	> > > _______________________________________________
	> > > ParaView mailing list
	> > > ParaView at paraview.org
	> > > http://www.paraview.org/mailman/listinfo/paraview

	 

 
