[Paraview] Connecting to Catalyst-Enabled Simulation Running on Remote Cluster
Andy Bauer
andy.bauer at kitware.com
Tue May 19 11:09:31 EDT 2015
Hi Timo,
There are a couple of extra steps when connecting to a Catalyst-enabled
simulation that has been launched through a batch job on a cluster. It
"should" work the same way as running pvserver itself through a batch job
and connecting to that. The basic setup is an executable/script running on
the login node that the batch job connects to; your GUI connects to the
login node, and that executable/script forwards the ports through to your
pvserver. If your pvserver is running on the login node, it will then have
to use the same mechanism to connect to your Catalyst-enabled simulation.
There may be some emails or something on the wiki with information on the
connection executable/script, but I haven't tried any of that in several
years, so I'm not sure where it would be. Maybe someone else who has done
this more recently will chime in.
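Roughly, the plumbing would look like the sketch below. The hostnames and
the simulation binary are placeholders, and I'm assuming your Catalyst
adaptor connects to localhost on port 22222, so treat it as a sketch
rather than a recipe:
  # on your workstation: forward the GUI port to the login node
  ssh -L 11111:localhost:11111 login.cluster
  # on the login node: pvserver listens on 11111 for the GUI by default;
  # connecting to Catalyst in the GUI makes it listen on 22222 as well
  pvserver &
  # in the batch job, on the compute node: forward the Catalyst port back
  # to the login node, then start the simulation
  ssh -N -L 22222:localhost:22222 login.cluster &
  mpirun ./simulation_with_catalyst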
Good luck and let us know how it goes.
Andy
On Mon, May 18, 2015 at 8:48 AM, Timo Oster <timo.oster at ovgu.de> wrote:
> Hi all,
>
> in an effort to enable live visualization of our simulation code, I have
> written a Catalyst adaptor for it. The live visualization is working
> great when the ParaView client runs on the same machine as the
> simulation, even when the simulation runs in parallel using MPI.
>
> Now I want to do live visualization of a simulation running on a remote
> cluster. I am able to get this to work for the simulation running on the
> login node of the cluster:
>
> 1. Set up an ssh tunnel for port 11111 to the cluster:
> ssh -L 11111:localhost:11111 server
> 2. In the shell that opens, start a ParaView server (pvserver) in the
> background with '&'
> 3. Run the simulation with mpirun (simulation runs only on login node)
> 4. Start my local ParaView client and connect to localhost:11111, which
> reaches the remote pvserver through the ssh tunnel
> 5. In the client, connect to Catalyst (port 22222)
> 6. A provider for the simulation is created and the live visualization
> works
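> 
> For reference, the command side of this working setup looks roughly like
> the following (the host name and process count are just placeholders):
>   ssh -L 11111:localhost:11111 server
>   # on the login node, in the shell that opens:
>   pvserver &                  # listens on port 11111 by default
>   mpirun -np 4 ./simulation   # placeholder name for our simulation binary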
>
> Now I want to do the same for simulations started on the remote cluster
> via the batch job system. In this scenario, the parallel processes of
> the simulation will run on different (randomly chosen) nodes on the
> cluster. How do I go about getting a connection from my local client to
> the Catalyst instances running on those nodes?
>
> I imagine I will need to set up ssh tunnels from the nodes to the login
> node where the pvserver is running. I've tried adding an ssh tunnel line
> to the job script that is executed when the batch job starts. I've tried
> forwarding and reverse-forwarding port 22222 (ssh -L and ssh -R) to no
> avail. The best I get is "ERROR: In
> /.../ParaView_git/VTK/Common/System/vtkSocket.cxx, line 206
> vtkServerSocket (0x18e0930): Socket error in call to bind. Address
> already in use."
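> 
> For completeness, the tunnel lines I tried in the job script looked
> roughly like this (the login node name is a placeholder and the exact
> flags are from memory):
>   ssh -N -L 22222:localhost:22222 loginnode &   # forward variant
>   ssh -N -R 22222:localhost:22222 loginnode &   # reverse variant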
>
> My knowledge of networking and ssh is limited, so any pointers on how I
> would go about this are greatly appreciated.
>
> Regards,
>
> Timo