[Paraview] Paraview client-Server
Randall Hand
randall.hand at gmail.com
Fri Aug 11 10:06:39 EDT 2006
Well, in this case we're not tiling it either.

Upon some further digging, it looks like it was just a version mismatch
problem. (Probably should add a check for that at startup.) I was running
CVS on the cluster and 2.4.4 on the client; using 2.4.4 on both works
beautifully.
Now for the next question :) PV_USE_TRANSMIT: does that still work?

I have a file that only exists on the head node (process 0), but when I try
to open it (with PV_USE_TRANSMIT set to 1 on all nodes) I get errors about
"unable to open PLY file" on all the processors, and it seems to lock up.
On 8/10/06, Sean Ziegeler, Contractor <seanzig.ctr at navo.hpc.mil> wrote:
>
> Randall,
> As you probably know, our configuration is very similar to yours in that
> only the head node(s) have external access. We routinely use
> client/server mode successfully in conjunction with parallel
> processing. As stated by others, you simply have to force the first
> rank (#0) MPI process to run on a head node. If that is your only
> concern, you should be ok.
>
> The difference, however, is that we are using MPIRenderModule (we aren't
> tiling the display). You'll want to be sure IceTDesktop doesn't
> deviate, but from the other responses on the list, I'm guessing not.
>
> -Sean
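Right; for the archives, here's the sort of machinefile trick we use to force rank 0 onto the head node (a sketch only: it assumes an MPICH-style mpirun where the first machinefile entry receives rank 0, and the hostnames are placeholders for the cluster's real names):

```shell
# Sketch (MPICH-style assumption): the first host listed in the
# machinefile receives MPI rank 0, so list the head node first.
# "head", "node01", "node02" are placeholder hostnames.
cat > machines.txt <<'EOF'
head
node01
node02
EOF

# Confirm which host would get rank 0 (prints: head)
head -n 1 machines.txt

# The launch (not run here) would then be something like:
#   mpirun -np 12 -machinefile machines.txt pvserver -rc
```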
>
> On Thu, 2006-08-10 at 12:10, Berk Geveci wrote:
> > Only the first node has to be connected to the client. They talk to
> > each other with MPI.
> >
> > On 8/10/06, Randall Hand <randall.hand at gmail.com> wrote:
> > Yes, this is the tile cluster (plasma).
> >
> > We brought it up with our admins onsite, and they don't think
> > that setting up NAT would be any big problem, so they're
> > looking into it now. What we want to do is use the
> > IceTDesktop render mode to parallel process across our 12
> > nodes with the display piped to a client (probably windows)
> > that's not part of the cluster. It was my understanding (and
> > experience) that in this setup (mpirun -np 12 pvserver -rc)
> > each pvserver process will want an independent connection
> > to the client, which would require full network connectivity
> > if the client is on any host other than the head node.
> >
> > Am I mistaken?
> >
> > On 8/10/06, Andy Cedilnik <andy.cedilnik at kitware.com> wrote:
> > Hello,
> >
> > Actually, as far as I remember there is no need for connectivity
> > from the satellite nodes. I am pretty sure Randall's setup does
> > not use NAT, so there is no way his satellite nodes can access
> > the Internet. Randall, this is the tile cluster?
> >
> > But, again, I see no reason for the satellite nodes to access
> > the client. They will try to access the render server nodes if
> > you do render/data server separation.
> >
> > Andy
> >
> > Wylie, Brian wrote:
> > > Sandia uses reverse connect on all of our cluster deployments.
> > >
> > > Even though you can't 'see' the cluster nodes from outside, you can
> > > often 'see' the outside from a cluster node.
> > >
> > > If a cluster node cannot ping an external IP, then perhaps install
> > > NAT on the cluster? (I'm a bit out of my area here... but that's
> > > what our administrators have done when there was an issue.)
> > >
> > > If you want, I can forward your email to our cluster folks...
> > >
> > > Brian Wylie - Org 1424
> > > Sandia National Laboratories
> > > MS 0822 - Building 880/A1-J
> > > (505)844-2238 FAX(505)845-0833
> > >
> > >
> > >
> > >
> >
> > ------------------------------------------------------------------------
> > > *From:* paraview-bounces+bnwylie=sandia.gov at paraview.org
> > > [mailto:paraview-bounces+bnwylie=sandia.gov at paraview.org]
> > > *On Behalf Of* Randall Hand
> > > *Sent:* Wednesday, August 09, 2006 1:46 PM
> > > *To:* Paraview List
> > > *Subject:* [Paraview] Paraview client-Server
> > >
> > > On our cluster, only the head node has true internet
> > > connectivity. The remaining cluster nodes are on an internal
> > > network with no visibility except to each other & the head node.
> > >
> > > In this configuration, is there any way to run ParaView in
> > > parallel-server mode to a client anywhere other than the head
> > > node? I would presume that there would need to be some kind of
> > > relay on the head node to make this happen.
> > >
> > > This came up a few months ago with Kitware, but I'm kinda curious
> > > if anyone else out there has a different solution (/poke Bryan,
> > > Kenneth, *@doe :) ).
> > >
> > > --
> > > ----------------------------------------
> > > Randall Hand
> > > Visualization Scientist
> > > ERDC MSRC-ITL
> > >
> > >
> >
> ------------------------------------------------------------------------
> > >
> > > _______________________________________________
> > > ParaView mailing list
> > > ParaView at paraview.org
> > > http://www.paraview.org/mailman/listinfo/paraview
> > >
> >
> >
> >
> >
> >
> > --
> > ----------------------------------------
> > Randall Hand
> > Visualization Scientist
> > ERDC MSRC-ITL
> >
> >
> >
> >
> >
>
>
--
----------------------------------------
Randall Hand
Visualization Scientist
ERDC MSRC-ITL