FW: [Paraview] building paraview3 beta with openmpi?
Berk Geveci
berk.geveci at kitware.com
Mon May 7 10:39:26 EDT 2007
It almost looks like pvserver is not compiled with MPI and mpiexec is
starting 4 completely independent pvservers. Then the client is
probably connecting to just one of those. What do you see when you look
at the processes after mpiexec? Are there 4 pvserver processes? Also, what
happens if you create a sphere and apply Process Id Scalars? Do you
see 4 pieces or just 1?
-berk
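[A quick way to check the first question, i.e. how many pvserver processes mpiexec actually started, is to count matching entries in the process table. This is a sketch assuming a Unix-like system with `ps` available; the process name "pvserver" is taken from the thread.]

```python
import subprocess

def count_processes(name):
    """Count running processes whose command name matches `name` exactly.

    `ps -e -o comm=` prints one bare command name per line on Linux and
    most Unix-like systems.
    """
    out = subprocess.run(["ps", "-e", "-o", "comm="],
                         capture_output=True, text=True, check=True).stdout
    return sum(1 for line in out.splitlines() if line.strip() == name)

# After `mpiexec -n 4 pvserver`, this should report 4 if MPI really
# launched four ranks rather than four independent serial servers.
print(count_processes("pvserver"))
```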
> From: W. Bryan Smith [mailto:wbsmith at gmail.com]
> Sent: Friday, May 04, 2007 11:33 AM
> To: Moreland, Kenneth
> Subject: Re: [Paraview] building paraview3 beta with openmpi?
>
>
>
> so i now tried this:
>
> python -c "import os; os.system('mpiexec -n 4 pvserver')"
>
> and i still get the same error as shown below. but now, it
> looks like when i interact with the data, 2 of my 4 cpus are
> being used (the system is a dual opteron, so 4 cores total).
> still, when actually computing the geometry, only one cpu
> is being used, while in paraview2.6 i see all 4 processes as
> active when the geometry is being computed. maybe i am
> comparing apples to oranges, but it seems like something's
> not quite right here.
>
> thanks,
> bryan
>
>
> On 5/4/07, W. Bryan Smith <wbsmith at gmail.com> wrote:
>
> i tried setting up the server cs://localhost using
> mpiexec -n 4 pvserver
>
> and now i get this error:
> ERROR: In
> /local2/ParaView3/ParaView3/Servers/Common/vtkProcessModuleConnectionManager.cxx,
> line 180
> vtkProcessModuleConnectionManager (0x4149d70): Failed to
> set up server socket.
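[A common cause of "Failed to set up server socket" is that the server's listening port is already held, for example by an earlier pvserver that never exited. A minimal check, assuming pvserver's default port of 11111:]

```python
import socket

def port_free(port, host="localhost"):
    """Return True if a TCP socket can be bound to (host, port)."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        try:
            s.bind((host, port))
            return True
        except OSError:
            return False

# 11111 is pvserver's default listening port.
print(port_free(11111))
```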
>
> On 5/4/07, W. Bryan Smith <bryan at ncmir.ucsd.edu> wrote:
>
> ok, thanks. i will look into how to do that and post back if i have
> any questions. thanks for the quick response.
>
> bryan
>
> On 5/4/07, Moreland, Kenneth <kmorel at sandia.gov> wrote:
>
>
>
> Unlike ParaView 2, the ParaView 3 client application cannot be run in
> parallel. Instead, you have to run pvserver in parallel and connect to
> that. In this mode, you still get all the parallel features of ParaView.
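[The client/server workflow Ken describes can be sketched as a two-step session; the host and port here are assumptions, with 11111 being pvserver's default:]

```shell
# Step 1: start the parallel server (4 MPI ranks sharing one socket).
mpiexec -n 4 pvserver

# Step 2: in a separate terminal, start a single (serial) client and
# connect to cs://localhost:11111 via File -> Connect in the GUI.
paraview
```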
>
>
>
> -Ken
>
>
>
>
> ________________________________
>
>
> From: paraview-bounces+kmorel=sandia.gov at paraview.org
> [mailto:paraview-bounces+kmorel=sandia.gov at paraview.org] On
> Behalf Of W. Bryan Smith
> Sent: Friday, May 04, 2007 10:25 AM
> To: paraview at paraview.org
> Subject: [Paraview] building paraview3 beta with openmpi?
>
>
>
>
> hi,
>
> i successfully compiled paraview3 beta with openmpi 1.0.2 by setting
>
> SET(CMAKE_C_COMPILER mpicc)
> SET(CMAKE_CXX_COMPILER mpicxx)
>
> in my CMakeLists.txt file, and then running cmake like this:
>
> cmake -DVTK_USE_MPI:BOOL=ON \
>   -DVTK__MPIRUN_EXE:FILEPATH=/path_to_openmpi/openmpi1.0.2/bin/ \
>   /path_to_paraview3_src/ParaView3/
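[As an aside, the flag as quoted contains a double underscore (`VTK__MPIRUN_EXE`) and points at a directory rather than an executable. The intended setting was likely the following; the variable name is taken from VTK's MPI options of that era, and the exact mpirun filename is an assumption:]

```shell
cmake -DVTK_USE_MPI:BOOL=ON \
      -DVTK_MPIRUN_EXE:FILEPATH=/path_to_openmpi/openmpi1.0.2/bin/mpirun \
      /path_to_paraview3_src/ParaView3/
```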
>
> and when i try to run
>
> mpiexec -n 4 ./paraview
>
> from within my paraview3 bin directory, i get 4 paraview windows rather
> than a single window running in parallel mode. i have made sure the path
> to the openmpi bin is on the top of my path, and the path to openmpi lib is
> on the top of my ld_library_path environment variable, but still no luck.
> i have no trouble running paraview2.6 with this same command, built with
> the same modifications to CMakeLists.txt and the cmake command line.
>
> any help much appreciated,
> bryan
>
> _______________________________________________
> ParaView mailing list
> ParaView at paraview.org
> http://www.paraview.org/mailman/listinfo/paraview
>
>
--
Berk Geveci
Kitware Inc.
28 Corporate Drive
Clifton Park, NY, 12065