[Paraview] Using paraview with pvserver

Utkarsh Ayachit utkarsh.ayachit at kitware.com
Wed Mar 6 13:41:25 EST 2013


Patrick,

With the legacy VTK format, ParaView does the following:

1. for unstructured data (vtkPolyData, vtkUnstructuredGrid, etc.),
ParaView reads the entire file on the root node and then partitions
and distributes the data among all processes, so that the rest of the
processing (filters, etc.) can distribute the work across processes.

2. for image data and other structured data, however, ParaView reads
the entire file on all processes and then "crops" the data to a subset
to distribute further processing across processes. (That is why each of
your 4 pvserver ranks ends up reading the full ~8 GB of a 1024^3 double
volume.) This is lame and we are going to change this in the near
future to do something similar to what we do for unstructured data.

Note that this is specific to legacy VTK files. If you write out an XML
VTK file (.vti) instead, then ParaView smartly reads only the portions
relevant to each process in parallel.
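
For reference, the one-time conversion can be scripted with VTK itself.
Here is a rough, untested C++ sketch (it uses your grad.vtk as the input
and a made-up output name, and the converting process still has to hold
the full ~8 GB volume in memory once):

#include <vtkSmartPointer.h>
#include <vtkStructuredPointsReader.h>
#include <vtkXMLImageDataWriter.h>

int main()
{
  // Read the legacy structured-points file (binary, big endian).
  vtkSmartPointer<vtkStructuredPointsReader> reader =
    vtkSmartPointer<vtkStructuredPointsReader>::New();
  reader->SetFileName("grad.vtk");

  // Rewrite it as XML image data (.vti), which the server processes can
  // then read in a distributed fashion as described above.
  vtkSmartPointer<vtkXMLImageDataWriter> writer =
    vtkSmartPointer<vtkXMLImageDataWriter>::New();
  writer->SetFileName("grad.vti");
  writer->SetInputConnection(reader->GetOutputPort());
  writer->Write();

  return 0;
}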

Hopefully, that clarifies things a bit.

Utkarsh


On Tue, Mar 5, 2013 at 10:47 AM, Patrick Begou
<Patrick.Begou at hmg.inpg.fr> wrote:
> Thanks Utkarsh for your helpful answer.
>
> The "Process ID Scalars " filter on the sphere shows that the configuration
> is working with this test.
> The file I'm trying to read is a simple test file in legacy format:
>
> kareline: head -10 grad.vtk
> # vtk DataFile Version 2.4
> Fichier test genere par run.exe
> BINARY
> DATASET STRUCTURED_POINTS
> DIMENSIONS 1024 1024 1024
> ORIGIN 0 0 0
> SPACING 1 1 1
> POINT_DATA 1073741824
> SCALARS test_values double 1
> LOOKUP_TABLE default
> and the binary data follows (big endian).
>
>
> Each point value is simply:
> for (i=0; i< nx; i++)
>   for (j=0; j< ny; j++)
>     for (k=0; k< nz; k++)
>        tab[i][j][k]=i+j+k;
>
>
> My aim is to build a solution to visualize our large CFD datasets.
> I can visualize this field if I launch pvserver in sequential mode but it's
> quite slow...
> With 4 processes I can't visualize it.
>
> Patrick
>
>
>
>
> Utkarsh Ayachit wrote:
>>
>> Patrick,
>>
>> To test whether the parallel version is built and running/rendering
>> correctly, try this: simply create a Sphere from the Sources menu
>> (+Apply). Next, create the "Process ID Scalars" filter from the Filters
>> menu. You should see the sphere colored in 4 colors. If that works,
>> all is well. Also, go to the Settings dialog; on the Render View | Server
>> page, set the Remote Render Threshold to 0. That will ensure parallel
>> rendering is always used.
>>
>> The 100% CPU usage is expected. That's an OpenMPI thing. Refer to
>> http://www.open-mpi.org/community/lists/users/2009/04/9017.php
>>
>> What type of datafile are you trying to load?
>>
>> Utkarsh
>>
>> On Tue, Mar 5, 2013 at 9:22 AM, Patrick Begou <Patrick.Begou at hmg.inpg.fr>
>> wrote:
>>>
>>> Hi,
>>>
>>> I posted this mail on the VTK forum (as I understood that ParaView is
>>> built on VTK), but it seems it was not the right place and they suggested
>>> that I use this one.
>>>
>>> I am trying to set up a client/server configuration with ParaView 3.98.1.
>>> The client is an openSUSE desktop running the binary distribution of
>>> ParaView 3.98.1 for x86_64.
>>> The server is an EL6 cluster frontend with a basic graphics board (running
>>> X11), 12 cores and 20 TB of storage. I've built ParaView 3.98.1 from
>>> source for this host with OpenMPI.
>>>
>>> 1) running ParaView on the server from the client over an "ssh -XC"
>>> connection works
>>> 2) running a sequential pvserver on the server and ParaView on the client
>>> works
>>>
>>> but running pvserver in parallel on the server doesn't.
>>> If I launch:
>>> mpirun -np 4 /share/apps/paraView-server-3.98.1/bin/pvserver
>>> (with or without --use-offscreen-rendering) it crashes after loading the
>>> data, while rendering (I think).
>>>
>>> On the client the message is:
>>> ERROR: In
>>>
>>> /home/utkarsh/Dashboards/MyTests/NightlyMaster/ParaViewSuperbuild-Release/paraview/src/paraview/VTK/Parallel/Core/vtkSocketCommunicator.cxx,
>>> line 812
>>> vtkSocketCommunicator (0x177a380): Could not receive tag. 188969
>>>
>>> On the server, the message is:
>>> mpirun noticed that process rank 2 with PID 25736 on node kareline exited
>>> on
>>> signal 9 (Killed).
>>>
>>> I also noticed some strange behavior:
>>> 1) after selecting the file to read and before clicking the Apply
>>> button, 3 of the 4 pvserver processes use 100% CPU.
>>> 2) after clicking the Apply button, each of the 4 pvserver processes uses
>>> the same amount of memory as the sequential run (8 GB each).
>>> So I think something is wrong with my parallel build.
>>>
>>> Thanks for your help
>>>
>>> Patrick
>>>
>>> Details of my install:
>>>
>>> I've built ParaView 3.98 from source following
>>> http://paraview.org/Wiki/ParaView/ParaView_And_Mesa_3D:
>>>
>>> 1) build of Mesa-7.11.2 with:
>>> ./configure --with-driver=osmesa
>>> --prefix=/share/apps/paraView-server-3.98.1
>>> make
>>> make install
>>>
>>> 2) build ParaView with:
>>> mkdir ParaView-3.98.1-build
>>> cd ParaView-3.98.1-build
>>> ccmake28 -D PARAVIEW_BUILD_QT_GUI:BOOL=OFF -D VTK_USE_X:BOOL=OFF -D
>>> VTK_OPENGL_HAS_OSMESA:BOOL=ON
>>> /home/makerpm/PARAVIEW/ParaView-3.98.1-source
>>>
>>> and set
>>>   PARAVIEW_BUILD_QT_GUI = OFF
>>>   OPENGL_INCLUDE_DIR = /share/apps/paraView-server-3.98.1/include
>>>   OPENGL_gl_LIBRARY =
>>>   OPENGL_glu_LIBRARY = /share/apps/paraView-server-3.98.1/lib/libGLU.so
>>>   VTK_OPENGL_HAS_OSMESA = ON
>>>   OSMESA_INCLUDE_DIR = /share/apps/paraView-server-3.98.1/include
>>>   OSMESA_LIBRARY = /share/apps/paraView-server-3.98.1/lib/libOSMesa.so
>>>   VTK_USE_X = OFF
>>>   PARAVIEW_USE_MPI=ON
>>>
>>> make
>>> make install
>>>
>>> --
>>> ===============================================================
>>> |  Equipe M.O.S.T.         | http://most.hmg.inpg.fr          |
>>> |  Patrick BEGOU           |       ------------               |
>>> |  LEGI                    | mailto:Patrick.Begou at hmg.inpg.fr |
>>> |  BP 53 X                 | Tel 04 76 82 51 35               |
>>> |  38041 GRENOBLE CEDEX    | Fax 04 76 82 52 71               |
>>> ===============================================================
>>>
>
>
>
> --
> ===============================================================
> |  Equipe M.O.S.T.         | http://most.hmg.inpg.fr          |
> |  Patrick BEGOU           |       ------------               |
> |  LEGI                    | mailto:Patrick.Begou at hmg.inpg.fr |
> |  BP 53 X                 | Tel 04 76 82 51 35               |
> |  38041 GRENOBLE CEDEX    | Fax 04 76 82 52 71               |
> ===============================================================
>

