[Paraview] running paraview in parallel with a batch script/MPI
Pat Marion
pat.marion at kitware.com
Thu Feb 14 00:18:54 EST 2013
Right, the decomposition is up to the reader, so you need a reader that is
"parallel aware". If you're starting with a .pvtu file, then you should be
all set. If you look inside the .pvtu file, it should just be an XML file
that lists all the pieces, and the pieces are .vtu files. Each rank will read
a different subset of the .vtu files. Since pvbatch functions just like
pvserver, you can try your work interactively with pvserver first. In fact,
you can connect ParaView to a parallel pvserver, then paste the python script
line by line into ParaView's python console, and you should get the same
results.
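Since the piece list is plain XML, ordinary Python is enough to see which
.vtu files a .pvtu references. A minimal sketch, not ParaView-specific,
assuming the standard VTK XML parallel format with <Piece Source="...">
entries (the file name is taken from the script quoted below):

# listPieces.py
import xml.etree.ElementTree as ET

tree = ET.parse('duct.020000.pvtu')
for piece in tree.getroot().iter('Piece'):
    print(piece.get('Source'))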
You can color by process ID scalars using pvbatch. I have a script that I
use for testing which does exactly that. Notice that the script creates a
Sphere(); you could replace that with your reader.
# testParallelRender.py
print "importing paraview.simple..."
from paraview.simple import *
n_procs = servermanager.vtkProcessModule.GetProcessModule().GetNumberOfLocalPartitions()
print "number of ranks: ", n_procs
print "create pipeline..."
Sphere(ThetaResolution=100, PhiResolution=100)
ProcessIdScalars()
UpdatePipeline()
print "creating view..."
CreateRenderView()
print "create reps..."
Show()
lt = CreateLookupTable(RGBPoints=[0.0, 0, 0, 1, n_procs-1, 1, 0, 0],
                       ColorSpace="HSV")
SetDisplayProperties(ColorAttributeType=0, ColorArrayName="ProcessId",
                     LookupTable=lt)
print "render..."
Render()
GetActiveView().UseOffscreenRenderingForScreenshots = 0
print "writing..."
WriteImage("coloredSphere.png")
print "Done."
Pat
On Thu, Feb 14, 2013 at 3:16 AM, Cook, Rich <cook47 at llnl.gov> wrote:
> OK, so what I'm hearing is that the pvbatch process will use its MPI rank
> to decompose the data correctly.
> Does this decomposition work correctly for all data formats? I suspect
> not, so is there a way to confirm that decomposition is working for a
> particular type of data? In the GUI, I can paint the data with the process
> ID scalars filter. Any similar trick for pvbatch?
> Thanks!
> -- Rich
>
>
> On Feb 12, 2013, at 11:31 PM, Pat Marion wrote:
>
> Hi Rich,
>
> The command line will be:
>
> mpirun -np 1 /path/to/pvbatch /path/to/script.py
>
>
> The pvbatch executable is similar to pvserver, except that proc 0 doesn't
> wait for a client connection; instead, it reads the python script specified
> on the command line and executes it as if it were instructions from a
> client. The script will be read and interpreted on proc 0, but the
> processing will be carried out by all the satellite procs, just like
> pvserver.
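A quick way to see this from within a script is to print which rank the
python is interpreted on. A minimal sketch, assuming vtkProcessModule also
exposes GetPartitionId() next to the GetNumberOfLocalPartitions() used in
the test script above:

from paraview.simple import *
pm = servermanager.vtkProcessModule.GetProcessModule()
# with pvbatch this prints once, from proc 0, even though the filters you
# create execute on every rank
print("script interpreted on rank %d of %d"
      % (pm.GetPartitionId(), pm.GetNumberOfLocalPartitions()))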
>
> I'd recommend starting with a script that is very simple, like:
>
> # testSphere.py
>
> from paraview.simple import *
>
> Sphere()
> writer = XMLPPolyDataWriter(FileName='sphere.pvtp')
> writer.UpdatePipeline()
>
>
> Try getting that to work using -np 1, then try -np 2; you should find
> pieces of the sphere written by both processes.
>
> Pat
>
> p.s. If your data is an unstructured grid, then your writer would be an
> XMLPUnstructuredGridWriter. Also, I'm not sure if the name should be
> "XMLP..." or "XMLPartitioned..."
>
>
> On Wed, Feb 13, 2013 at 11:48 AM, Cook, Rich <cook47 at llnl.gov> wrote:
>
>> Hello, ParaView genii,
>>
>> I would like to run ParaView to view some data in parallel, using the
>> following script from a user as a basis. I'm not sure how to do this
>> right. I've never scripted ParaView before, and am thus clueless about how
>> to make it work. Can someone on this list tell me how ParaView scripting
>> with MPI works, or point me to a tutorial on your massive tutorial pages?
>> The data is very large and it would be great to decompose the data across
>> the cluster.
>>
>> Thanks!
>>
>> try: paraview.simple
>> except: from paraview.simple import *
>> paraview.simple._DisableFirstRenderCameraReset()
>>
>> filein = XMLPartitionedUnstructuredGridReader(
>> FileName=['/p/lscratchd/bodart1/test_para/duct.020000.pvtu'] )
>>
>> Slice1 = Slice( SliceType="Plane" )
>>
>> Slice1.SliceOffsetValues = [0.0]
>> Slice1.SliceType.Origin = [0.0,0.0,0.0]
>> Slice1.SliceType.Normal = [0.0,0.0,1.0]
>> Slice1.SliceType = "Plane"
>>
>> CleantoGrid1 = CleantoGrid()
>>
>>
>> CellDatatoPointData1 = CellDatatoPointData()
>>
>> w = XMLUnstructuredGridWriter()  # assumed: 'w' was never created in the original script
>> w.FileName = "test_slice.vtu"
>> w.UpdatePipeline()
>>
>>
>> --
>> ✐Richard Cook
>> ✇ Lawrence Livermore National Laboratory
>> Bldg-453 Rm-4024, Mail Stop L-557
>> 7000 East Avenue, Livermore, CA, 94550, USA
>> ☎ (office) (925) 423-9605
>> ☎ (fax) (925) 423-6961
>> ---
>> Information Management & Graphics Grp., Services & Development Div.,
>> Integrated Computing & Communications Dept.
>> (opinions expressed herein are mine and not those of LLNL)
>>
>>
>>