[Paraview] pvbatch vs. pvpython
Ganesh Vijayakumar
ganesh.iitm at gmail.com
Wed Aug 8 16:15:21 EDT 2012
hi!
I recently installed ParaView 3.12 with offscreen Mesa on an SGI cluster
with Intel compilers and SGI MPT. Using the same version of ParaView on my
local computer, I recorded a script in the Qt application with Python trace. I
was able to execute that script just fine on the cluster, on a similar
but larger case, using pvpython. However, I'm unable to use pvbatch. First
off, the cluster forces me to launch pvbatch with mpirun, saying that all MPI
applications must be started with mpirun. Even when I do this:
mpirun -np 1 pvbatch --use-offscreen-rendering saveImage.py
MPI: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
MPI: aborting job
MPI: Received signal 9
that termination message is all I get. Is something wrong with the way I'm using pvbatch?
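For context, the trace script is essentially of the following form (a minimal
sketch with a placeholder file name, not the actual saveImage.py):

from paraview.simple import *

# Placeholder input; the real script opens the simulation output files.
reader = OpenDataFile("case/output.vtu")
Show(reader)
ResetCamera()
Render()

# Write the current render view to an image file.
SaveScreenshot("frame0000.png")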
I'll be working on datasets of over 20 million cells, and I hope to use
multiple processors if that will help speed up the visualization. Please
note that I'm not running a ParaView server; I just intend to submit the
visualization as a batch job (roughly like the sketch below) on the same
cluster where I run the simulation.
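Something like this job script is what I have in mind (a sketch only; the
directives assume a PBS-style scheduler, and the node/core counts are
placeholders):

#!/bin/bash
#PBS -l select=2:ncpus=8:mpiprocs=8
#PBS -l walltime=01:00:00

cd $PBS_O_WORKDIR

# Run pvbatch in parallel with offscreen rendering, launched via mpirun
# as the cluster requires.
mpirun -np 16 pvbatch --use-offscreen-rendering saveImage.py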
ganesh