[Paraview] pvbatch vs python shell (3.6.1): pvbatch crashes

Weirs, V Gregory vgweirs at sandia.gov
Tue Oct 20 11:38:58 EDT 2009


Hi Jonathan,
As you have probably gathered, there are two interfaces to the python scripting: servermanager, which was in 3.4, and simple, which is new in 3.6. I haven't had any success with backwards compatibility, despite the "compatibility mode", and your script mixes servermanager and simple calls. I don't know exactly why you get different results in pvbatch and the GUI, but I would start by replacing the servermanager commands with their simple equivalents. (Note that servermanager is still there in 3.6 and later, and sometimes you need it, but mostly you get to it through simple.) In particular (a combined sketch follows the list):
1.) I don't think you need the connection line
2.) Remove the CreateRenderView line; an equivalent is handled by item 4 below
3.) rep = Show(contour)  # replace the CreateRepresentation line
4.) view = Render() # replace the ResetCamera and StillRender lines
5.) WriteImage("./foo.png")  # replace the view.WriteImage line
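
Putting those together, the whole script in the simple interface would look something like the sketch below. This is untested on my end; the file name, array name, and isosurface value are just carried over from your script.

from paraview.simple import *

reader = XMLPartitionedRectilinearGridReader(FileName="draping.0030.bin.pvtr")
pointData = CellDatatoPointData(Input=reader)
contour = Contour(pointData)
contour.ContourBy = 'density'
contour.Isosurfaces = [100]

rep = Show(contour)      # creates a representation in the active view
view = Render()          # creates the render view if needed, then renders
WriteImage("./foo.png")  # writes the active view; format guessed from extension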

Under the hood, the key changes are that paraview now uses the active filter, representation, or view when you don't specify one, and creates a representation or view for you if you need one but don't have it yet. That is why you don't have to create the view or representation yourself: Show() and Render() create them.
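
For example, in a fresh pvpython session (a toy sketch, nothing to do with your data; GetActiveSource and GetActiveView are in the simple module, if I remember the 3.6 layout right):

from paraview.simple import *

sphere = Sphere()        # a new source becomes the active source
Show()                   # no arguments: shows the active source, creating
                         # a representation (and a view) as needed
Render()                 # renders the active view
print GetActiveSource()  # the Sphere
print GetActiveView()    # the render view created above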

I am not sure if these changes will actually fix the errors you list, but that is where I would start.
Later, eh.
Greg

On 10/20/09 6:57 AM, "Jonathan Dursi" <ljdursi at scinet.utoronto.ca> wrote:

Hi:

I'm just starting to learn the python scripting interface to ParaView 3.6.1, taking bits and pieces from various tutorial slides scattered over the web. I've managed to put together something that works fine in the python shell on the client side, but when I run the same script with pvbatch it crashes. What am I doing wrong?

I cut down the script; the following is what I've been trying to run
with pvbatch.

from paraview.simple import *

sm = servermanager

connection = sm.Connect()
view = sm.CreateRenderView()

reader = XMLPartitionedRectilinearGridReader(FileName="draping.0030.bin.pvtr")
pointData = CellDatatoPointData(Input=reader)
contour = Contour(pointData)
contour.ContourBy='density'
contour.Isosurfaces=[100]
rep = sm.CreateRepresentation(contour,view)

view.ResetCamera()
view.StillRender()
view.WriteImage("./foo.png", "vtkPNGWriter")

When I run this (modulo the sm.Connect()) in the python shell on the
client (server launched with mpirun -np 80 pvserver
--use-offscreen-rendering) it produces what I'd expect.

But run with pvbatch, it dies pretty convincingly:

$  mpirun -np 80 pvbatch --use-offscreen-rendering ../foo.py
Process id: 0 >> ERROR: In
/scinet/gpc/src/ParaView/ParaView3/Servers/ServerManager/vtkSMProxyManager.cxx,
line 327
vtkSMProxyManager (0x794c210): No proxy that matches: group=views and
proxy=BarChartView were found.

Process id: 0 >> ERROR: In
/scinet/gpc/src/ParaView/ParaView3/Servers/ServerManager/vtkSMProxy.cxx,
line 2032
vtkSMComparativeViewProxy (0x8bdcab0): Failed to create subproxy:
BarChartView

Process id: 0 >> ERROR: In
/scinet/gpc/src/ParaView/ParaView3/Servers/ServerManager/vtkSMProxyManager.cxx,
line 327
vtkSMProxyManager (0x794c210): No proxy that matches: group=views and
proxy=XYPlotView were found.

Process id: 0 >> ERROR: In
/scinet/gpc/src/ParaView/ParaView3/Servers/ServerManager/vtkSMProxy.cxx,
line 2032
vtkSMComparativeViewProxy (0x8d60080): Failed to create subproxy: XYPlotView

vtkFileSeriesReader : [ ...........]
vtkFileSeriesReader(1) : [ ...........]
vtkFileSeriesReader(4) : [ ...........]
vtkFileSeriesReader(9) : [ ...........]
vtkFileSeriesReader(24) : [ ...........]
vtkMPIMoveData : [ ...........]
rank 55 in job 1  gpc-f101n049_40245   caused collective abort of all ranks
   exit status of rank 55: killed by signal 9
rank 54 in job 1  gpc-f101n049_40245   caused collective abort of all ranks
   exit status of rank 54: killed by signal 11
rank 51 in job 1  gpc-f101n049_40245   caused collective abort of all ranks
   exit status of rank 51: killed by signal 11
rank 50 in job 1  gpc-f101n049_40245   caused collective abort of all ranks
   exit status of rank 50: killed by signal 11
rank 49 in job 1  gpc-f101n049_40245   caused collective abort of all ranks
   exit status of rank 49: killed by signal 11
rank 48 in job 1  gpc-f101n049_40245   caused collective abort of all ranks
   exit status of rank 48: killed by signal 11
rank 16 in job 1  gpc-f101n049_40245   caused collective abort of all ranks
   exit status of rank 16: killed by signal 9
rank 39 in job 1  gpc-f101n049_40245   caused collective abort of all ranks
   exit status of rank 39: killed by signal 11
rank 38 in job 1  gpc-f101n049_40245   caused collective abort of all ranks
   exit status of rank 38: killed by signal 9
rank 33 in job 1  gpc-f101n049_40245   caused collective abort of all ranks
   exit status of rank 33: killed by signal 11
rank 32 in job 1  gpc-f101n049_40245   caused collective abort of all ranks
   exit status of rank 32: killed by signal 9
rank 78 in job 1  gpc-f101n049_40245   caused collective abort of all ranks
   exit status of rank 78: killed by signal 11
rank 77 in job 1  gpc-f101n049_40245   caused collective abort of all ranks
   exit status of rank 77: killed by signal 11
rank 75 in job 1  gpc-f101n049_40245   caused collective abort of all ranks
   exit status of rank 75: killed by signal 11
rank 74 in job 1  gpc-f101n049_40245   caused collective abort of all ranks
   exit status of rank 74: killed by signal 11
rank 73 in job 1  gpc-f101n049_40245   caused collective abort of all ranks
   exit status of rank 73: killed by signal 11
rank 72 in job 1  gpc-f101n049_40245   caused collective abort of all ranks
   exit status of rank 72: killed by signal 11

What am I doing wrong?


--
Jonathan Dursi     <ljdursi at scinet.utoronto.ca>