[Paraview] pvserver disconnecting...

pat marion pat.marion at kitware.com
Thu Feb 10 15:08:18 EST 2011


Hi Karl,

It sounds like a memory leak problem.  Have you tried opening the System
Monitor in Ubuntu and watching the memory usage of pvserver as your script
runs?  Can you post your script so we can take a look at it?  Maybe you
aren't cleaning up after you process each file.
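For reference, the per-file cleanup I have in mind looks roughly like this (a sketch only; the file names and pipeline are placeholders, and I'm assuming the paraview.simple API, run from pvpython against your pvserver):

```python
# Rough sketch of per-file cleanup (assumes the paraview.simple API).
# The key point is the Delete() call: without it, the proxies created
# for each file accumulate on the server across iterations.
def process_files(filenames):
    from paraview.simple import (Connect, OpenDataFile, Show,
                                 Render, WriteImage, Delete)

    Connect('localhost')              # one connection for the whole batch
    for name in filenames:
        reader = OpenDataFile(name)   # placeholder: your actual pipeline here
        Show(reader)
        Render()
        WriteImage(name + '.png')
        Delete(reader)                # release the server-side proxy each time
```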

You can pass --cslog=log.txt to pvserver.  This records very low-level log
information; it captures every message sent to the server.  The problem is
that it does not work very well for a parallel server, since all processes
write to the same file.

pvserver exits when the client disconnects, and there is no keep-alive flag.
Try running pvserver in a bash loop, or start it from your Python script.
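Starting it from Python could look like the following sketch (the command line is illustrative; substitute your own mpirun/pvserver invocation, and note this simply relaunches the server each time it exits):

```python
import subprocess

# Keep relaunching a server command each time it exits (pvserver quits
# when its client disconnects, so a loop like this keeps it available).
# The command list is a placeholder; returns how many times it was run.
def keep_alive(cmd, restarts):
    runs = 0
    for _ in range(restarts):
        rc = subprocess.call(cmd)  # blocks until the server process exits
        runs += 1
        print("server exited with code %d; restarting" % rc)
    return runs
```

The bash equivalent would be a one-liner along the lines of `while true; do mpirun -np 6 pvserver; done`.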

Pat

On Thu, Feb 10, 2011 at 1:46 PM, Karl Battams <karlbattams at gmail.com> wrote:

> I'm pretty new to ParaView, so I could be doing something fundamentally
> wrong here, but here's my issue...
>
> I have ParaView 3.10 compiled/installed from source on Ubuntu 10. I'm using
> a Python script to connect to pvserver, iterate over a bunch of VTK files,
> do some stuff, and save the output as a PNG. I'm running the Python script
> and the pvserver on the same (8-core) machine.
>
> So I start pvserver with (e.g.) mpirun -np 6 pvserver and it sits happily
> listening for a client. In another terminal, I run my python script which
> connects (successfully) to pvserver, and starts running through the files
> exactly as I ask it to. So far so good.
>
> Problem is, after a random length of time, and/or number of files, the
> pvserver abruptly dies and so the whole python script obviously dies with
> it. The pvserver gives no error message... just dies. If I just run the
> script for one file (or even a few of them), it runs without an issue.
>
> What I have noticed is that the fewer processors I use, the more files my
> script will process before pvserver dies. If I use pvserver by itself (i.e.
> one processor) it will do about 120-or-so files. If I use four processors,
> it'll only go through 30-or-so before one of them throws a kill signal and
> takes the whole thing down. It is not a particular file that does it, nor is
> it at a particular point in the processing script. The length of time and
> number of files processed is also not fixed. (It's almost like it's a memory
> leak issue..??)
>
> So... questions:
> 1) Am I being inefficient/stupid by using pvserver + python script to do
> this batch processing? If so, what's the recommended practice?
> 2) My script only does one server connect (via Connect('localhost') ) and
> then loops over the files. Should I do a new server connect for each file
> instead?
> 3) Assuming yes to question 2, how do I cleanly disconnect from pvserver
> without killing the server altogether (see below)?
> 4) If I run a single instance of file processing (i.e. one file) the script
> runs fine... but when it's done, the pvserver disconnects the client and
> dies. Is that normal? Is there a 'keep-alive' flag for pvserver? Why does
> dropping the connection kill the server?
> 5) Does pvserver leave any logs anywhere? Is there any way I can trace
> what's causing the kill signal?
>
> I just can't shake the feeling that the pvserver process should be a lot
> more robust than what I'm seeing.
>
> Many thanks for any help!
> Karl
>
> _______________________________________________
> Powered by www.kitware.com
>
> Visit other Kitware open-source projects at
> http://www.kitware.com/opensource/opensource.html
>
> Please keep messages on-topic and check the ParaView Wiki at:
> http://paraview.org/Wiki/ParaView
>
> Follow this link to subscribe/unsubscribe:
> http://www.paraview.org/mailman/listinfo/paraview
>
>

