[Paraview] pvpython eating all my memory

Berk Geveci berk.geveci at kitware.com
Mon Jun 15 21:11:32 EDT 2009


No problem.

On Mon, Jun 15, 2009 at 1:21 PM, Peter Brady<petertbrady at gmail.com> wrote:
> Hi Berk,
>
> That fixed my problem.  Thanks for looking into it.
>
> Peter.
>
> On Mon, Jun 15, 2009 at 8:19 AM, Berk Geveci<berk.geveci at kitware.com> wrote:
>> Hi Peter,
>>
>> First of all, I apologize for taking so long to look into this.
>>
>> I think I know what the problem is. In prog_filter.py, you are doing
>> the following:
>>
>> outputBlock = inputBlock.NewInstance()
>>
>> Due to some issues in VTK's reference counting and the way it
>> interacts with Python, this causes a memory leak and outputBlock is
>> never released. Try this:
>>
>> outputBlock = inputBlock.NewInstance()
>> outputBlock.UnRegister(None)
>>
>> instead. I bet you this will fix the memory problem. I will
>> investigate how we can solve this problem in VTK in the future.
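>>
>> In case it helps, here is a minimal sketch of how the whole loop might
>> look inside the Programmable Filter's script. This is not your
>> prog_filter.py; the variable names and the ShallowCopy are just
>> illustrative, and I am assuming the EnSight reader hands you a
>> vtkMultiBlockDataSet:
>>
>> inData = self.GetInputDataObject(0, 0)   # multiblock input from the reader
>> outData = self.GetOutputDataObject(0)
>> outData.CopyStructure(inData)            # copy the block layout
>>
>> for i in range(inData.GetNumberOfBlocks()):
>>     inputBlock = inData.GetBlock(i)
>>     if inputBlock is None:
>>         continue
>>     outputBlock = inputBlock.NewInstance()
>>     # Drop the extra reference held on the Python side; without this,
>>     # every timestep leaks one block and memory keeps growing.
>>     outputBlock.UnRegister(None)
>>     outputBlock.ShallowCopy(inputBlock)
>>     outData.SetBlock(i, outputBlock)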
>>
>> -berk
>>
>>
>> On Wed, Apr 1, 2009 at 4:48 PM, Peter Brady<petertbrady at gmail.com> wrote:
>>> Hello all,
>>>
>>> I have some reasonably sized data sets (256x256x512) that I'm
>>> post-processing with ParaView via a Python script.  However, for some
>>> reason, as the script slogs through more timesteps the memory usage
>>> slowly creeps up until my workstation crashes (it seems to take about
>>> 200-300 timesteps to eat up all my memory).  I haven't encountered
>>> this problem before, but this is the largest data set I've processed
>>> thus far.  My data format is EnSight Gold.  The script I'm using is
>>> below:
>>>
>>> ...
>>
>

