[vtk-developers] Offscreen rendering OpenGL, linux, leaks memory

Dan Lipsa dan.lipsa at kitware.com
Fri Jun 10 15:54:17 EDT 2016


We tried the test program on two platforms:

1. OpenGL vendor string: VMware, Inc.
OpenGL renderer string: Gallium 0.4 on llvmpipe (LLVM 3.6, 256 bits)
OpenGL version string: 2.1 Mesa 11.0.7

Here the leak is about 70 MB for 10000 iterations. This is the setup
where our collaborator ran the equivalent Python program, connected to
a Linux machine through TigerVNC.

2. OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: Quadro K2000/PCIe/SSE2
OpenGL core profile version string: 4.2.0 NVIDIA 304.131
OpenGL core profile shading language version string: 4.20 NVIDIA via Cg compiler

Here the leak is about 20 MB for 10000 iterations.
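
For anyone who wants to reproduce the measurement: the test program
itself is not attached to this message, but a minimal offscreen loop
along the following lines should behave equivalently (the sphere scene
is a placeholder, and the actual test program may differ):

#include <vtkActor.h>
#include <vtkPolyDataMapper.h>
#include <vtkRenderWindow.h>
#include <vtkRenderer.h>
#include <vtkSmartPointer.h>
#include <vtkSphereSource.h>

int main()
{
  // Render the same trivial scene many times. Any steady growth in
  // resident memory across iterations points at a leak in the render
  // path, since every object below is released at the end of the loop
  // body.
  for (int i = 0; i < 10000; ++i)
  {
    vtkSmartPointer<vtkSphereSource> sphere =
      vtkSmartPointer<vtkSphereSource>::New();
    vtkSmartPointer<vtkPolyDataMapper> mapper =
      vtkSmartPointer<vtkPolyDataMapper>::New();
    mapper->SetInputConnection(sphere->GetOutputPort());
    vtkSmartPointer<vtkActor> actor = vtkSmartPointer<vtkActor>::New();
    actor->SetMapper(mapper);

    vtkSmartPointer<vtkRenderer> renderer =
      vtkSmartPointer<vtkRenderer>::New();
    renderer->AddActor(actor);

    vtkSmartPointer<vtkRenderWindow> window =
      vtkSmartPointer<vtkRenderWindow>::New();
    window->SetOffScreenRendering(1); // no X window needed
    window->AddRenderer(renderer);
    window->Render();
  } // all smart pointers go out of scope here; memory should stay flat
  return 0;
}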

On Fri, Jun 10, 2016 at 9:40 AM, Dan Lipsa <dan.lipsa at kitware.com> wrote:

> Thanks Dave. This is a great idea. I will test this and report back.
>
> Beyond that, the debugging tools are failing us and I don't see many other
>> options than to start cutting out bits of the code to locate the problem.
>> See if the leak goes away when, e.g. vtkOpenGLRenderer::DeviceRender is
>> bypassed and go from there.
>>
>
> Indeed. Thanks for the pointer.
>
> Dan
>
>
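For reference, one way to run Dave's bypass experiment is a renderer
subclass whose DeviceRender does nothing. The class below is a
hypothetical sketch, not part of VTK:

#include <vtkObjectFactory.h>
#include <vtkOpenGLRenderer.h>

// Probe class: swap this in for the renderer in the test loop and
// re-run. If the growth disappears, the leak sits inside
// vtkOpenGLRenderer::DeviceRender; if it persists, look earlier in
// the pipeline.
class vtkNoDeviceRenderer : public vtkOpenGLRenderer
{
public:
  static vtkNoDeviceRenderer* New();
  vtkTypeMacro(vtkNoDeviceRenderer, vtkOpenGLRenderer);

  // Skip the OpenGL render entirely.
  virtual void DeviceRender() {}
};
vtkStandardNewMacro(vtkNoDeviceRenderer);

Instantiating vtkNoDeviceRenderer::New() in place of vtkRenderer::New()
in the loop should tell us whether the growth comes from DeviceRender
or from somewhere earlier.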