[vtkusers] Crash with subsequent creation of vtkRenderWindow in Python
Pearu Peterson
vtk at cens.ioc.ee
Sat Apr 5 12:50:40 EST 2003
On Sat, 5 Apr 2003, Charl P. Botha wrote:
> On Sat, Apr 05, 2003 at 07:29:18PM +0300, Pearu Peterson wrote:
> > #6 0x424fa5d3 in glXDestroyContext(dpy=0x8514be8, ctx=0x8518af8) in 'glxapi.c', line 208
> > #5 0x424fecd6 in Fake_glXDestroyContext(dpy=0x8514be8, ctx=0x8518af8) in 'fakeglx.c', line 1462
> > #4 0x42505f34 in XMesaGarbageCollect() in 'xm_api.c', line 2530
> > #3 0x425d68b7 in XSync()
> > #2 0x425daba6 in _XReply()
> > #1 0x425d980c in ?()
> > #0 0x425f4cae in _X11TransWrite()
>
> Hmmm, this is a perfectly normal glXDestroyContext call in
> vtkXOpenGLRenderWindow. Maybe you could check what happens in that
> unfortunately named Fake_glXDestroyContext() call in Mesa?
Well, this is the code from fakeglx.c:
"""
/*
 * Our fake GLX context will contain a "real" GLX context and an XMesa context.
 *
 * Note that a pointer to a __GLXcontext is a pointer to a fake_glx_context,
 * and vice versa.
 *
 * We really just need this structure in order to make the libGL functions
 * glXGetCurrentContext(), glXGetCurrentDrawable() and glXGetCurrentDisplay()
 * work correctly.
 */
struct fake_glx_context {
   __GLXcontext glxContext;   /* this MUST be first! */
   XMesaContext xmesaContext;
};

static void
Fake_glXDestroyContext( Display *dpy, GLXContext ctx )
{
   struct fake_glx_context *glxCtx = (struct fake_glx_context *) ctx;
   (void) dpy;
   MakeCurrent_PrevContext = 0;
   MakeCurrent_PrevDrawable = 0;
   MakeCurrent_PrevReadable = 0;
   MakeCurrent_PrevDrawBuffer = 0;
   MakeCurrent_PrevReadBuffer = 0;
   XMesaDestroyContext( glxCtx->xmesaContext );
   XMesaGarbageCollect();
}
"""
and here is the relevant code from xm_api.c:
"""
void XMesaDestroyContext( XMesaContext c )
{
#ifdef FX
   if (c->xm_draw_buffer && c->xm_buffer->FXctx)
      fxMesaDestroyContext(c->xm_draw_buffer->FXctx);
#endif
   if (c->gl_ctx) {
      _swsetup_DestroyContext( c->gl_ctx );
      _swrast_DestroyContext( c->gl_ctx );
      _tnl_DestroyContext( c->gl_ctx );
      _ac_DestroyContext( c->gl_ctx );
      _mesa_destroy_context( c->gl_ctx );
   }
   FREE( c );
}

/*
 * Look for XMesaBuffers whose X window has been destroyed.
 * Deallocate any such XMesaBuffers.
 */
void XMesaGarbageCollect( void )
{
   XMesaBuffer b, next;
   for (b = XMesaBufferList; b; b = next) {
      next = b->Next;
      if (b->display && b->frontbuffer && b->type == WINDOW) {
#ifdef XFree86Server
         /* NOT_NEEDED */
#else
         XSync(b->display, False);
         if (!window_exists( b->display, b->frontbuffer )) {
            /* found a dead window, free the ancillary info */
            XMesaDestroyBuffer( b );
         }
#endif
      }
   }
}
"""
I'll try to experiment with the above code (for example, I am not sure why
XFree86Server was not defined in my build). But if you have any hints on
what to try, they are most welcome.
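As a first quick experiment (a throwaway check of my own, not anything from
Mesa), I will probably drop something like this into xm_api.c near
XMesaGarbageCollect() and rebuild, to confirm at compile time which branch of
the XFree86Server conditional is actually used:
"""
/* Throwaway compile-time check: #warning is a GCC extension, which is fine
 * for a one-off diagnostic like this. */
#ifdef XFree86Server
#warning "XFree86Server is defined: the XSync()/window_exists() path is compiled out"
#else
#warning "XFree86Server is NOT defined: the XSync()/window_exists() path is active"
#endif
"""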
> It's very difficult for me to help debug this, as I can't reproduce it.
Sure, it's understandable.
Thanks,
Pearu