[Paraview] [EXTERNAL] Client-server mode fails with 5.0.1

Paul Melis paul.melis at surfsara.nl
Wed Aug 17 08:16:18 EDT 2016


Hi Utkarsh,

On 28-06-16 16:29, Utkarsh Ayachit wrote:
>> As soon as I set the threshold to 0 the warning about the OpenGL 3.2 context
>> pops up (but no segfault in this case).
>
> No segfault just means that the scene didn't attempt to render
> something that needed newer OpenGL.
>
>> I tried a very minimal example that creates a 3.2 context (using SDL) and
>> that fails as well, so maybe something strange is going on with our nvidia
>> driver installation. I'll see if I can do some more tests.
>
> Cool. Thanks.

I did some further testing, but can't find anything obviously wrong with 
our setup w.r.t. OpenGL versions. I do notice that window setup in SDL2 
fails due to the XRandR support finding no outputs configured on the 
graphics cards (there's nothing connected as these are cluster nodes). 
GLFW3 reports "X11: RandR monitor support seems broken", but initializes 
and renders (on :0.0) anyway when requesting a 3.2 context. Does 
ParaView, or VTK, use XRandR to get a valid window on which an OpenGL 
context is created? Could this be a reason I see the 3.2 context 
creation failure?
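
For reference, the minimal context-creation test looked roughly like the 
sketch below (here written against GLFW rather than the SDL2 version I 
actually ran, since GLFW is what printed the RandR warning; the window 
size and error handling are just illustrative):

```c
/* Minimal OpenGL 3.2 core-profile context test.
 * Sketch only -- build with: cc test.c -lglfw -lGL
 * Run with DISPLAY=:0.0 on the cluster node. */
#include <stdio.h>
#include <GLFW/glfw3.h>

int main(void)
{
    if (!glfwInit()) {
        fprintf(stderr, "glfwInit failed\n");
        return 1;
    }

    /* Request a 3.2 core profile, matching what ParaView 5.x needs */
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);

    GLFWwindow *win = glfwCreateWindow(640, 480, "3.2 test", NULL, NULL);
    if (!win) {
        fprintf(stderr, "window/context creation failed\n");
        glfwTerminate();
        return 1;
    }

    glfwMakeContextCurrent(win);
    printf("GL version: %s\n", glGetString(GL_VERSION));

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}
```

On our nodes this initializes and renders despite the "RandR monitor 
support seems broken" warning, which is why I suspect the XRandR probing 
rather than the driver's GL support itself.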

Regards,
Paul
-- 

Paul Melis
| Visualization group leader & developer | SURFsara |
| Science Park 140 | 1098 XG Amsterdam |
| T 020 800 1312 | paul.melis at surfsara.nl | www.surfsara.nl |

