[Paraview] clipped scene with GPU rendering on NVIDIA Tesla C1060

Burlen Loring bloring at lbl.gov
Thu Jun 30 13:45:48 EDT 2011


Hi Pratik,

You do not need a physical monitor on your servers to use the GPU. My 
knowledge of X11 is limited, but I think you need to declare a display 
in your xorg config, probably so the windowing system can set up buffers 
and so on (I welcome correction/clarification on this point). As long as 
the display resolution on the server is set at least as large as your 
client's monitor, you should not see the problem you experienced. 
2500x1600 would be a reasonably safe setting.
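
For what it's worth, here is a minimal sketch of the two xorg.conf 
sections involved on a headless NVIDIA server. The BusID and the 
Virtual size are assumptions for illustration; find your actual PCI 
slot with lspci and size the virtual screen to your client:

    Section "Device"
        Identifier "Tesla0"
        Driver     "nvidia"
        BusID      "PCI:3:0:0"               # assumed; check with lspci
        Option     "UseDisplayDevice" "none" # render with no monitor attached
    EndSection

    Section "Screen"
        Identifier "Screen0"
        Device     "Tesla0"
        SubSection "Display"
            Virtual 2500 1600                # at least as large as the client
        EndSubSection
    EndSection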

Burlen

On 06/30/2011 04:03 AM, pratik wrote:
> Burlen,
> Thanks a lot for that tip! Yes, that indeed was the problem.
>
> I was just wondering: why does GPU rendering have to depend on the 
> monitor at all? Since the GPU cards are doing the processing (at 
> least as far as I know), is there any way to remove this monitor 
> dependency altogether?
> Also, if GPU rendering is not possible without a monitor, does that 
> mean it is impossible to use the GPUs present in a CPU+GPU 
> cluster (with headless nodes)?
>
> I also tried Xvfb, but it seems that it does not use the graphics 
> cards at all, and I got terrible graphics.
>
> Thanks again,
> Pratik
> On Wednesday 29 June 2011 09:13 PM, Burlen Loring wrote:
>> Hi Pratik,
>>
>> I have had similar issues when my desktop screen resolution was 
>> higher than the server display resolution. Do you know what the 
>> display resolution is set to on the server side? I was able to check 
>> the server-side resolution with the xrandr command, and the issue was 
>> fixed by increasing the server display resolution in xorg.conf. This 
>> could be one explanation for what you are seeing.
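>>
>> For example, something like this (a sketch; substitute your own 
>> display number for :14):
>>
>>    DISPLAY=:14 xrandr -q
>>
>> which prints the resolutions that X display knows about, with the 
>> current one marked.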
>>
>> Burlen
>>
>> On 06/29/2011 07:42 AM, pratik wrote:
>>> Hi everyone,
>>> This is my configuration (ParaView 3.10.1):
>>> Server: a desktop with 4 NVIDIA Tesla C1060 cards; the client is a laptop.
>>> To make use of GPU rendering, I start an X server on a free display 
>>> like so (I cannot use the :0 display, nor can I use $DISPLAY; if I 
>>> use the $DISPLAY option, the server disconnects as soon as I load 
>>> data from ParaView):
>>>
>>> startx -- :14
>>>
>>> and my .xinitrc contains
>>>
>>> exec xdm
>>>
>>> so that xdm executes on the X server that is started.
>>> Once this is done, I simply run
>>>
>>> mpirun -np 4 pvserver -display localhost:14.0
>>> OR
>>> pvserver -display localhost:14.0
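>>>
>>> As a sanity check that the Tesla is actually driving that display 
>>> (assuming glxinfo is installed), the renderer string can be queried 
>>> with:
>>>
>>> DISPLAY=:14.0 glxinfo | grep "OpenGL renderer"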
>>>
>>>
>>> Everything goes fine: the client connects to the server, etc. But 
>>> when the actual visualization is done, a strip at the top (roughly 
>>> 1/3 of the screen) is blanked (see the attached image). A curious 
>>> thing I observed is that when I change the orientation, rotate the 
>>> object, etc., the LOD actor seems to render properly (meaning I can 
>>> see the whole picture), but after I let it settle, this clipping 
>>> takes place. Only a few weeks ago the rendering seemed to happen 
>>> perfectly, so I don't know why it is suddenly behaving this way now.
>>>
>>> Thanks in advance for the help.
>>>
>>> -pratik
>>>
>>>
>>
>
