[vtkusers] VTK with NVidia Optimus

Paulo Waelkens paulo.waelkens at gmail.com
Sun Jan 3 12:26:37 EST 2016


Hi Chiang,
I also found this video, but my BIOS does look different. Apparently my
laptop has hybrid graphics with a dynamic switching model (
https://wiki.archlinux.org/index.php/Hybrid_graphics), meaning it is
impossible to disable the IGP, since the IGP's framebuffer provides the
only interface to the screen. The discrete GPU, when in use, writes its
output into the IGP's framebuffer.
In the coming days I'll try writing a "pure" OpenGL Win32 application (i.e.
a program that creates its window through the window manager of the OS,
unlike a console application), and try out the solutions NVIDIA gives in the
OptimusRenderingPolicies document.

I suspect the real problem is that I'm starting all the rendering
from a console application (i.e. not a Win32 application that explicitly
uses the window manager of the OS). *Maybe* the Optimus driver always
assigns the IGP to console applications, since on Windows you would usually
create a Win32 application if you wanted to do rendering. To test this I'll
do the following:
- create a Win32 application with simple OpenGL code (e.g. draw a sphere)
=> play with the NVIDIA settings until it uses the discrete GPU (see the
sketch after this list). This should work, I hope.
- create a console application that links to a DLL containing the OpenGL
code that draws the sphere => this is basically analogous to what VTK is
doing in my application right now. I get the feeling this will always
default to the IGP.
- create a Win32 application that links to the same OpenGL DLL => if I get
this to work with the discrete GPU, that is the solution!
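
For the first test, a minimal sketch along these lines should do (my
assumptions: MSVC, linking against opengl32.lib, gdi32.lib and user32.lib;
the "GLProbe" window class name and the MessageBox are only there for
illustration). It exports the NvOptimusEnablement hint from the NVIDIA
document, creates a real WGL context and reports which adapter the driver
actually handed out:

#include <windows.h>
#include <GL/gl.h>

// Hint from OptimusRenderingPolicies.pdf: exported from the .exe so the
// Optimus driver should prefer the discrete GPU.
extern "C" { __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001; }

int WINAPI WinMain(HINSTANCE hInst, HINSTANCE, LPSTR, int)
{
    // A plain window class; DefWindowProc is enough for a one-shot probe.
    WNDCLASSA wc = {};
    wc.style = CS_OWNDC;
    wc.lpfnWndProc = DefWindowProcA;
    wc.hInstance = hInst;
    wc.lpszClassName = "GLProbe";
    RegisterClassA(&wc);

    HWND hwnd = CreateWindowA("GLProbe", "GL probe", WS_OVERLAPPEDWINDOW,
                              0, 0, 256, 256, NULL, NULL, hInst, NULL);
    HDC dc = GetDC(hwnd);

    // Ordinary WGL boilerplate: pick a pixel format and create a context.
    PIXELFORMATDESCRIPTOR pfd = {};
    pfd.nSize = sizeof(pfd);
    pfd.nVersion = 1;
    pfd.dwFlags = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    pfd.cDepthBits = 24;
    SetPixelFormat(dc, ChoosePixelFormat(dc, &pfd), &pfd);

    HGLRC rc = wglCreateContext(dc);
    wglMakeCurrent(dc, rc);

    // GL_VENDOR / GL_RENDERER name the GPU that owns the context,
    // e.g. "Intel(R) HD Graphics 530" vs "GeForce GTX 970M".
    MessageBoxA(hwnd, (const char*)glGetString(GL_RENDERER),
                (const char*)glGetString(GL_VENDOR), MB_OK);

    wglMakeCurrent(NULL, NULL);
    wglDeleteContext(rc);
    ReleaseDC(hwnd, dc);
    DestroyWindow(hwnd);
    return 0;
}

For the console and DLL variants I'd keep the same renderer-string check, so
I can see exactly at which step the driver falls back to the IGP.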

I'm no OpenGL expert though; maybe what I'm planning here is not possible.
Guess I'll find out =D

Cheers,
Paulo


On 2 January 2016 at 03:54, WangQ <wangq1979 at outlook.com> wrote:

> Hi Paulo,
>
> Not sure about the Alienware, but my M4700 is able to disable the IGP through a
> BIOS setting. I always switch it off since I need CUDA, and my OpenGL code works
> well when the IGP is switched off. But I did not try the suggestion in the whitepaper.
>
> I just googled it and found this
> https://www.youtube.com/watch?v=HuqyR-496Sw
>
> Check it to see if you are lucky.
>
> Cheers,
>
> Chiang
>
> ------------------------------
> Date: Sat, 2 Jan 2016 00:02:01 +0100
> Subject: Re: [vtkusers] VTK with NVidia Optimus
> From: paulo.waelkens at gmail.com
> To: wangq1979 at outlook.com
> CC: vtkusers at vtk.org
>
>
> Dear Chiang,
> thank you for your suggestion. It is not possible to disable the IGP in
> the BIOS of my laptop (Alienware 17 R3), even after installing the newest BIOS
> from Dell. I *guess* this means the laptop does not have a hardware
> multiplexer(?), whilst your M4700 has one(?). So this approach won't work,
> I think.
> Did you perhaps try the programmatic solutions suggested by NVIDIA (e.g. the
> NvOptimusEnablement export) when doing your OpenGL programming?
> Thanks! Regards,
> Paulo
>
> On 1 January 2016 at 19:54, WangQ <wangq1979 at outlook.com> wrote:
>
>
> Hi,
>
> You may try disabling Optimus in the BIOS settings to see if that works. At least
> this works with my Dell M4700 and direct OpenGL programming. Not 100% sure
> whether it works for VTK.
>
> Cheers,
>
> Chiang
> ------------------------------
> Date: Fri, 1 Jan 2016 18:42:17 +0100
> From: paulo.waelkens at gmail.com
> To: vtkusers at vtk.org
> Subject: [vtkusers] VTK with NVidia Optimus
>
> Dear all,
>
> I'm trying to get my VTK 6.3 application to use the NVIDIA GPU on my
> laptop. The laptop (Dell Alienware) combines an Intel HD 530 with an NVIDIA
> GTX 970M using the NVIDIA Optimus technology. At the moment, my application
> always uses the integrated GPU, which is slow and horrible. Did any of you
> figure out how to make the NVIDIA Optimus driver choose the discrete GPU
> instead? The laptop uses the discrete GPU for games, so it's not a
> hardware problem, I'd say.
>
> NOTE: I have the latest Intel and NVIDIA GPU drivers, and installed
> the Intel driver first (as suggested somewhere). I've built VTK 6.3 with
> shared libraries and with the OpenGL2 backend enabled.
>
> I've followed the solutions described by NVIDIA, without success:
>
> http://developer.download.nvidia.com/devzone/devcenter/gamegraphics/files/OptimusRenderingPolicies.pdf
> - setting the discrete GPU as the "Preferred graphics processor" => no effect
> - adding this line to my main.cpp => no effect:
> extern "C" { _declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001; }
>
> To check whether the discrete GPU is being used, I use the GPU activity icon of
> the NVIDIA control panel (
> http://acer--uk.custhelp.com/app/answers/detail/a_id/9075/~/determining-which-graphics-card-is-used-with-nvidia-optimus).
> The icon does work correctly, since it shows activity when games are
> running (e.g. Starcraft 2).
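>
> For a check that doesn't rely on the icon, I can also print the GL vendor and
> renderer strings that VTK itself ends up with. A minimal sketch (assuming a
> VTK 6.3 build with the OpenGL2 backend; the autoinit macro is only needed when
> not using the VTK CMake machinery):
>
> #include <vtkAutoInit.h>
> VTK_MODULE_INIT(vtkRenderingOpenGL2);
>
> #include <vtkSmartPointer.h>
> #include <vtkRenderWindow.h>
> #include <vtkRenderer.h>
> #include <iostream>
>
> int main()
> {
>   vtkSmartPointer<vtkRenderWindow> win =
>       vtkSmartPointer<vtkRenderWindow>::New();
>   win->AddRenderer(vtkSmartPointer<vtkRenderer>::New());
>   win->Render();  // the OpenGL context is created here
>   // ReportCapabilities() includes the OpenGL vendor and renderer strings,
>   // so it shows whether the Intel or the NVIDIA GPU owns VTK's context.
>   std::cout << win->ReportCapabilities() << std::endl;
>   return 0;
> }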
>
> I figured maybe I need to add the *extern "C" { _declspec(dllexport)
> DWORD NvOptimusEnablement = 0x00000001; }* line somewhere in the VTK source
> code. This is a bit of a wild guess, but maybe, since I'm linking to
> VTK dynamically, all the OpenGL work is done within the VTK DLL
> boundaries, so NVIDIA Optimus needs the hint from VTK, not from my (console)
> application that uses VTK.
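>
> If that guess is right, the obvious test would be to drop the same export into
> one of the VTK rendering sources and rebuild the DLLs. Just as a sketch (the
> file below is only a hypothetical example of where it could go):
>
> // e.g. somewhere in Rendering/OpenGL2/vtkWin32OpenGLRenderWindow.cxx,
> // which already pulls in the Windows headers that define DWORD:
> extern "C" { __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001; }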
>
> I'm really running out of tricks here, and was wondering if one of you
> knows how to proceed.
> Thanks!
>