[vtkusers] VTK with NVidia Optimus

Paulo Waelkens paulo.waelkens at gmail.com
Wed Jan 13 04:17:42 EST 2016


Hello!
From my previous message: "So either the Windows 8.1 drivers are not working
properly and/or the GPU switch works for the 700M GPU series but not for
the 900M GPU series." It turns out the Windows 8.1 drivers were at fault.
Yesterday I installed Windows 10, then the newest Intel HD 530 driver, then
the newest NVIDIA drivers, and now the GPU switching works. In conclusion,
NVIDIA Optimus GPU switching does not work on Windows Server 2012 R2
(== Windows 8.1), but does work on Windows 10. I hope this helps anyone else
stuck with this problem.
Cheers!
Paulo

On 10 January 2016 at 22:34, Paulo Waelkens <paulo.waelkens at gmail.com>
wrote:

> Hi Chiang,
>
> indeed, there SHOULD be a soft trigger for the graphics card switch. As far
> as I understood the <rant> vague, incorrect and absolutely infuriating!!!
> </rant> NVIDIA Optimus documentation, there are a few approaches to trigger
> a switch between GPUs, as already mentioned. Unfortunately, none of them
> work for my current setup (a GTX 970M on Windows Server 2012 R2, which I
> believe is equivalent to Windows 8.1). I ran a CUDA kernel prior to calling
> any rendering routines in my main function, and while the NVIDIA control
> panel showed GPU activity for my executable, the rendering was still
> performed by the IGP, in defiance of the empty claims in the NVIDIA
> documentation. Sigh...
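>
> For reference, a stripped-down sketch of that experiment. It is simplified
> (it only touches the CUDA runtime before any rendering call rather than
> launching a real kernel, and the function name is just a placeholder), so
> treat it as illustrative rather than my exact code:
>
> #include <cuda_runtime.h>   // host-side CUDA runtime API; link against cudart
> #include <cstdio>
>
> // Initialize a CUDA context on the NVIDIA GPU before any OpenGL/VTK setup,
> // in the hope that Optimus then routes the rendering to the discrete GPU too.
> static void touchCudaDevice()
> {
>     void* dummy = nullptr;
>     if (cudaSetDevice(0) == cudaSuccess && cudaMalloc(&dummy, 16) == cudaSuccess)
>     {
>         std::printf("CUDA context created on device 0\n");
>         cudaFree(dummy);
>     }
> }
>
> int main()
> {
>     touchCudaDevice();
>     // ... create the VTK render window, renderers, etc. afterwards ...
>     return 0;
> }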
>
> I did discover one workaround, though: if you connect an external monitor
> to the laptop (to the HDMI port, in my case) and set it as the primary
> monitor, then the discrete GPU is always used for everything, including VTK.
> Problem, sort of, solved =P Also, I tried running the same executable on
> another laptop with NVIDIA Optimus (using the latest NVIDIA drivers), and
> the switch worked there. What?!?! That laptop had Windows 10 instead of
> Windows 8.1, and a different GPU model (GT 740M). So either the Windows 8.1
> drivers are not working properly and/or the GPU switch works for the 700M
> GPU series but not for the 900M GPU series (since they must have different
> drivers, given their different hardware architecture and features).
> Hopefully NVIDIA will get their act together at some point and fix their
> drivers...
>
> Anyway, thanks for your help, Chiang! I'm sort of dropping this issue for
> now, since the external monitor solution is only a (dreadful) fix, and I
> cannot install Windows 10 right now (because, apparently, you cannot install
> VS2013 on a fresh Windows 10; it only survives if you upgrade from Windows
> 8.1. Darned!)
>
> Cheers!
> Paulo
>
> On 4 January 2016 at 01:57, WangQ <wangq1979 at outlook.com> wrote:
>
>> Hi Paulo,
>>
>> It is quite strange. There is supposed to be a hard or soft trigger for the
>> graphics card switch. You may also try a simple CUDA program to see whether
>> the dedicated graphics card is used. If you have CUDA installed, there is a
>> sample that combines CUDA with OpenGL rendering; if it executes
>> successfully, the dedicated card should be the one in use.
>>
>> Cheers,
>>
>> Chiang
>>
>> ------------------------------
>> Date: Sun, 3 Jan 2016 18:26:37 +0100
>>
>> Subject: Re: [vtkusers] VTK with NVidia Optimus
>> From: paulo.waelkens at gmail.com
>> To: wangq1979 at outlook.com
>> CC: vtkusers at vtk.org
>>
>> Hi Chiang,
>> I also found this video, but my BIOS looks different. Apparently my
>> laptop has hybrid graphics with a dynamic switching model (
>> https://wiki.archlinux.org/index.php/Hybrid_graphics), meaning it is
>> impossible to disable the IGP, since the IGP's framebuffer provides the
>> only interface to the screen. The discrete GPU, when in use, writes its
>> output to the IGP's framebuffer.
>> In the coming days I'll try writing a "pure" OpenGL Win32 application
>> (i.e. a program that uses the window manager of the OS, unlike a console
>> application) and try out the solutions NVIDIA gives in the
>> OptimusRenderingPolicies document.
>>
>> I suspect the real problem is that I'm starting all the rendering from a
>> console application (i.e. not a Win32 application that explicitly uses the
>> window manager of the OS). *Maybe* the Optimus driver always assigns the
>> IGP to console applications, since on Windows you would usually create a
>> Win32 application if you wanted to do rendering. To test this I'll do the
>> following:
>> - create a Win32 application with simple OpenGL code (e.g. draw a sphere)
>> => play with the NVIDIA settings until it uses the discrete GPU. This
>> should work, I hope
>> - create a console application that links to a DLL with OpenGL code that
>> draws a sphere => this is basically analogous to what VTK is doing in my
>> application right now. I get the feeling this will always default to the
>> IGP
>> - create a Win32 application that links to the same OpenGL DLL => if I get
>> this to work with the discrete GPU, this is the solution!
>>
>> I'm no OpenGL expert though; maybe what I'm planning here is not
>> possible. Guess I'll find out =D
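>>
>> To make the first experiment a bit more concrete, the probe I have in mind
>> looks roughly like the sketch below: create a bare, hidden window with a
>> WGL context and print which renderer the driver actually handed out. This
>> is untested and the names are placeholders; compiled as a console
>> application it covers the console case, and the same body behind WinMain
>> would cover the Win32 case. Linking needs opengl32, gdi32 and user32.
>>
>> #include <windows.h>
>> #include <GL/gl.h>
>> #include <cstdio>
>>
>> // Optional Optimus hint, exported from the executable itself.
>> // Comment it out to compare behaviour with and without the hint.
>> extern "C" { __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001; }
>>
>> int main()
>> {
>>     // Minimal hidden window, just to obtain a device context.
>>     WNDCLASSA wc = {};
>>     wc.lpfnWndProc   = DefWindowProcA;
>>     wc.hInstance     = GetModuleHandleA(nullptr);
>>     wc.lpszClassName = "GpuProbeWindow";   // placeholder class name
>>     RegisterClassA(&wc);
>>     HWND hwnd = CreateWindowA(wc.lpszClassName, "probe", WS_OVERLAPPEDWINDOW,
>>                               0, 0, 64, 64, nullptr, nullptr, wc.hInstance,
>>                               nullptr);
>>     HDC hdc = GetDC(hwnd);
>>
>>     // Pick a hardware-accelerated pixel format and create a GL context.
>>     PIXELFORMATDESCRIPTOR pfd = {};
>>     pfd.nSize      = sizeof(pfd);
>>     pfd.nVersion   = 1;
>>     pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
>>     pfd.iPixelType = PFD_TYPE_RGBA;
>>     pfd.cColorBits = 32;
>>     SetPixelFormat(hdc, ChoosePixelFormat(hdc, &pfd), &pfd);
>>     HGLRC ctx = wglCreateContext(hdc);
>>     wglMakeCurrent(hdc, ctx);
>>
>>     // These strings reveal whether the Intel IGP or the NVIDIA GPU answered.
>>     std::printf("GL_VENDOR  : %s\n", (const char*)glGetString(GL_VENDOR));
>>     std::printf("GL_RENDERER: %s\n", (const char*)glGetString(GL_RENDERER));
>>
>>     wglMakeCurrent(nullptr, nullptr);
>>     wglDeleteContext(ctx);
>>     ReleaseDC(hwnd, hdc);
>>     DestroyWindow(hwnd);
>>     return 0;
>> }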
>>
>> Cheers,
>> Paulo
>>
>>
>> On 2 January 2016 at 03:54, WangQ <wangq1979 at outlook.com> wrote:
>>
>> Hi Paulo,
>>
>> Not sure about the Alienware, but my M4700 is able to disable the IGP
>> through a BIOS setting. I always switch it off since I need CUDA, and my
>> OpenGL code works well when the IGP is switched off. But I did not try the
>> suggestion in the whitepaper.
>>
>> I just googled it and found this
>> https://www.youtube.com/watch?v=HuqyR-496Sw
>>
>> Check it to see if you are in luck.
>>
>> Cheers,
>>
>> Chiang
>>
>> ------------------------------
>> Date: Sat, 2 Jan 2016 00:02:01 +0100
>> Subject: Re: [vtkusers] VTK with NVidia Optimus
>> From: paulo.waelkens at gmail.com
>> To: wangq1979 at outlook.com
>> CC: vtkusers at vtk.org
>>
>>
>> Dear Chiang,
>> thank you for your suggestion. It is not possible to disable the IGP in
>> the BIOS of my laptop (Alienware 17 R3), even after installing the newest
>> BIOS from Dell. I *guess* this means the laptop does not have a hardware
>> multiplexer(?), whilst your M4700 has one(?). So this approach won't work,
>> I think.
>> Did you perhaps try the programmatic solutions suggested by NVIDIA (e.g.
>> the NvOptimusEnablement export) when doing your OpenGL programming?
>> Thanks! Regards,
>> Paulo
>>
>> On 1 January 2016 at 19:54, WangQ <wangq1979 at outlook.com> wrote:
>>
>>
>> Hi,
>>
>> You may try disabling Optimus in the BIOS settings to see if that works. At
>> least this works with my Dell M4700 and direct OpenGL programming, but I'm
>> not 100% sure whether it works for VTK.
>>
>> Cheers,
>>
>> Chiang
>> ------------------------------
>> Date: Fri, 1 Jan 2016 18:42:17 +0100
>> From: paulo.waelkens at gmail.com
>> To: vtkusers at vtk.org
>> Subject: [vtkusers] VTK with NVidia Optimus
>>
>> Dear all,
>>
>> I'm trying to get my VTK 6.3 application to use the NVIDIA GPU on my
>> laptop. The laptop (a Dell Alienware) combines an Intel HD 530 with an
>> NVIDIA GTX 970M using NVIDIA Optimus technology. At the moment, my
>> application always uses the integrated GPU, which is slow and horrible. Did
>> any of you figure out how to make the NVIDIA Optimus driver choose the
>> discrete GPU instead??? The laptop uses the discrete GPU for games, so it's
>> not a hardware problem, I'd say.
>>
>> NOTE: I have both the latest Intel and NVIDIA GPU drivers, and installed
>> the Intel driver first (as suggested somewhere). I've built VTK 6.3 with
>> shared libraries and with the OpenGL2 rendering backend enabled.
>>
>> I've followed the solutions described by NVIDIA, without success:
>>
>> http://developer.download.nvidia.com/devzone/devcenter/gamegraphics/files/OptimusRenderingPolicies.pdf
>> - setting the discrete GPU as the "Preferred graphics processor" => no
>> effect
>> - adding this line to my main.cpp:
>> extern "C" { _declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001; }
>> //=> no effect
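>>
>> For completeness, a fuller version of that snippet as I have it (the DWORD
>> type needs <windows.h> or an equivalent typedef; the body of main() is just
>> a placeholder for my VTK setup):
>>
>> #include <windows.h>  // for DWORD
>>
>> // Hint from the Optimus whitepaper linked above: exporting this symbol
>> // from the application executable is supposed to make the driver prefer
>> // the discrete NVIDIA GPU for this process.
>> extern "C" { _declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001; }
>>
>> int main(int argc, char* argv[])
>> {
>>     // ... VTK pipeline setup and rendering as before ...
>>     return 0;
>> }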
>>
>> To check whether the discrete GPU is being used, I watch the GPU activity
>> icon of the NVIDIA control panel (
>> http://acer--uk.custhelp.com/app/answers/detail/a_id/9075/~/determining-which-graphics-card-is-used-with-nvidia-optimus).
>> The icon does work correctly, since it shows activity when games are
>> running (e.g. StarCraft 2).
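>>
>> Another check I might try, instead of the tray icon, is to ask the OpenGL
>> context itself. If I remember correctly,
>> vtkRenderWindow::ReportCapabilities() returns the GL vendor/renderer
>> strings, so a small (untested) sketch like the one below should say which
>> GPU VTK actually got:
>>
>> #include <vtkAutoInit.h>
>> VTK_MODULE_INIT(vtkRenderingOpenGL2); // may be unnecessary, depending on
>>                                       // how the autoinit defines are set
>> #include <vtkSmartPointer.h>
>> #include <vtkRenderWindow.h>
>> #include <vtkRenderer.h>
>> #include <iostream>
>>
>> int main()
>> {
>>     // Create a render window, realize it, and ask which OpenGL
>>     // implementation (Intel IGP or NVIDIA GPU) is behind it.
>>     auto renderer = vtkSmartPointer<vtkRenderer>::New();
>>     auto window   = vtkSmartPointer<vtkRenderWindow>::New();
>>     window->AddRenderer(renderer);
>>     window->Render();  // forces the OpenGL context to be created
>>     std::cout << window->ReportCapabilities() << std::endl;
>>     return 0;
>> }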
>>
>> I figured maybe I need to add the *extern "C" { _declspec(dllexport)
>> DWORD NvOptimusEnablement = 0x00000001; }* line somewhere in the VTK source
>> code. This is a bit of a wild guess, but maybe, since I'm linking to VTK
>> dynamically, all the OpenGL work happens within the VTK DLL boundaries, so
>> NVIDIA Optimus needs the hint from VTK, not from my (console) application
>> that uses VTK.
>>
>> I'm really running out of tricks here, and was wondering if one of you
>> knows how to proceed.
>> Thanks!
>>