<div dir="ltr"><div class="gmail_extra"><div class="gmail_quote"><div>Hi Bishwajit,<br> <br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div style="font-size:small">However in both below cases I am able to see CPU power getting consumed when the script is run.</div></div></blockquote><div><br></div><div>It looks like your GPU build isn't actually rendering on the GPU. To use the GPU with the NVIDIA driver, you need either an X server or EGL.<br><br><br><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr">-DVTK_USE_X=OFF</div></blockquote><div><br></div><div>In
this case, you have disabled X but not enabled EGL, so you are left with
OSMesa when you really intended to use the GPU instead. See <a href="https://blog.kitware.com/off-screen-rendering-through-the-native-platform-interface-egl/">https://blog.kitware.com/off-screen-rendering-through-the-native-platform-interface-egl/</a> for more details on configuring ParaView to use EGL.<br><br><br></div></div></div></div><div class="gmail_extra">- Chuck<br></div></div>
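For reference, an EGL-enabled configure step might look roughly like the sketch below. The exact CMake option names have changed across ParaView/VTK releases (older trees used <code>VTK_USE_OFFSCREEN_EGL</code>, for example), so treat these flags and paths as assumptions to verify against the blog post above and your own version's CMake cache:

```shell
# Sketch of a headless, EGL-capable ParaView configure step for NVIDIA GPUs.
# Flag names vary between ParaView/VTK versions and the paths below are
# placeholders -- check them against your release before relying on this.
cmake /path/to/paraview-source \
  -DVTK_USE_X=OFF \
  -DVTK_OPENGL_HAS_EGL=ON \
  -DEGL_INCLUDE_DIR=/usr/include \
  -DEGL_LIBRARY=/usr/lib/x86_64-linux-gnu/libEGL.so
```

With a build like this, pvserver/pvpython can render off-screen through the NVIDIA driver without any X server running, instead of silently falling back to software rendering with OSMesa.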