[vtkusers] VTK in a MultiTouch Environment

David Gobbi david.gobbi at gmail.com
Fri Oct 16 14:25:41 EDT 2009


Hi Christopher,

VTK should be able to draw into the OpenGL context that is provided by
your Python library.  You just have to call vtkRenderWindow::SetWindowInfo()
and vtkRenderWindow::SetSize() so that VTK will use your context instead
of creating its own.

For example, I do the following to get VTK to draw into a PyQt window:

    def polish(self):
        # Grab the size and native window id from the Qt widget and
        # hand them to the VTK render window.
        size = self.size()
        self._RenderWindow.SetSize(size.width(), size.height())
        self._RenderWindow.SetWindowInfo(str(int(self.winId())))

This works because Qt calls the "polish()" method on my qtRenderWidget
before the Qt window is mapped to the screen.  When that happens,
I get the size and window ID from the Qt window, and use those to
set up the RenderWindow.

So if PyMT can give you the window Id of the window that it uses for
OpenGL, then getting VTK to use that window should be easy.
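
Roughly, the hookup might look like the sketch below (just an outline that
I have not tested against PyMT: how you obtain the native window id and the
window size from PyMT is an assumption on my part, and window_id, width
and height are placeholder names):

    import vtk

    def attach_vtk_to_window(window_id, width, height):
        # Tell VTK to draw into an existing OS window instead of
        # creating its own.
        ren_win = vtk.vtkRenderWindow()
        ren_win.SetSize(width, height)
        ren_win.SetWindowInfo(str(int(window_id)))

        # Add a renderer so you can start putting volumes/props into
        # the scene.
        renderer = vtk.vtkRenderer()
        ren_win.AddRenderer(renderer)
        return ren_win, renderer

After that, calling ren_win.Render() from your PyMT draw loop should
redraw the VTK scene inside your window.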

With respect to the interactors, it might be easiest not to use them at
all.  You can write your own code that takes the multi-touch information
from PyMT, and then use the vtkPicker yourself and directly modify
the Prop3D and clipping plane as necessary.  By all means take a look
at the vtkInteractors, just realize that you don't necessarily have to use
them.
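
For example, a one-finger drag mapped to rotating the picked prop might
look something like this (a rough sketch: I'm assuming PyMT hands you
touch-move events with screen coordinates, and the renderer/render_window
arguments and the 0.5 degrees-per-pixel factor are just placeholders):

    import vtk

    def on_touch_move(renderer, render_window, x, y, prev_x, prev_y):
        # Pick whichever prop is under the finger.
        picker = vtk.vtkPicker()
        picker.Pick(x, y, 0, renderer)
        prop = picker.GetProp3D()
        if prop is None:
            return

        # Turn the finger motion into a rotation of the prop.
        dx = x - prev_x
        dy = y - prev_y
        prop.RotateY(dx * 0.5)   # horizontal drag -> rotate about Y
        prop.RotateX(-dy * 0.5)  # vertical drag   -> rotate about X

        render_window.Render()

Keep in mind that VTK display coordinates have their origin at the
lower-left corner, so you may need to flip the y coordinate that PyMT
gives you.  A two-finger gesture could drive the clipping plane or the
camera in the same fashion.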

    David


On Fri, Oct 16, 2009 at 10:06 AM, Christopher Denter
<dennda at the-space-station.com> wrote:
> Hello,
>
> my name is Christopher and I'm a CS student currently searching for a topic
> for my bachelor's thesis. One topic I'd find particularly interesting is an
> application that visualizes rendered volumes and lets the user(s) explore
> the data set using multitouch gestures.
> For an example of how this may look in practice, please see this remarkable
> video: http://www.vimeo.com/6866296
>
> For the rendering part, I would like to use VTK so that I don't need to
> write the volume rendering myself.
> The actual GUI and interaction would be built with PyMT [0], a Python
> library that makes it easy to write multitouch applications.
>
> There are two main problems here that I already see:
>        a) Is it possible to take the rendered volume and use it in my PyMT
> application (which has an OpenGL context itself)? Of course, this needs to
> be fast enough to allow for actual interaction with the user. If yes, how
> would I do that?
>        b) What do I need to do in order to translate finger gestures into
> VTK commands like 'rotate the volume' or 'show everything up to that
> particular slice'? Is that even feasible to do properly?
>
> I understand that (as chapter 13 of the user's guide explains) you can
> subclass vtkRenderWindowInteractor and friends, but those seem a bit mouse-
> and keyboard-centric (because you need to override methods such as
> OnMiddleButtonDown()). However, a mouse and keyboard are things you usually
> do not have in a multitouch environment (as you can clearly see in the
> above video).
>
> I would be really happy if it were possible to realize the application
> sketched above. Thoughts on its achievability are also very welcome.
> Keep in mind that this will be a bachelor's thesis with 3-6 months (of
> non-exclusive working time) and not a doctoral thesis.
>
> In the hope that my questions can be answered and with best regards,
> Christopher
>
> [0] http://pymt.txzone.net