[vtk-developers] Widget picking interfaces only for mouse
Thompson, David C
dcthomp at sandia.gov
Sun Dec 7 19:35:59 EST 2003
Drew Dolgert wrote:
> I've been using VTK and its widgets inside a virtual
> reality environment called a Windows CAVE. I'm
> wondering whether anyone has thought of generalizing
> events and picking from 2D mouse positions. While
> implementing widgets and picking for immersive
> environments, I've noticed that 2D mouse (x,y) points
> are responsible for some redundant code.
This is a great idea; have you thought about how 3D
devices might be used for interaction in general (camera
and actor transforms) as opposed to just picking? Also,
do you have your own classes for specific devices or use
something like VRPN (http://www.cs.unc.edu/Research/vrpn/)?
Finally, what about devices that might be used in
multiple contexts? For example, if you wanted to use a
wand to position a cutting plane with a vtk3DWidget, you
could use the wand as an implicit line to select handles
on the widget or you might want to use the wand to define
the plane's base point and normal directly. It seems like
a choice that depends heavily on the specific device and
the user's tastes.
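To make the "wand as an implicit line" idea concrete, here is a minimal, self-contained sketch of selecting the handle nearest a wand ray. It is deliberately VTK-free; the names (pickHandle, distToRay) and the tolerance-based selection are my own illustration, not an existing VTK API:

```cpp
#include <array>
#include <cmath>
#include <cstddef>
#include <vector>

using Vec3 = std::array<double, 3>;

static Vec3 sub(const Vec3& a, const Vec3& b) {
    return {a[0] - b[0], a[1] - b[1], a[2] - b[2]};
}
static double dot(const Vec3& a, const Vec3& b) {
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

// Distance from point p to the ray (origin o, unit direction u),
// clamped so points behind the wand measure to the wand tip.
static double distToRay(const Vec3& p, const Vec3& o, const Vec3& u) {
    Vec3 w = sub(p, o);
    double t = dot(w, u);
    if (t < 0.0) t = 0.0;
    Vec3 proj = {o[0] + t * u[0], o[1] + t * u[1], o[2] + t * u[2]};
    Vec3 d = sub(p, proj);
    return std::sqrt(dot(d, d));
}

// Return the index of the widget handle nearest the wand ray,
// or -1 if no handle lies within `tol` world units of the ray.
int pickHandle(const std::vector<Vec3>& handles, const Vec3& wandOrigin,
               const Vec3& wandDir, double tol) {
    int best = -1;
    double bestDist = tol;
    for (std::size_t i = 0; i < handles.size(); ++i) {
        double d = distToRay(handles[i], wandOrigin, wandDir);
        if (d <= bestDist) { bestDist = d; best = static_cast<int>(i); }
    }
    return best;
}
```

The same wand pose could instead feed the other mode mentioned above: use wandOrigin as the cutting plane's base point and wandDir as its normal, with no picking step at all.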
There are certainly a lot of things that would be cool
to add to the interactor classes:
- Gestures. Some meta-interactor class that collects
  events from other interactors and replaces or
  annotates them with information summarizing a run
  of events (e.g., "Events 11 to 23 are a rotation about
  X of 5 degrees").
- Collaboration. Allowing multiple devices to control
cameras and props either simultaneously or exclusively.
Right now, only one active interactor is allowed per
render window; this is fine if the collaboration is
remote (there are multiple render windows/applications),
but for big screens or tiled displays, there may be
multiple input devices.
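The gesture idea above amounts to a pass over the event stream that collapses a run of low-level events into one annotation. A minimal sketch, assuming a hypothetical RotationEvent record (this is illustration, not an existing VTK class):

```cpp
#include <cstddef>
#include <sstream>
#include <string>
#include <vector>

// Hypothetical low-level event: an incremental rotation about one axis.
struct RotationEvent {
    char axis;       // 'X', 'Y', or 'Z'
    double degrees;  // incremental rotation carried by this event
};

// Collapse the run of events [first, last] into one summary string,
// provided every event in the run rotates about the same axis.
// Returns "" when the run mixes axes and cannot be summarized.
std::string summarizeRotation(const std::vector<RotationEvent>& events,
                              std::size_t first, std::size_t last) {
    char axis = events[first].axis;
    double total = 0.0;
    for (std::size_t i = first; i <= last; ++i) {
        if (events[i].axis != axis) return "";
        total += events[i].degrees;
    }
    std::ostringstream out;
    out << "Events " << first << " to " << last << " are a rotation about "
        << axis << " of " << total << " degrees";
    return out.str();
}
```

A real meta-interactor would presumably run this incrementally as events arrive, emitting the summary and discarding the raw run once the gesture ends.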
Has anyone else been tackling interactor problems? I've seen
a few messages before on the list but don't recall if
anyone followed up.
David