[vtk-developers] why does vtkOpenGLCamera set its own aspect ratio?
Michael Halle
halazar at media.mit.edu
Fri Feb 23 11:08:46 EST 2001
Is there a reason that vtkOpenGLCamera computes its own aspect ratio
during a Render(), rather than using the one already available from the
renderer via ComputeAspect()/GetAspect()?
Why do I ask? I want to add a "PixelAspect" field to vtkViewport to
allow rendering for displays with non-square pixels. ComputeAspect()
will now roll the PixelAspect into Aspect, so any queries using
GetAspect will work correctly. Since OpenGLCamera computes and sets
its own idea of the aspect ratio without PixelAspect, it tramples my
best-laid plans.
Normal rendering seems to work fine with the trivial change
(essentially changing SetAspect(aspect) to GetAspect(aspect)). I'm not
sure about picking or stereo rendering, though.
If you have ideas, please let me know!
Michael Halle
mhalle at bwh.harvard.edu