[vtkusers] Background pixel blending in VTK (2D)
Kyle Nethery-Pavelchak
wizardanim2000 at gmail.com
Mon Apr 16 17:19:03 EDT 2018
Hi all -
My company is using VTK to render overlaid DICOM images to a screen.
All of the images in question are single-channel, intensity-mapped
images stored in unsigned short (ushort) buffers.
Quick overview of how the data pipeline is built (a rough code sketch follows the list):
- A vtkImageReslice is hooked into a vtkImageMapToWindowLevelColors object.
- We add a custom lookup table to the window/level colors object.
- We connect the w/l colors object to a vtkImageMapper.
- We connect the image mapper to an Actor2D.
- We then use a vtkImageBlend and set its input connections to the two
image channels built by repeating the steps above (one per Actor2D).
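Roughly, the setup looks like this (a heavily simplified sketch, not our
exact code: a blank ushort image stands in for the DICOM data, and the
blend is shown sitting between the window/level filters and the 2D mapper,
which is how the two channels end up combined for us):

#include <vtkSmartPointer.h>
#include <vtkImageData.h>
#include <vtkImageReslice.h>
#include <vtkImageMapToWindowLevelColors.h>
#include <vtkLookupTable.h>
#include <vtkImageBlend.h>
#include <vtkImageMapper.h>
#include <vtkActor2D.h>

// Stand-in for the DICOM reader: a blank single-channel ushort image.
static vtkSmartPointer<vtkImageData> MakeUShortImage(int nx, int ny)
{
  auto img = vtkSmartPointer<vtkImageData>::New();
  img->SetDimensions(nx, ny, 1);
  img->AllocateScalars(VTK_UNSIGNED_SHORT, 1);
  return img;
}

int main()
{
  // A large background image and a smaller overlay image.
  auto background = MakeUShortImage(512, 512);
  auto overlay    = MakeUShortImage(256, 256);

  // Shared custom lookup table (12-bit range as a placeholder).
  auto lut = vtkSmartPointer<vtkLookupTable>::New();
  lut->SetNumberOfTableValues(4096);
  lut->SetTableRange(0, 4095);
  lut->Build();

  // Channel 1: reslice -> window/level colors.
  auto resliceBg = vtkSmartPointer<vtkImageReslice>::New();
  resliceBg->SetInputData(background);
  auto wlBg = vtkSmartPointer<vtkImageMapToWindowLevelColors>::New();
  wlBg->SetInputConnection(resliceBg->GetOutputPort());
  wlBg->SetLookupTable(lut);
  wlBg->SetWindow(4096);
  wlBg->SetLevel(2048);

  // Channel 2: same chain for the overlay.
  auto resliceOv = vtkSmartPointer<vtkImageReslice>::New();
  resliceOv->SetInputData(overlay);
  auto wlOv = vtkSmartPointer<vtkImageMapToWindowLevelColors>::New();
  wlOv->SetInputConnection(resliceOv->GetOutputPort());
  wlOv->SetLookupTable(lut);
  wlOv->SetWindow(4096);
  wlOv->SetLevel(2048);

  // Blend the two mapped channels, overlay at 50% over the background.
  auto blend = vtkSmartPointer<vtkImageBlend>::New();
  blend->AddInputConnection(wlBg->GetOutputPort());
  blend->AddInputConnection(wlOv->GetOutputPort());
  blend->SetOpacity(1, 0.5);

  // 2D mapper and actor that end up in the renderer.
  auto mapper = vtkSmartPointer<vtkImageMapper>::New();
  mapper->SetInputConnection(blend->GetOutputPort());
  mapper->SetColorWindow(255);
  mapper->SetColorLevel(127.5);
  auto actor = vtkSmartPointer<vtkActor2D>::New();
  actor->SetMapper(mapper);

  blend->Update();   // exercise the pipeline without a render window
  return 0;
}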
For each Actor2D object rendered to the screen, we can control the
default pixel values by editing entries in the lookup table. However,
for pixels outside the extent of the data being drawn, the value always
appears to be (0, 0, 0, 1).
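For example (simplified, with placeholder values), this is the kind of
lookup-table access I mean:

#include <vtkSmartPointer.h>
#include <vtkLookupTable.h>

int main()
{
  auto lut = vtkSmartPointer<vtkLookupTable>::New();
  lut->SetNumberOfTableValues(4096);
  lut->SetTableRange(0, 4095);
  lut->Build();

  // Works for pixels inside the data extent: e.g. remap intensity 0
  // from black to a mid grey on screen.
  lut->SetTableValue(0, 0.5, 0.5, 0.5, 1.0);
  return 0;
}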
We run into issues when blending two images: we need the background
image (if it is larger than the overlay image) to remain visible outside
the extents of the overlay image.
I have seen / tried various APIs that seem like they should work, e.g.
vtkImageReslice::SetBackgroundColor, setting a default value in the
lookup table, and reimplementing the vtkImageBlend class to try to
access these background pixels.
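Concretely, these are the kinds of calls I have been trying (values are
only examples):

#include <vtkSmartPointer.h>
#include <vtkImageReslice.h>
#include <vtkLookupTable.h>

int main()
{
  // 1) Background colour on the reslicer, hoping for a transparent
  //    (or at least controllable) background outside the overlay extent.
  auto reslice = vtkSmartPointer<vtkImageReslice>::New();
  reslice->SetBackgroundColor(4095.0, 0.0, 0.0, 0.0);
  // There is also a single-value variant; for our single-channel data
  // it behaves the same as the first component above.
  reslice->SetBackgroundLevel(4095.0);

  // 2) A "default" colour on the lookup table for out-of-range scalars.
  auto lut = vtkSmartPointer<vtkLookupTable>::New();
  lut->SetTableRange(0, 4095);
  lut->SetBelowRangeColor(0.0, 0.0, 0.0, 0.0);
  lut->UseBelowRangeColorOn();
  lut->Build();
  return 0;
}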
One strange behavior I've noticed with the 'SetBackgroundColor(...)'
function on the reslicer: a value of (x, 0, 0, 0) sets the intensity
value of the background pixels, and I can get a white background if I
set 'x' to the highest value in our lookup table. However, the
background's G and B channels stay 0 and its alpha stays 1 no matter
what I pass for them in the call (even 0 for the alpha). I would assume
this is because the DICOM image is a single-channel image, so VTK
generates the background in the same format..?
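A tiny test seems consistent with that assumption (again simplified,
with a blank ushort image standing in for the DICOM data):

#include <vtkSmartPointer.h>
#include <vtkImageData.h>
#include <vtkImageReslice.h>
#include <iostream>

int main()
{
  auto img = vtkSmartPointer<vtkImageData>::New();
  img->SetDimensions(16, 16, 1);
  img->AllocateScalars(VTK_UNSIGNED_SHORT, 1);   // same format as our data

  auto reslice = vtkSmartPointer<vtkImageReslice>::New();
  reslice->SetInputData(img);
  reslice->SetBackgroundColor(4095.0, 123.0, 45.0, 0.0);  // G, B, A look ignored
  reslice->Update();

  // Prints 1, i.e. the padded background is just another ushort intensity;
  // the RGBA I see on screen only appears later, after window/level + LUT.
  std::cout << "components: "
            << reslice->GetOutput()->GetNumberOfScalarComponents()
            << std::endl;
  return 0;
}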
I am wondering if anyone has seen this before, or knows how to work
around this kind of issue.
thanks