[vtkusers] Building a colored texture by interpolating a set of colored points.

Jean-Max Redonnet jmax.red at gmail.com
Fri Dec 15 06:41:06 EST 2017


Thanks for your answer.

Unfortunately, I'm not trying to compute normals but some other vector
field resulting from a previous calculation.

Here is the code I'm working on, but for the moment it doesn't provide the
expected results...
<some calculation stuff...>

        vtkPoints points = new vtkPoints();
        vtkUnsignedCharArray colors = new vtkUnsignedCharArray();
        colors.SetNumberOfComponents(3);
        colors.SetName("Colors");


        // Weaves, Chunks and SurfacePoints are my own objects from which I
        // get the geometric information I need.
        // SurfacePoint holds both the 3D and parametric information of each
        // point, i.e. (x, y, z) and (u, v).
        for (Weave w : part.getFirstPly().getWeaves()) {
            for (int i = 0; i < w.getNbPoints() - 1; i++) {
                WeaveChunk chunk = w.getChunkAtIndex(i);
                SurfacePoint p = chunk.getStart();
                points.InsertNextPoint(p.u, p.v, 0.0);
                Vector3d vec = chunk.getDirectionVector();
                vec.normalize();
                // scale the direction components to byte values and store
                // them as an RGB tuple
                double[] dcolor = new double[]{vec.x, vec.y, vec.z};
                byte[] color = new byte[3];
                for (int j = 0; j < 3; j++)
                    color[j] = (byte) (255 * dcolor[j]);
                colors.InsertNextTuple3(color[0], color[1], color[2]);
            }
        }

        vtkPolyData inputPolyData = new vtkPolyData();
        inputPolyData.SetPoints(points);

        vtkDelaunay2D delaunay = new vtkDelaunay2D();
        delaunay.SetInputData(inputPolyData);
        delaunay.Update();

        vtkPolyData outputPolyData = new vtkPolyData();
        outputPolyData.ShallowCopy(delaunay.GetOutput());
        outputPolyData.GetPointData().SetScalars(colors);

<rendering stuff...>
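
For reference, here is a minimal sketch of the encoding I am aiming for: the
usual normal-map style packing, where each component of a unit vector
(assumed to lie in [-1, 1]) is remapped to an unsigned byte in [0, 255]. The
class and helper names below are just placeholders, not part of my actual
code:

import vtk.vtkUnsignedCharArray;

public class VectorColorEncoding {

    // Remap one unit-vector component from [-1, 1] to a byte value in [0, 255].
    static int componentToByte(double c) {
        double clamped = Math.max(-1.0, Math.min(1.0, c));
        return (int) Math.round((clamped + 1.0) * 0.5 * 255.0);
    }

    // Append one direction vector to the color array as an RGB tuple.
    static void encodeVector(vtkUnsignedCharArray colors, double x, double y, double z) {
        colors.InsertNextTuple3(componentToByte(x), componentToByte(y), componentToByte(z));
    }
}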

I can't figure out what I missed.

Any hint would be much appreciated.

jMax

>
> 2017-12-13 17:37 GMT+01:00 Cory Quammen <cory.quammen at kitware.com>:
>
>> That seems like a reasonable approach to doing what you are describing.
>> If you need only to compute normals between points on a Delaunay2D surface
>> and display them immediately, then the rendering is already doing that for
>> you on the GPU, so it is very fast. If you need to save out the texture,
>> then doing it the way you describe also sounds reasonable.
>>
>> HTH,
>> Cory
>>
>> On Mon, Dec 11, 2017 at 8:32 AM, Jean-Max Redonnet <jmax.red at gmail.com>
>> wrote:
>>
>>> Hi, I'm still quite new to VTK, and I would like some expert advice.
>>>
>>> I have a set of pixels, each one given with its full RGB color
>>> information. I would like to build a full image by interpolating the
>>> colors of the pixels between the given points. First, I would like to know
>>> whether there is a magic wand to do that. I searched for days, but maybe I
>>> missed something.
>>>
>>> If the job needs to be done by hand, I plan to do this:
>>> - first, I split my image using a Delaunay algorithm.
>>> - then, each pixel (except the given ones) belongs to a single
>>> triangle, so I can calculate the scalar factors using a barycentric
>>> formula.
>>>
>>> For example, let a triangle be defined by three pixels A, B and C, and
>>> let P be a pixel inside this triangle. The following vector relations can
>>> be written: PA = a.u, PB = b.v and PC = c.w, where u, v and w are unit
>>> vectors along (PA), (PB) and (PC) respectively. In this context, I plan to
>>> use the scalar factors a, b and c to interpolate the color values of
>>> pixels A, B and C; the color of pixel P is then set to these interpolated
>>> values.
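>>>
>>> For concreteness, one standard barycentric formulation (using sub-triangle
>>> areas rather than the distance factors a, b and c above) would be:
>>> color(P) = wA * color(A) + wB * color(B) + wC * color(C), with
>>> wA = area(PBC) / area(ABC), wB = area(PCA) / area(ABC) and
>>> wC = area(PAB) / area(ABC), so that wA + wB + wC = 1, assuming A, B and C
>>> are not collinear.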
>>>
>>> Is this the right way to do it? Do you know a better way to do this?
>>>
>>> Giving you the bigger picture may help you to help me. Actually, I want
>>> to interpolate a vector field from a given set of vectors. The idea is to
>>> do something like the normal maps used in computer graphics. Mapping the
>>> 3D vectors' positions to a 2D space is not a problem, since their starting
>>> points belong to a parametric surface, so the 2D parametric space of the
>>> surface can be used. Then the (u, v) space can easily be mapped to an
>>> image's pixel space, while the coordinates of my vectors can be mapped to
>>> an RGB color space.
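>>>
>>> For instance, with u and v normalized to [0, 1] and a W x H image, a
>>> point would map to pixel indices i = round(u * (W - 1)) and
>>> j = round(v * (H - 1)); this is just an illustration of the mapping, as
>>> the actual image size is still to be chosen.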
>>>
>>> Using an image to build and store my vector field interpolation is
>>> convenient, but I would like to set all this up in the most efficient
>>> manner.
>>>
>>> In your opinion, what is the best way to do this?
>>>
>>> Thanks for any help.
>>>
>>> jMax
>>>
>>
>>
>> --
>> Cory Quammen
>> Staff R&D Engineer
>> Kitware, Inc.
>>

