[vtkusers] ActiViz C# | add 2 actors

Matias Montroull matimontg at gmail.com
Mon Jan 20 13:57:23 EST 2014


Ok, so here's my code; see the section under the //Registration comment at the bottom.

Question: should the Source be my NDI points or the mesh points? (A sketch with the roles swapped follows the registration code below.)

Thanks for your time.

//Mesh
            // Assumes: using Kitware.VTK; using System.IO; using System.Globalization;
            vtkDICOMImageReader readerdicom = new vtkDICOMImageReader();
            readerdicom.SetDirectoryName(@"C:\CDDICOM\images");
            readerdicom.Update();

//extract skin
            vtkContourFilter skinextractor = vtkContourFilter.New();
            skinextractor.SetInputConnection(readerdicom.GetOutputPort());
            skinextractor.SetValue(0, -800);

//remove inner points
            vtkPolyDataConnectivityFilter contornos = new vtkPolyDataConnectivityFilter();
            contornos.SetInputConnection(skinextractor.GetOutputPort());
            contornos.SetExtractionModeToLargestRegion();
            contornos.SetColorRegions(1);
            vtkCleanPolyData removepoints = new vtkCleanPolyData();
            removepoints.SetInput(contornos.GetOutput());
            removepoints.Update();

            vtkPolyDataNormals skinnormals = vtkPolyDataNormals.New();
            skinnormals.SetInputConnection(removepoints.GetOutputPort());
            skinnormals.SetFeatureAngle(90.0);
            skinnormals.ComputePointNormalsOn();

            vtkPolyDataMapper skinmapper = vtkPolyDataMapper.New();
            skinmapper.SetInputConnection(skinnormals.GetOutputPort());
            skinmapper.ScalarVisibilityOff();

            vtkActor actor_mesh = vtkActor.New();
            actor_mesh.SetMapper(skinmapper);


//Registration (I load my NDI tracker registered points)

            // Path.Combine's second argument is an absolute path, so the vtkTesting
            // data root would be ignored anyway; point straight at the file.
            string filePath = @"C:/tracker/reg.poi";

            FileStream fs = null;
            StreamReader sr = null;
            String sLineBuffer;
            String[] sXYZ;
            char[] chDelimiter = new char[] { ' ', '\t', ';' };
            double[] xyz = new double[3];
            vtkPoints points = vtkPoints.New();
            int cnt = 0;


            // If the file must also stay open in another application, use FileShare.ReadWrite
            fs = new FileStream(filePath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite);
            sr = new StreamReader(fs);
            while (!sr.EndOfStream)
            {
                sLineBuffer = sr.ReadLine();
                cnt++;
                if (string.IsNullOrWhiteSpace(sLineBuffer)) continue; // skip blank lines
                sXYZ = sLineBuffer.Split(chDelimiter, StringSplitOptions.RemoveEmptyEntries);
                if (sXYZ.Length < 3) // Split never returns null; check the field count instead
                {
                    System.Windows.MessageBox.Show("data seems to be in the wrong format at line " + cnt, "Format Exception");
                    return;
                }
                xyz[0] = double.Parse(sXYZ[0], CultureInfo.InvariantCulture);
                xyz[1] = double.Parse(sXYZ[1], CultureInfo.InvariantCulture);
                xyz[2] = double.Parse(sXYZ[2], CultureInfo.InvariantCulture);
                points.InsertNextPoint(xyz[0], xyz[1], xyz[2]);
            }
            sr.Close(); // also closes the underlying FileStream

            vtkPolyData polydata = vtkPolyData.New();
            polydata.SetPoints(points);

            // Surface reconstruction from the NDI points (not used further below)
            vtkSurfaceReconstructionFilter surf = new vtkSurfaceReconstructionFilter();
            surf.SetInput(polydata);
            vtkContourFilter cf = new vtkContourFilter();
            cf.SetInputConnection(surf.GetOutputPort());

            vtkVertexGlyphFilter glyphFilter = vtkVertexGlyphFilter.New();
            glyphFilter.SetInputConnection(polydata.GetProducerPort());
            glyphFilter.Update();
            // Visualize
            vtkPolyDataMapper registracion_mapper = vtkPolyDataMapper.New();
            registracion_mapper.SetInputConnection(glyphFilter.GetOutputPort());
            vtkActor actorregistracion = vtkActor.New();
            actorregistracion.SetMapper(registracion_mapper);
            actorregistracion.GetProperty().SetPointSize(3);
            actorregistracion.GetProperty().SetColor(1, 0.5, 0);

            double[] a = actorregistracion.GetCenter();
            double[] b = actor_mesh.GetCenter();
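
            // Optional, display-only check (not a registration): shift the NDI
            // point-cloud actor so its center coincides with the mesh actor's center.
            // A sketch using the centers computed above; it moves the actor for
            // viewing only, the underlying data is unchanged.
            // actorregistracion.AddPosition(b[0] - a[0], b[1] - a[1], b[2] - a[2]);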



        //Registration:
            vtkIterativeClosestPointTransform closestpoints = new vtkIterativeClosestPointTransform();
            closestpoints.SetSource(removepoints.GetOutput());
            closestpoints.SetTarget(glyphFilter.GetOutput());
            closestpoints.GetLandmarkTransform().SetModeToRigidBody();
            closestpoints.SetMaximumNumberOfIterations(1000);
            closestpoints.SetMaximumNumberOfLandmarks(400);
            closestpoints.StartByMatchingCentroidsOff();
            closestpoints.Modified();
            closestpoints.Update();

            vtkTransformPolyDataFilter transform = new vtkTransformPolyDataFilter();
            transform.SetInput(removepoints.GetOutput());
            transform.SetTransform(closestpoints);
            transform.Update();

            vtkPolyDataMapper solucion_mapper = vtkPolyDataMapper.New();
            solucion_mapper.SetInputConnection(transform.GetOutputPort());
            vtkActor actorsolucion = vtkActor.New();
            actorsolucion.SetMapper(solucion_mapper);
            actorsolucion.GetProperty().SetColor(0, 0, 1);
            actorsolucion.GetProperty().SetPointSize(3);
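
            // On the Source/Target question above, a sketch with the roles swapped:
            // the sparse NDI point cloud as the moving (source) dataset and the dense
            // skin surface as the fixed (target) dataset. This is only an assumption
            // about what fits your setup; variable names are taken from the code above.
            vtkIterativeClosestPointTransform icp_alt = new vtkIterativeClosestPointTransform();
            icp_alt.SetSource(glyphFilter.GetOutput());   // moving: NDI points
            icp_alt.SetTarget(removepoints.GetOutput());  // fixed: skin surface
            icp_alt.GetLandmarkTransform().SetModeToRigidBody();
            icp_alt.StartByMatchingCentroidsOn();         // rough centroid pre-alignment
            icp_alt.SetMaximumNumberOfIterations(1000);
            icp_alt.Update();

            // Apply the resulting transform to the NDI point cloud for display.
            vtkTransformPolyDataFilter moved_points = new vtkTransformPolyDataFilter();
            moved_points.SetInput(glyphFilter.GetOutput());
            moved_points.SetTransform(icp_alt);
            moved_points.Update();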



On Mon, Jan 20, 2014 at 3:47 PM, Maarten Beek <beekmaarten at yahoo.com> wrote:

> Not after the mapper, because a mapper doesn't produce a vtkPolyData (which
> inherits from vtkDataSet).
> I would put it at the end of both pipelines (vtkIterativeClosestPointTransform
> needs two inputs: source and target), before the results of those pipelines
> are connected to their mappers.
> It depends on which data you want to register (vtkPolyDataNormals doesn't
> change the location of the points, so you could also take the result from just
> before that filter).
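
(In the code above, that earlier tap point is removepoints; since vtkPolyDataNormals
only adds normals and leaves the point coordinates unchanged, skinnormals would give
the same geometry. A small sketch, assuming the filters have been updated:)

            skinnormals.Update();
            vtkPolyData skinForIcp    = removepoints.GetOutput(); // before vtkPolyDataNormals
            vtkPolyData skinForIcpAlt = skinnormals.GetOutput();  // same coordinates, with normals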
>
>
>   On Monday, January 20, 2014 10:04:10 AM, Maarten Beek <
> beekmaarten at yahoo.com> wrote:
>  Both data sets have their own coordinate system: one from the apparatus that
> generated the skin surface, the other from the NDI tracker. Both would be in
> the same coordinate system only if some calibration of the equipment were
> performed.
>
> As it stands, the data sets have to be registered to each other (moving one
> object into the coordinate system of the other). A first step would be to
> align the centers (see vtkIterativeClosestPointTransform); some additional
> iterations would then be required to improve the registration.
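
(A minimal sketch of that "align the centers first" step, assuming the filters from
the code at the top; the explicit translation below could equally be replaced by
StartByMatchingCentroidsOn() on the ICP transform:)

            // Step 1: translate the NDI points so their center matches the skin surface's center.
            double[] skinCenter = removepoints.GetOutput().GetCenter();
            double[] ndiCenter = glyphFilter.GetOutput().GetCenter();
            vtkTransform shift = new vtkTransform();
            shift.Translate(skinCenter[0] - ndiCenter[0],
                            skinCenter[1] - ndiCenter[1],
                            skinCenter[2] - ndiCenter[2]);
            vtkTransformPolyDataFilter centered = new vtkTransformPolyDataFilter();
            centered.SetInput(glyphFilter.GetOutput());
            centered.SetTransform(shift);
            centered.Update();
            // Step 2: feed centered.GetOutput() into vtkIterativeClosestPointTransform
            // for the iterative refinement.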
>
>
>
>   On Saturday, January 18, 2014 7:44:23 PM, Matias Montroull <
> matimontg at gmail.com> wrote:
>  Hi,
>
> I've managed to add two actors to a renderer; however, I would like to set
> the same center for both.
>
> The reason is that I have a skin/face surface and a set of points I recorded
> with an NDI tracker. I'd like to view both together to see how close my
> points are to the surface, run some tests, and then optimize (find the
> closest points to the surface).
>
> My points appear to the right of the image, as shown in the attached
> screenshot. Is that because I need to set the centers to be the same, or is
> it something else I'm missing?
>
> Thanks!
>