[Paraview] rendering large data sets

Adam Simpson adambsimpson at gmail.com
Sat Feb 15 19:32:56 EST 2014


Hi,
   I have ~1 billion 3D points that I am trying to render, but so far I have been unsuccessful. I read poly data from a .vtp file and then apply a 2D vertex glyph filter, but I have been unable to get any output even after running on 32 cores for over 12 hours. Is there anything obviously wrong with the following batch script? It seems to work fine for small data sets (~10,000 points).

vis.py :
-----------
from paraview.simple import *

# Read the ~1 billion point poly data file (points only, no cells).
sim_vtp = XMLPolyDataReader( FileName=['sim-1.vtp'] )

# Render each point as a vertex glyph.
glyph = Glyph( Input=sim_vtp, GlyphType="2D Glyph", GlyphTransform="Transform2" )
glyph.GlyphType.GlyphType = 'Vertex'
glyph.SetScaleFactor = 0.02
glyph.MaximumNumberofPoints = 100000000

DataRepresentation = Show(glyph)

RenderView = GetRenderView()
ResetCamera()
Render()
WriteImage("sim-1.png")
---------

I launch it with something like this:
-----------
$ mpirun -n 32 pvbatch vis.py
-----------

The header of the .vtp data file looks like this:
-----------
<?xml version="1.0"?>
<VTKFile type="PolyData" version="0.1" byte_order="LittleEndian" compressor="vtkZLibDataCompressor">
 <PolyData>
   <Piece NumberOfPoints="997335909" NumberOfVerts="0" NumberOfLines="0" NumberOfStrips="0" NumberOfPolys="0">
     <PointData>
     </PointData>
     <CellData>
     </CellData>
     <Points>
       <DataArray type="Float32" Name="Points" NumberOfComponents="3" format="binary" RangeMin="0.25980760565" RangeMax="246.47706935">
-----------

I have looked into using the D3 filter to better distribute the data, but I am unsure whether that is necessary or helpful. I have also considered using the Clean filter to reduce the number of points. Any advice on how to render this data in a reasonable amount of time, or on what render time I should expect, would be appreciated.
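
In case it helps, this is roughly what I had in mind for the D3 step (untested; I simply insert D3 between the reader and the glyph and leave its defaults alone):
-----------
from paraview.simple import *

# Same reader as in vis.py above.
sim_vtp = XMLPolyDataReader( FileName=['sim-1.vtp'] )

# Repartition the single-piece file across the MPI ranks before glyphing.
d3 = D3( Input=sim_vtp )

glyph = Glyph( Input=d3, GlyphType="2D Glyph", GlyphTransform="Transform2" )
glyph.GlyphType.GlyphType = 'Vertex'

Show(glyph)
Render()
WriteImage("sim-1-d3.png")
-----------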

Thanks,
Adam
