[Paraview] Volume Rendering 17GB 8.5 billion cell volume

David Trudgian David.Trudgian at UTSouthwestern.edu
Tue Oct 20 16:45:40 EDT 2015


Hi Berk,

Thanks for this info – good to know that the Mesa issue is due to an inherent texture size limit, and not some local issue with our config etc.

We’ll make more use of our K40s... when they are free in the queue!

Thanks again for looking into this.

--
David Trudgian Ph.D.
Computational Scientist, BioHPC
UT Southwestern Medical Center
Dallas, TX 75390-9039
Tel: (214) 648-4833

From: Berk Geveci [mailto:berk.geveci at kitware.com]
Sent: Tuesday, October 20, 2015 3:01 PM
To: David Trudgian <David.Trudgian at UTSouthwestern.edu>
Cc: ParaView Mailing List <paraview at paraview.org>
Subject: Re: [Paraview] Volume Rendering 17GB 8.5 billion cell volume

Hi folks,

I wanted to close the loop on this. Here are my findings:

* ParaView master (4.4 should also do) + OpenGL2 + NVIDIA Tesla with 12 GB memory: I verified that I can volume render data up to the capacity of the card. I could volume render a 1400x1400x1400 volume of floats.

* ParaView master (4.4 should also do) + OpenGL2 + Mesa (OSMesa 11, llvmpipe, swrast): Mesa has fairly small limits on 3D texture size, which is what we use for volume rendering, so ~1000x1000x1000 will be the upper end of what can be done for now. In time, we will implement multiple textures / streaming to enable rendering of larger volumes.
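
If anyone wants to check the limit on their own build, a quick sketch along these lines (assuming VTK's Python bindings and PyOpenGL are installed; run it in the same environment pvserver uses) will print the ceiling:

    # Query the driver's 3D texture size limit, which caps the volume the
    # GPU volume mapper can upload as a single texture. The value is the
    # maximum edge length in texels, per dimension.
    import vtk
    from OpenGL.GL import glGetIntegerv, GL_MAX_3D_TEXTURE_SIZE

    rw = vtk.vtkRenderWindow()
    rw.SetOffScreenRendering(1)   # no display needed; OSMesa is offscreen anyway
    rw.Render()                   # forces creation of the OpenGL context

    print("GL_MAX_3D_TEXTURE_SIZE =", glGetIntegerv(GL_MAX_3D_TEXTURE_SIZE))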

Best,
-berk

On Mon, Sep 28, 2015 at 11:00 AM, David Trudgian <David.Trudgian at utsouthwestern.edu> wrote:
Berk,

Thanks very much for looking into this. I look forward to trying things out whenever they’re ready.

DT

--
David Trudgian Ph.D.
Computational Scientist, BioHPC
UT Southwestern Medical Center
Dallas, TX 75390-9039
Tel: (214) 648-4833

From: Berk Geveci [mailto:berk.geveci at kitware.com]
Sent: Monday, September 28, 2015 9:58 AM
To: David Trudgian <David.Trudgian at UTSouthwestern.edu>
Cc: ParaView Mailing List <paraview at paraview.org>
Subject: Re: [Paraview] Volume Rendering 17GB 8.5 billion cell volume

Hi David,

I have been trying to find some cycles to check this out myself with ParaView 4.4. Thanks to hardware issues (i.e. my big workstation's disk dying), I haven't been able to yet. The good news is that I found issues with OSMesa + OpenGL2 that we are working through. Give me another 1-1.5 weeks.

Best,
-berk

On Mon, Sep 28, 2015 at 10:46 AM, David Trudgian <David.Trudgian at utsouthwestern.edu> wrote:
Hi Berk,

I finally managed to grab an allocation of some Tesla K40 nodes on our cluster to check GPU rendering of the full 17GB file with 2 x 12GB GPUs. I see the same thing as I did with OSMesa rendering.

The 9GB downsampled version works great across two nodes, each with a single K40. Going up to the 17GB original file, nothing is rendered and there are no errors. Same behavior with the OpenGL and OpenGL2 backends.

This is all on ParaView 4.3.1 still – I need to find time to build OSMesa / MPI versions of 4.4 here. But does 4.4 have any fixes that would be expected to affect this?

Thanks,

--
David Trudgian Ph.D.
Computational Scientist, BioHPC
UT Southwestern Medical Center
Dallas, TX 75390-9039
Tel: (214) 648-4833

From: Berk Geveci [mailto:berk.geveci at kitware.com]
Sent: Tuesday, September 15, 2015 2:43 PM
To: David Trudgian <David.Trudgian at UTSouthwestern.edu>
Cc: ParaView Mailing List <paraview at paraview.org>
Subject: Re: [Paraview] Volume Rendering 17GB 8.5 billion cell volume

Hey David,

I am hoping to have some time to play around with volume rendering and hopefully to track down this issue. One thing I wanted to clarify: it sounds from your description that you have a short (2-byte) value type. Is that correct?

Thanks,
-berk

On Wed, Sep 9, 2015 at 5:00 PM, David Trudgian <david.trudgian at utsouthwestern.edu> wrote:
Hi,

We have been experimenting with using ParaView to display very large volumes from TIFF stacks generated by whole-brain microscopy equipment. The test stack has dimensions of 5,368 x 10,695 x 150. The stack is assembled in ImageJ from individual TIFFs, exported as RAW, loaded into ParaView, and saved as a .vti for convenience. We can view slices fine in the standalone ParaView client on a 256GB machine.
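
For reference, the RAW-to-.vti step can also be scripted with VTK's Python bindings; a rough sketch (filenames made up, and assuming 16-bit little-endian data):

    # Rough sketch of the RAW -> .vti conversion described above. The
    # filenames and the unsigned-short scalar type are assumptions.
    import vtk

    dims = (5368, 10695, 150)

    reader = vtk.vtkImageReader()
    reader.SetFileName("brain_stack.raw")        # hypothetical filename
    reader.SetFileDimensionality(3)
    reader.SetDataScalarTypeToUnsignedShort()    # 2-byte voxels
    reader.SetDataByteOrderToLittleEndian()
    reader.SetDataExtent(0, dims[0] - 1, 0, dims[1] - 1, 0, dims[2] - 1)

    writer = vtk.vtkXMLImageDataWriter()
    writer.SetFileName("brain_stack.vti")
    writer.SetInputConnection(reader.GetOutputPort())
    writer.Write()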

When we attempt volume rendering on this data across multiple nodes with MPI, nothing appears in the client. Surface view works as expected, but on switching to volume rendering the client's display shows nothing. There are no messages from the client or servers - no output at all.
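
In pvpython terms the failing step is roughly the following (filename made up); the surface representation renders, and the switch to 'Volume' is where we get the blank result:

    # Reproduce the switch that fails for the 17GB volume. Run through
    # pvpython connected to the pvserver job; the filename is hypothetical.
    from paraview.simple import OpenDataFile, Show, Render

    reader = OpenDataFile("brain_stack.vti")
    display = Show(reader)
    display.SetRepresentationType("Surface")   # works as expected
    Render()
    display.SetRepresentationType("Volume")    # renders nothing for the 17GB file
    Render()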

This happens when running pvserver across GPU nodes with NVIDIA Tesla cards, or CPU-only with OSMesa. pvserver memory usage is well below what we have on the nodes - no memory warnings/errors.

The data is about 17GB, 8.5 billion cells. If we downsample to ~4GB or ~9GB then we get working volume rendering; the 4GB and 9GB versions work on one or two nodes. The 17GB original never works, regardless of how we scale nodes/MPI processes.
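
For scale, a quick sanity check on those numbers (assuming 2-byte voxels):

    # Back-of-envelope check that the stack dimensions match the ~17GB,
    # ~8.5 billion cell figures, assuming unsigned short (2-byte) voxels.
    dims = (5368, 10695, 150)
    cells = dims[0] * dims[1] * dims[2]
    print("%.2f billion cells" % (cells / 1e9))             # ~8.61 billion
    print("%.1f GB at 2 bytes/voxel" % (cells * 2 / 1e9))   # ~17.2 GB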

I am confused by the lack of rendering, as we don't have memory issues or any other messages at all. I am wondering if there is some inherent limitation, or if I'm missing something stupid.

Thanks,

Dave Trudgian


_______________________________________________
Powered by www.kitware.com

Visit other Kitware open-source projects at http://www.kitware.com/opensource/opensource.html

Please keep messages on-topic and check the ParaView Wiki at: http://paraview.org/Wiki/ParaView

Search the list archives at: http://markmail.org/search/?q=ParaView

Follow this link to subscribe/unsubscribe:
http://public.kitware.com/mailman/listinfo/paraview






