[Paraview] [Non-DoD Source] Building on Cray systems

Tim Gallagher tim.gallagher at gatech.edu
Thu Feb 4 09:24:08 EST 2016


Andy, 

We don't really care about the compiler or MPI used for ParaView. Our code only supports Intel and GNU, but for simplicity I usually build ParaView with GNU so everybody can use it. We also usually use the system's default MPI, which on Copper is currently cray-mpich/7.1.0. 
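
For reference, switching to GNU on these systems is just a module swap; the sketch below is how I usually set it up, though the exact module names and versions (other than cray-mpich/7.1.0) are assumptions that should be checked with module avail on each machine: 

    # Switch the Cray compiler wrappers (cc/CC/ftn) over to the GNU toolchain
    # and load the default MPI mentioned above. Module names/versions other
    # than cray-mpich/7.1.0 are assumptions.
    module swap PrgEnv-cray PrgEnv-gnu
    module load cray-mpich/7.1.0
    cc --version   # the wrappers should now report gcc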

When we build our code, we have to specify the Catamount toolchain so everything is statically linked, because we haven't figured out how to switch everything over to shared libraries on the compute nodes. When we first set up our build environment, shared libraries weren't an option. If we stick with that route, will I need the FREEZE_PYTHON option, since shared linking won't be available? 

I suppose the proper answer is that we should update our build environment for shared linking rather than static. It's been on my to-do list for a while now, but I haven't been able to write the proper toolchain file for it. 
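
In case it helps, here is a rough, untested sketch of what a shared-linking setup might look like on these Crays. CRAYPE_LINK_TYPE=dynamic is the Cray PE switch for dynamic linking; the toolchain file contents below are my guess at a minimal version, not something I've verified on Copper/Garnet/Excalibur: 

    # Ask the Cray compiler wrappers to link dynamically instead of statically.
    export CRAYPE_LINK_TYPE=dynamic

    # Write a minimal (untested) toolchain file that uses the Cray compiler
    # wrappers; CMAKE_SYSTEM_NAME is Linux rather than Catamount so CMake
    # does not disable shared libraries.
    printf '%s\n' \
      'set(CMAKE_SYSTEM_NAME Linux)' \
      'set(CMAKE_C_COMPILER cc)' \
      'set(CMAKE_CXX_COMPILER CC)' \
      'set(CMAKE_Fortran_COMPILER ftn)' \
      > cray-gnu-shared.cmake

    # Configure against it, asking for shared libraries; the source path is
    # a placeholder.
    cmake -DCMAKE_TOOLCHAIN_FILE=$PWD/cray-gnu-shared.cmake \
          -DBUILD_SHARED_LIBS=ON \
          /path/to/source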

It appears that on Copper at least (I haven't checked the others), the system install has the libvtkPVPythonCatalyst* libraries (I misspoke in my previous email), but it does not have the development files from the PARAVIEW_INSTALL_DEVELOPMENT_FILES option. That and PARAVIEW_ENABLE_COPROCESSING are the only options we need in addition to the standard set of build options. 
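
In other words, the only delta on top of an otherwise standard configure would be something like this (the source path is a placeholder): 

    # Extra options mentioned above, added to an otherwise standard ParaView
    # configure; the rest of the standard option set is omitted here.
    cmake \
      -DPARAVIEW_ENABLE_COPROCESSING=ON \
      -DPARAVIEW_INSTALL_DEVELOPMENT_FILES=ON \
      ../ParaView-source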

Tim 

----- Original Message -----

From: "Andy Bauer" <andy.bauer at kitware.com> 
To: "Richard C Angelini (Rick) CIV USARMY RDECOM ARL (US)" <richard.c.angelini.civ at mail.mil> 
Cc: "tim gallagher" <tim.gallagher at gatech.edu>, "paraview" <paraview at paraview.org> 
Sent: Thursday, February 4, 2016 9:15:03 AM 
Subject: Re: [Paraview] [Non-DoD Source] Building on Cray systems 




Hi Rick, 

Did you build ParaView with PARAVIEW_INSTALL_DEVELOPMENT_FILES enabled? Tim will need that for using Catalyst if he's going to use your builds, but not if he does his own. 

Tim, some questions on what you need: 

    * Do you have a specific compiler and version you want/need to use? Same thing for MPI implementation. 
    * Do you have a specific version of ParaView that you want to use? 


I would recommend using the superbuild tools to build statically with Python and Mesa. The other libraries can be built with the superbuild for convenience (definitely use the system MPI though), even though for Catalyst you probably won't need many of them. The FREEZE_PYTHON option is for statically linking the other Python modules into the executable. This is especially useful when running with a large number of MPI ranks, since importing a module (e.g. paraview.simple) in parallel can really kill the file system when thousands of processes simultaneously try to load a bunch of Python modules. Note, though, that this isn't needed for the Catalyst Python script itself, since that is handled specially: process 0 reads the file and broadcasts it to all of the other processes. 
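
Roughly, the superbuild configure would look something like the sketch below. Other than FREEZE_PYTHON, the option names are from memory and may differ between superbuild versions, so check them against the superbuild's CMake cache before relying on them: 

    # Sketch only: static ParaView with Python and OSMesa from the superbuild,
    # the system MPI, and Python modules frozen into the executable.
    # Option names other than FREEZE_PYTHON are assumptions.
    cmake \
      -DBUILD_SHARED_LIBS=OFF \
      -DENABLE_python=ON \
      -DENABLE_osmesa=ON \
      -DUSE_SYSTEM_mpi=ON \
      -DFREEZE_PYTHON=ON \
      ../ParaViewSuperbuild
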
Cheers, 
Andy 



On Thu, Feb 4, 2016 at 8:54 AM, Angelini, Richard C (Rick) CIV USARMY RDECOM ARL (US) < richard.c.angelini.civ at mail.mil > wrote: 


Tim - I've already built ParaView on all of these systems - there are 
modules available to load various versions of ParaView. If you need to do 
your own builds to support specific functionality, I can provide you with the 
build scripts we use on those systems. 




-----Original Message----- 
From: ParaView [mailto: paraview-bounces at paraview.org ] On Behalf Of Tim 
Gallagher 
Sent: Thursday, February 04, 2016 8:25 AM 
To: paraview < paraview at paraview.org > 
Subject: [Non-DoD Source] [Paraview] Building on Cray systems 


Hi everybody, 

I'm about to embark on the always-fun process of building ParaView on Cray 
systems, specifically Copper (ERDC), Garnet (ERDC) and Excalibur (ARL). 
Little is ever easy on these systems and I've never succeeded at building 
ParaView on them in the past. However, we want to run with co-processing on 
the compute nodes, so it's time to try again. 

I saw there are some build scripts in the ParaViewSuperbuild for Cray 
systems. Does anybody know of any documentation or examples on how to use 
them? What dependencies do I need to build using the superbuild, and what can 
I use that is already on the system? For example, python, HDF5, zlib, etc. 
are all available, but do I need to build my own versions? 

Is it possible to build just ParaView (not using the superbuild) using the 
system-installed modules? Does the FREEZE_PYTHON option work there, and does 
it help eliminate the issues of running on the compute nodes? 

If anybody has any advice on the best way to go, I would greatly appreciate 
it. We need to have python, co-processing, and off-screen rendering enabled; 
otherwise, it's just the standard build options. 
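
For what it's worth, my rough guess at the configure line for a plain (non-superbuild) build with those three things enabled is below; the OSMesa variable names and all paths are assumptions on my part: 

    # Sketch of a non-superbuild configure with Python, co-processing and
    # off-screen (OSMesa) rendering enabled; paths are placeholders and the
    # OSMesa variable names are assumptions.
    cmake \
      -DPARAVIEW_ENABLE_PYTHON=ON \
      -DPARAVIEW_ENABLE_COPROCESSING=ON \
      -DPARAVIEW_BUILD_QT_GUI=OFF \
      -DVTK_USE_X=OFF \
      -DVTK_OPENGL_HAS_OSMESA=ON \
      -DOSMESA_INCLUDE_DIR=/path/to/osmesa/include \
      -DOSMESA_LIBRARY=/path/to/osmesa/lib/libOSMesa.a \
      ../ParaView-source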

Thanks! 

Tim 

_______________________________________________ 
Powered by www.kitware.com 

Visit other Kitware open-source projects at http://www.kitware.com/opensource/opensource.html 

Please keep messages on-topic and check the ParaView Wiki at: http://paraview.org/Wiki/ParaView 

Search the list archives at: http://markmail.org/search/?q=ParaView 

Follow this link to subscribe/unsubscribe: 
http://public.kitware.com/mailman/listinfo/paraview 






