<div dir="ltr"><div><div><div><div><div><div><div><div>Hi Tim,<br><br></div>I would recommend the ParaView superbuild script at Scripts/Sites/Cray-PrgEnv-cross-compile.sh for building PV 5.0.<br><br></div>The options you want to give it are the following:<br>Cray-PrgEnv-cross-compile.sh <comp> </path/to/cmake> </temp/download/directory> </install/directory><br><br></div>Here, comp needs to be the string used to load the programming environment, e.g. "PrgEnv-gnu" (the exact name, gnu, gcc, or Gnu, varies across those machines).<br><br></div>I used this to build a PV 5.0 pre-release version on Cori@NERSC and it worked just fine. It doesn't have an option to freeze Python, but after running the script you can go into the newly created cross/paraview/src/paraview-build subdirectory and enable that option there. <br><br></div>One more thing: if you're using the Intel compilers, you may need to do a "module load gcc" when building and running your code. Intel's C++11 support relies on the GCC headers and libraries, and without gcc loaded you'll get errors like "missing GLIBC".<br><br></div>Good luck and let us know how it goes!<br><br></div>Best,<br></div>Andy<br><div><div><div><div><br><br></div></div></div></div></div><div class="gmail_extra"><br><div class="gmail_quote">On Thu, Feb 4, 2016 at 9:24 AM, Tim Gallagher <span dir="ltr"><<a href="mailto:tim.gallagher@gatech.edu" target="_blank">tim.gallagher@gatech.edu</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div><div style="font-family:times new roman,new york,times,serif;font-size:12pt;color:#000000">Andy,<br><br>We don't really care about the compiler or MPI used for ParaView. Our code only supports Intel and GNU, but for simplicity I usually build ParaView with GNU so everybody can use it. We also usually use each system's default MPI, which on Copper is currently cray-mpich/7.1.0. 
<br><br>When we build our code, we have to specify the Catamount toolchain so everything is statically linked, because we haven't really figured out how to update everything to use shared libraries on the compute nodes. When we first set up our build environment, shared libraries weren't an option. If we go that route, will I need the FREEZE_PYTHON option, since shared linking won't be available?<br><br>I suppose the proper answer is that we should update our build environment for shared linking rather than static. It's been on my to-do list to figure out for a while now, but I haven't been able to write the proper toolchain file for it. <br><br>It appears that on Copper at least (I haven't checked the others), the system install has the libvtkPVPythonCatalyst* libraries (I misspoke in my previous email) but it does not have the development files from the PARAVIEW_INSTALL_DEVELOPMENT_FILES option. That and PARAVIEW_ENABLE_COPROCESSING are the only options we need in addition to the standard set of build options. <br><br>Tim<br><br><hr><div style="color:#000;font-weight:normal;font-style:normal;text-decoration:none;font-family:Helvetica,Arial,sans-serif;font-size:12pt"><b>From: </b>"Andy Bauer" <<a href="mailto:andy.bauer@kitware.com" target="_blank">andy.bauer@kitware.com</a>><br><b>To: </b>"Richard C Angelini (Rick) CIV USARMY RDECOM ARL (US)" <<a href="mailto:richard.c.angelini.civ@mail.mil" target="_blank">richard.c.angelini.civ@mail.mil</a>><br><b>Cc: </b>"tim gallagher" <<a href="mailto:tim.gallagher@gatech.edu" target="_blank">tim.gallagher@gatech.edu</a>>, "paraview" <<a href="mailto:paraview@paraview.org" target="_blank">paraview@paraview.org</a>><br><b>Sent: </b>Thursday, February 4, 2016 9:15:03 AM<span class=""><br><b>Subject: </b>Re: [Paraview] [Non-DoD Source] Building on Cray systems<br><br></span><div><div class="h5"><div dir="ltr"><div><div>Hi Rick,<br><br></div>Did you build ParaView with PARAVIEW_INSTALL_DEVELOPMENT_FILES enabled? 
Tim will need that for using Catalyst if he's going to be using your builds, but not if he does his own.<br><br></div>Tim, some questions on what you need:<br><ul><li>Do you have a specific compiler and version you want/need to use? Same for the MPI implementation.</li><li>Do you have a specific version of ParaView that you want to use?</li></ul><p>I would recommend using the superbuild tools to build statically with Python and Mesa. The other libraries can be built with the superbuild for convenience (definitely use the system MPI, though), even though for Catalyst you probably won't need many of them. The FREEZE_PYTHON option statically links the Python modules into the executable. This is definitely useful when running with a high number of MPI ranks, since loading a module (e.g. paraview.simple) in parallel can really kill the file system if thousands of processes are simultaneously trying to load a bunch of Python modules. Note, though, that this isn't needed for a Catalyst Python script, since that is handled specially: process 0 reads the file and broadcasts it to all of the other processes.</p><p>Cheers,</p><p>Andy<br></p></div><div class="gmail_extra"><br><div class="gmail_quote">On Thu, Feb 4, 2016 at 8:54 AM, Angelini, Richard C (Rick) CIV USARMY RDECOM ARL (US) <span dir="ltr"><<a href="mailto:richard.c.angelini.civ@mail.mil" target="_blank">richard.c.angelini.civ@mail.mil</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">Tim - I've already built ParaView on all of these systems - there are<br>
modules available to load various versions of ParaView. If you need to do<br>
your own builds to support specific functionality, I can provide you the<br>
build scripts we use on those systems.<br>
<br>
-----Original Message-----<br>
From: ParaView [mailto:<a href="mailto:paraview-bounces@paraview.org" target="_blank">paraview-bounces@paraview.org</a>] On Behalf Of Tim<br>
Gallagher<br>
Sent: Thursday, February 04, 2016 8:25 AM<br>
To: paraview <<a href="mailto:paraview@paraview.org" target="_blank">paraview@paraview.org</a>><br>
Subject: [Non-DoD Source] [Paraview] Building on Cray systems<br>
<br>
<span><br>
Hi everybody,<br>
<br>
I'm about to embark on the always-fun process of building ParaView on Cray<br>
systems, specifically Copper (ERDC), Garnet (ERDC) and Excalibur (ARL).<br>
Little is ever easy on these systems and I've never succeeded at building<br>
ParaView on them in the past. However, we want to run with co-processing on<br>
the compute nodes, so it's time to try again.<br>
<br>
I saw there are some build scripts in the ParaViewSuperbuild for Cray<br>
systems. Does anybody know of any documentation or examples on how to use<br>
them? What dependencies do I need to build using the superbuild, and what<br>
can I use that is already on the system? For example, Python, HDF5, zlib,<br>
etc. are all available, but do I need to build my own versions?<br>
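For anyone reading this in the archives: the cross-compile site script mentioned upthread takes four positional arguments. A dry-run sketch (it only prints the command; the PrgEnv string, cmake path, and directories below are all placeholder assumptions, not values from the thread):<br>

```shell
# Dry-run sketch of the superbuild's Cray cross-compile invocation.
# COMP is the string used to module-load the programming environment;
# every value here is a placeholder to adapt to your own system.
COMP=PrgEnv-gnu
CMAKE=/usr/bin/cmake
DOWNLOAD_DIR=$HOME/pv-downloads
INSTALL_DIR=$HOME/pv-install
# Print the command instead of running it:
echo Scripts/Sites/Cray-PrgEnv-cross-compile.sh "$COMP" "$CMAKE" "$DOWNLOAD_DIR" "$INSTALL_DIR"
```

Swapping in a different compiler only means changing the COMP string to the matching PrgEnv module name.<br>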
<br>
Is it possible to build just ParaView (not using the superbuild) using the<br>
system-installed modules? Does the FREEZE_PYTHON option work, or help<br>
eliminate the issues of running on the compute nodes?<br>
<br>
If anybody has any advice on the best way to go, I would greatly appreciate<br>
it. We need to have python, co-processing, and off-screen rendering enabled;<br>
otherwise, it's just the standard build options.<br>
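In case it helps anyone else, those requirements might translate to configure flags along these lines. PARAVIEW_ENABLE_COPROCESSING and PARAVIEW_INSTALL_DEVELOPMENT_FILES come up elsewhere in this thread; the remaining option names are assumptions for a ParaView 5.x-era build and should be checked against the actual CMake cache:<br>

```shell
# Sketch of a configure line for a static build with Python,
# co-processing, and off-screen (OSMesa) rendering. This prints the
# command rather than running it; the source path is a placeholder.
PV_SRC=/path/to/paraview   # placeholder path, not from the thread
echo cmake \
  -DBUILD_SHARED_LIBS=OFF \
  -DPARAVIEW_ENABLE_PYTHON=ON \
  -DPARAVIEW_ENABLE_COPROCESSING=ON \
  -DPARAVIEW_INSTALL_DEVELOPMENT_FILES=ON \
  -DPARAVIEW_BUILD_QT_GUI=OFF \
  -DVTK_USE_X=OFF \
  -DVTK_OPENGL_HAS_OSMESA=ON \
  "$PV_SRC"
```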
<br>
Thanks!<br>
<br>
Tim<br>
_______________________________________________<br>
</span>Powered by <a href="http://www.kitware.com" rel="noreferrer" target="_blank">www.kitware.com</a><br>
<span><br>
Visit other Kitware open-source projects at<br>
</span><a href="http://www.kitware.com/opensource/opensource.html" rel="noreferrer" target="_blank">http://www.kitware.com/opensource/opensource.html</a><br>
<span><br>
Please keep messages on-topic and check the ParaView Wiki at:<br>
</span><a href="http://paraview.org/Wiki/ParaView" rel="noreferrer" target="_blank">http://paraview.org/Wiki/ParaView</a><br>
<br>
Search the list archives at: <a href="http://markmail.org/search/?q=ParaView" rel="noreferrer" target="_blank">http://markmail.org/search/?q=ParaView</a><br>
<span><br>
Follow this link to subscribe/unsubscribe:<br>
</span><a href="http://public.kitware.com/mailman/listinfo/paraview" rel="noreferrer" target="_blank">http://public.kitware.com/mailman/listinfo/paraview</a><br>
<br></blockquote></div><br></div>
</div></div></div><br></div></div></blockquote></div><br></div>