Compiling ParaView3 for Cray supercomputers
This wiki page is currently a work in progress.
Objectives and Overview
Our goal is to run pvbatch on Cray massively parallel processor (MPP) systems. Pvbatch is ParaView's MPI-enabled batch application: it reads batch scripts written in Python and distributes the work across many processors. Pvbatch is built when you compile ParaView 3. Before you can compile ParaView 3 you must compile CMake, OSMesa, and Python. The entire process takes about three hours, and you should have at least one gigabyte of disk space available.
These instructions are intended for Cray MPP systems running the Catamount operating system. Specifically, these instructions have been tested on the Bigben XT3 supercomputer at Pittsburgh Supercomputing Center.
Terminology
These terms are probably self-explanatory, but just to clarify:
- front end node - the computer/shell you log into and work on.
- native operating system - the operating system that runs on the front end node (SuSE linux for example)
- native build - software that is built for and executes on front end nodes
- compute node - the computers/processors running parallel computations
- catamount - the operating system that runs on the compute nodes
- catamount build - software that has been cross compiled to execute on the compute nodes
Build steps
You will log into a shell on a front end node, download the source code, and then compile CMake, OSMesa, Python, and ParaView3. Some of these packages must be compiled twice: one native version and one cross compiled version. The steps are:
1. Compile a CMake native build.
2. Compile an OSMesa native build.
3. Compile an OSMesa catamount build.
4. Compile a Python native build.
5. Compile a Python catamount build.
6. Compile a ParaView 3 native build.
7. Compile a ParaView 3 catamount build.
Step 2 is optional if your front end system already has OSMesa installed. Step 4 is optional if your front end system already has Python installed.
Why are the native builds required?
During the ParaView build process, helper binaries are compiled and executed to generate source files for later build targets. When you cross compile ParaView, these helper binaries cannot execute because they are not native to the front end node you are working on. The solution is to build a native version of ParaView first, and then tell CMake to use the native helper binaries while cross compiling.
Additional information
The instructions on this wiki page detail the steps required to build the software but do not provide much background information. Two concepts used but not explained here are TryRunResults files and toolchain files; the CMake documentation on cross compiling covers both and is well worth reading.
Compilers
The front end nodes have more than one compiler installed. We will use both the PGI and GNU compilers. On Bigben, PGI is the default compiler when you log in. You can switch between them like this:
## switch from PGI to GNU compiler
module switch PrgEnv-pgi PrgEnv-gnu

## switch from GNU to PGI compiler
module switch PrgEnv-gnu PrgEnv-pgi
Toolchains
When you cross compile with CMake you supply a toolchain file. The instructions on this wiki page use a separate toolchain file for each compiler, although in practice the two files are identical. The instructions assume path names like this:
~/
  toolchains/
    Toolchain-Catamount-gcc.cmake
    Toolchain-Catamount-pgi.cmake
The contents of the toolchain files can be found on the CMake/CrayXT3 page.
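For orientation only, a cross-compiling toolchain file usually has the following shape. This is a sketch, not Bigben's actual file; the compiler names are placeholders, and you should use the real contents from the CMake/CrayXT3 page:

```cmake
# Sketch of a Catamount cross-compiling toolchain file.
# The real file is on the CMake/CrayXT3 wiki page; the compiler
# names below are placeholders, not Bigben's actual values.
set(CMAKE_SYSTEM_NAME Catamount)
set(CMAKE_C_COMPILER   cc)
set(CMAKE_CXX_COMPILER CC)
# Never run cross-compiled programs on the front end node, and
# search for headers and libraries only in the cross environment.
set(CMAKE_FIND_ROOT_PATH_MODE_PROGRAM NEVER)
set(CMAKE_FIND_ROOT_PATH_MODE_LIBRARY ONLY)
set(CMAKE_FIND_ROOT_PATH_MODE_INCLUDE ONLY)
```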
Directory structure
Set up your directories however you'd like. Path names on this wiki page are usually given in two forms, a general form and an example form, where ~/ is your home directory:
General form                    Example form
<install-dir>                   ~/install
<catamount-install-dir>         ~/install-catamount
<toolchain-dir>                 ~/toolchains
<paraview-source-dir>           ~/projects/paraview/ParaView3
<paraview-native-build-dir>     ~/projects/paraview/build-native
...                             ...
Here is how my directory tree looks:
~/
  install/
    bin/
    include/
    lib/
  install-catamount/
    bin/
    include/
    lib/
  toolchains/
  projects/
    cmake/
      CMake/
      build/
    mesa/
      mesa-native/
      mesa-catamount/
    python/
      python-with-cmake/
      build-native/
      build-catamount/
    paraview/
      ParaView3/
      build-native/
      build-catamount/
Note that some of these directories are created automatically when you extract archives or check out code from CVS/SVN. The install directories and their subdirectories are created when you run the "make install" commands. Here is a command you could use to set up the directory tree:
cd ~/
mkdir toolchains projects projects/cmake projects/cmake/build \
      projects/mesa projects/python projects/python/build-native \
      projects/python/build-catamount projects/paraview \
      projects/paraview/build-native projects/paraview/build-catamount
Compiling CMake
Getting the source
You will need the latest version of CMake from CVS.
cd ~/projects/cmake
cvs -d :pserver:anonymous@www.cmake.org:/cvsroot/CMake login
## respond with password: cmake
cvs -d :pserver:anonymous@www.cmake.org:/cvsroot/CMake co CMake
Native build
It shouldn't matter which compiler you use to build CMake. I used the default PGI compiler.
General build command:
cd <cmake-build-dir>
<cmake-src-dir>/bootstrap --prefix=<native-install-dir>
make
make install
Example build command:
cd ~/projects/cmake/build
~/projects/cmake/CMake/bootstrap --prefix=~/install
make
make install
Compiling OSMesa
You will download the Mesa source code and compile the OSMesa target. OSMesa (off-screen Mesa) renders with the OpenGL API directly into main memory instead of using system display memory. The native build is only required if your native system does not already have OSMesa installed. On Bigben, OSMesa was found at /usr/lib64/libOSMesa.so with headers in /usr/include.
Getting the source
You can download the Mesa source directly using wget. In case the URL changes, the file is also available from the Mesa download page.
cd ~/projects/mesa
wget http://easynews.dl.sourceforge.net/sourceforge/mesa3d/MesaLib-7.0.2.tar.gz
tar -zxf MesaLib-7.0.2.tar.gz
Native build
Use the PGI compiler. Since Mesa uses an in-source build, you might want to copy the source directory before you start.
cd ~/projects/mesa
cp -r Mesa-7.0.2 mesa-native

## edit mesa-native/configs/default
##
## replace line:  INSTALL_DIR = /usr/local
## with:          INSTALL_DIR = ~/install
## or:            INSTALL_DIR = <native-install-dir>

cd mesa-native
make linux-osmesa
make install
Catamount build
Use the PGI compiler. Since Mesa uses an in-source build, you might want to copy the source directory before you start.
cd ~/projects/mesa
cp -r Mesa-7.0.2 mesa-catamount

## edit mesa-catamount/configs/default
##
## replace line:  INSTALL_DIR = /usr/local
## with:          INSTALL_DIR = ~/install-catamount
## or:            INSTALL_DIR = <catamount-install-dir>

cd mesa-catamount
make catamount-osmesa-pgi
make install
Compiling Python
CMake files for building Python can be checked out from the ParaView repository. The native Python build is only required if your system doesn't already have Python libraries and binaries installed. On Bigben, Python was located at /usr/lib64/libpython2.3.so and /usr/bin/python2.3.
Getting the source
These instructions use Python from the Subversion repository. It is also possible to use the Python 2.5.1 release and apply a patch; more details are here. The following commands grab the Python source from Subversion and place it in a directory named python-with-cmake. Then CVS downloads the CMake build files directly into the same python-with-cmake directory.
cd ~/projects/python
svn co http://svn.python.org/projects/python/trunk python-with-cmake
cvs -d :pserver:anoncvs@www.paraview.org:/cvsroot/ParaView3 login
## respond with empty password
cvs -d :pserver:anoncvs@www.paraview.org:/cvsroot/ParaView3 co -d python-with-cmake ParaView3/Utilities/CMakeBuildForPython
Native build
Use the GNU compiler. Switch from PGI if you need to:
module switch PrgEnv-pgi PrgEnv-gnu
General build command:
cd <python-build-native-dir>
<native-install-dir>/bin/ccmake <python-source-dir> -DCMAKE_INSTALL_PREFIX=<native-install-dir>
## configure with ccmake
make
make install
Example build command:
cd ~/projects/python/build-native
~/install/bin/ccmake ~/projects/python/python-with-cmake -DCMAKE_INSTALL_PREFIX=~/install
## configure with ccmake
make
make install
Catamount build
Use the GNU compiler. Switch from PGI if you need to:
module switch PrgEnv-pgi PrgEnv-gnu
When configuring with CMake:
- Confirm all MODULE__*_SHARED options are off
- Turn off MODULE__pwd_ENABLE
- Turn off ENABLE_IPV6
- Turn off WITH_THREAD
General build command:
cd <python-build-catamount-dir>
<native-install-dir>/bin/ccmake \
  -DCMAKE_TOOLCHAIN_FILE=<toolchain-dir>/Toolchain-Catamount-gcc.cmake \
  -DCMAKE_INSTALL_PREFIX=<catamount-install-dir> \
  -C <python-source-dir>/CMake/TryRunResults-Python-catamount-gcc.cmake \
  <python-source-dir>
## configure with ccmake
make
make install
Example build command:
cd ~/projects/python/build-catamount
~/install/bin/ccmake \
  -DCMAKE_TOOLCHAIN_FILE=~/toolchains/Toolchain-Catamount-gcc.cmake \
  -DCMAKE_INSTALL_PREFIX=~/install-catamount \
  -C ~/projects/python/python-with-cmake/CMake/TryRunResults-Python-catamount-gcc.cmake \
  ~/projects/python/python-with-cmake/
## configure with ccmake
make
make install
Compiling ParaView3
Getting the source
cd ~/projects/paraview
cvs -d :pserver:anoncvs@www.paraview.org:/cvsroot/ParaView3 login
## respond with empty password
cvs -d :pserver:anoncvs@www.paraview.org:/cvsroot/ParaView3 co ParaView3
Native build
You will build a native version of ParaView but do not need to install it. Use the PGI compiler. Switch from GNU if you need to:
module switch PrgEnv-gnu PrgEnv-pgi
When configuring with ccmake:
- turn on BUILD_SHARED_LIBS
- turn on PARAVIEW_ENABLE_PYTHON
- watch out for the X11 pitfalls described below
- see the Python notes below
General build command:
cd <paraview-native-build-dir>
<native-install-dir>/bin/ccmake -DPARAVIEW_BUILD_QT_GUI=0 <paraview-source-dir>
## configure with ccmake
make
Example build command:
cd ~/projects/paraview/build-native
~/install/bin/ccmake -DPARAVIEW_BUILD_QT_GUI=0 ~/projects/paraview/ParaView3
## configure with ccmake
make
Catamount build
Use the PGI compiler. Switch from GNU if you need to:
module switch PrgEnv-gnu PrgEnv-pgi
When configuring with CMake:
- turn on PARAVIEW_ENABLE_PYTHON
- turn on PARAVIEW_USE_MPI
- turn off VTK_USE_METAIO
- confirm VTK_OPENGL_HAS_OSMESA is ON
- confirm VTK_NO_PYTHON_THREADS is ON
- confirm BUILD_SHARED_LIBS is OFF
- confirm OSMESA_LIBRARY is the library you cross compiled and installed locally
- confirm PYTHON_LIBRARY is the library you cross compiled and installed locally
- set PYTHON_EXECUTABLE to a native python binary, NOT a cross compiled python binary
- see the Python notes below
General build command:
cd <paraview-catamount-build-dir>
<native-install-dir>/bin/ccmake \
  -DCMAKE_TOOLCHAIN_FILE=<toolchain-dir>/Toolchain-Catamount-pgi.cmake \
  -DParaView3CompileTools_DIR=<paraview-native-build-dir> \
  -DPARAVIEW_BUILD_QT_GUI=0 \
  -C <paraview-source-dir>/CMake/TryRunResults-ParaView3-catamount-pgi.cmake \
  <paraview-source-dir>
## configure with ccmake
make
Example build command:
cd ~/projects/paraview/build-catamount
~/install/bin/ccmake \
  -DCMAKE_TOOLCHAIN_FILE=~/toolchains/Toolchain-Catamount-pgi.cmake \
  -DParaView3CompileTools_DIR=~/projects/paraview/build-native \
  -DPARAVIEW_BUILD_QT_GUI=0 \
  -C ~/projects/paraview/ParaView3/CMake/TryRunResults-ParaView3-catamount-pgi.cmake \
  ~/projects/paraview/ParaView3
## configure with ccmake
make
Testing
Here is a simple Python script, coloredSphere.py, to test pvbatch:
from paraview.servermanager import *

Connect()
sphere = sources.SphereSource()
sphere.ThetaResolution = 100
sphere.PhiResolution = 100

filter = filters.ProcessIdScalars()
filter.Input = sphere

view = CreateRenderView()
display = CreateRepresentation(filter, view)

lt = rendering.PVLookupTable()
display.LookupTable = lt
display.ColorAttributeType = 0  # Point Data
display.ColorArrayName = "ProcessId"
lt.RGBPoints = [0.0, 0, 0, 1, 1, 1, 0, 0]
lt.ColorSpace = 1  # HSV

view.StillRender()
view.ResetCamera()
view.StillRender()
view.WriteImage("/usr/users/6/bgeveci/coloredSphere.png", "vtkPNGWriter")
Note that the script contains an absolute path for its output file, coloredSphere.png. The script could be run with the command:
mpirun -np 2 pvbatch coloredSphere.py
But on Bigben you do not enter the mpirun command directly. Instead, jobs are wrapped in a batch script and submitted to the queueing system. On Bigben, the job script coloredSphere.job might look like:
#!/bin/sh
#PBS -l size=2
#PBS -l walltime=30
#PBS -j oe
#PBS -q debug

set echo

pbsyod -size $PBS_O_SIZE ${HOME}/projects/paraview/build-catamount/bin/pvbatch ${HOME}/coloredSphere.py
The script is submitted by typing:
qsub coloredSphere.job
You can check the status of submitted jobs by typing:
qstat -a
More information about running jobs at Bigben can be found here.
Pitfalls
X11 Pitfalls
The native operating system may or may not have X11 installed. Depending on the availability of X11, you may have to compile the native version of ParaView with OSMesa instead of X.
To disable X and enable OSMesa (assuming OSMesa libraries can be found), add this flag to your ccmake command line:
-DVTK_USE_X=0
Some modifications to ParaView3/VTK/Rendering/CMakeLists.txt are pending (as of 12/19/07); when these changes are committed to CVS, the following notes will no longer be needed.
To enable OSMesa:
- erase the contents of OPENGL_gl_LIBRARY, make it an empty string
- confirm OSMESA_LIBRARY is found
- confirm OSMESA_INCLUDE_DIR is found
Even if you think you are using OSMesa and not X11, the CMake macro CHECK_FUNCTION_EXISTS might find X and decide to use it. You can fix this by:
- Set internal CMakeCache variable CMAKE_USE_GLX_PROC_ADDRESS to 0
- Fix the checks in <paraview-source-dir>/VTK/Rendering/CMakeLists.txt so that VTK_NO_EXTENSION_LOADING gets set to 1.
- Confirm that <paraview-build-dir>/VTK/Rendering/vtkOpenGLExtensionManagerConfigure.h has VTK_NO_EXTENSION_LOADING defined and all other definitions are commented out or undefined.
Python notes
The cmakeified Python does not build all modules, which means the cmakeified python binary cannot be used during the ParaView build. Set the CMake variable PYTHON_EXECUTABLE to the system's pre-installed python binary, for example /usr/bin/python2.3. If your system does not have Python installed, you must build it from source (native build only; no need to cross compile) using Python's official build system in place of CMake. The problem is only with the python binary, not with the python library; you will still link against the cmakeified python library.
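For example, on Bigben the catamount ParaView configure line would include the following flag (the path is Bigben's system python; adjust for your machine):

```
-DPYTHON_EXECUTABLE=/usr/bin/python2.3
```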
You only need to continue reading if you are interested in fixing the cmakeified python binary. The specific problem occurs during ParaView's Python wrapping stage while compiling: a Python script imports the module "compileall", which fails when it in turn imports "os". With the cmakeified python binary you can reproduce the failure with a one line script:
import compileall
The root of the problem can be seen from a python interpreter:
>>> import sys
>>> _names = sys.builtin_module_names
>>> print _names
('__builtin__', '__main__', '_ast', '_types', 'exceptions', 'gc', 'imp', 'marshal', 'sys')
## the list above should contain something like 'posix'
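The same check can be run non-interactively against any python binary. This is a small sketch; 'posix' is the builtin platform backend that "import os" needs on Unix:

```python
import sys

# "import os" needs a platform backend compiled into the interpreter;
# on Unix that backend is the builtin 'posix' module.
print('posix' in sys.builtin_module_names)  # a healthy Unix python prints True
```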
You should have no problems linking ParaView against your natively compiled Python libraries; the problem is only with the binary. For more information see building python with cmake and module not found.
When this problem has been solved, these instructions may be relevant:
Python paths
A python binary is used during the ParaView build process to generate source files for later build targets. If you have compiled a native python binary that you would like to use, make sure the advanced CMake variables PYTHON_EXECUTABLE, PYTHON_LIBRARY, and PYTHON_INCLUDE_PATH are correct. If you compiled a native version of Python with shared libraries and installed it to a local directory, it may fail to load because the install directory is not in your LD_LIBRARY_PATH. You can test this by typing:
ldd <native-install-dir>/bin/python
## bad: libpython2.6.so => not found
You can solve the problem by adding <native-install-dir>/lib to your LD_LIBRARY_PATH, or by setting the CMake variable PYTHON_EXECUTABLE to the python binary inside your python build directory, since that binary uses rpath.
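A sketch of the first option, assuming the example install prefix ~/install used on this page and an sh-style shell:

```shell
# Prepend the local install's lib dir to the dynamic loader path.
PREFIX="$HOME/install"
export LD_LIBRARY_PATH="$PREFIX/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
echo "$LD_LIBRARY_PATH"
```

The ${VAR:+...} expansion avoids leaving a trailing ':' when LD_LIBRARY_PATH was previously unset.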