[Paraview] Building on Titan using ParaViewSuperbuild
David E DeMarle
dave.demarle at kitware.com
Fri Aug 30 10:43:52 EDT 2013
That was a P.
When Mira at ANL (a Q) came out a few months ago, I briefly attempted to
port the xlc_bgp config into an xlc_bgq config, but got bogged down in
miscellaneous compilation issues and put it on hold.
It won't be that hard to do; it just takes more time and patience than I
had at the time.
David E DeMarle
Kitware, Inc.
R&D Engineer
21 Corporate Drive
Clifton Park, NY 12065-8662
Phone: 518-881-4909
On Fri, Aug 30, 2013 at 10:10 AM, Benson Muite <benson_muite at yahoo.com> wrote:
> Hi,
>
> Can you let me know whether Catalyst was built on a BG/P or a BG/Q?
>
> Thanks,
> Benson
>
> On 30/08/2013 11:24, paraview-request at paraview.org wrote:
> > 1. Re: Building on Titan using ParaViewSuperbuild (David E DeMarle)
> >
> >
> > ----------------------------------------------------------------------
> >
> > Message: 1
> > Date: Thu, 29 Aug 2013 16:08:13 -0400
> > From: David E DeMarle <dave.demarle at kitware.com>
> > Subject: Re: [Paraview] Building on Titan using ParaViewSuperbuild
> > To: "Vanmoer, Mark W" <mvanmoer at illinois.edu>
> > Cc: "paraview at paraview.org" <paraview at paraview.org>
> > Message-ID:
> > <CANjZAi-1K+BfLgLEvPeQdAu+wPT-AQKU0azDDcE4kCd-rOT1yg at mail.gmail.com>
> > Content-Type: text/plain; charset="iso-8859-1"
> >
> > On Thu, Aug 29, 2013 at 3:51 PM, Vanmoer, Mark W
> > <mvanmoer at illinois.edu> wrote:
> >
> >> So coprocessing will not be built using the below instructions? I would
> >> have mentioned that, but coprocessing appears to still be part of a
> >> regular, non-cross-compile build, so I figured it was part of
> >> ENABLE_paraview.
> >>
> > The coprocessing plugin, which adds things to the GUI to make it easy to
> > record coprocessing pipeline setups, doesn't need to be turned on, since
> > that lives in the client only. (It is like Python trace or state
> > recording, but tailored to recording in-situ setups.)
> >
> > Catalyst (the stripped down version of the ParaView server that a
> > simulation code can link to and use to run those recorded pipelines
> > quickly) is not yet an option in ParaViewSuperbuild. Cross compiling
> > Catalyst will require a bit more work. It will follow the same plan as
> > how the ParaView server is compiled; I just haven't tried it yet. When I
> > cross compiled Catalyst last year at this time, I did the same steps that
> > ParaViewSuperbuild's TOOLS and CROSS build passes do, just by hand.
> >
> >> Also, for the below configcross.sh, do we need to pass in a CMake
> >> variable telling it where the tools build dir is located?
> >>
> >>
> > That should be an option that you can easily set, but it isn't, sorry.
> >
> > CMake/CrossCompilationMacros.cmake assumes it can find it one directory up
> > and over, like so:
> >
> >   macro(find_hosttools)
> >     set(PARAVIEW_HOSTTOOLS_DIR ${CMAKE_BINARY_DIR}/../tools/paraview/src/paraview-build/
> >         CACHE PATH "Location of host built paraview compile tools directory")
> >     set(PYTHON_HOST_EXE ${CMAKE_BINARY_DIR}/../tools/install/bin/python
> >         CACHE PATH "Location of host built python executable")
> >     set(PYTHON_HOST_LIBDIR ${CMAKE_BINARY_DIR}/../tools/install/lib
> >         CACHE PATH "Location of host built python libraries")
> >     set(BOOST_HOST_INCLUDEDIR ${CMAKE_BINARY_DIR}/../tools/install/include
> >         CACHE PATH "Location of host built boost headers")
> >   endmacro()
> >
> > You could predefine all four of those if you like.
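> >
> > For example, a minimal (untested) sketch of what that could look like when
> > configuring the CROSS stage; since those set(... CACHE ...) calls do not
> > force, values you pass on the command line win. The /path/to/tools prefix
> > is only a placeholder for wherever your TOOLS stage actually lives:
> >
> >   cmake \
> >     -DCROSS_BUILD_STAGE:STRING=CROSS -Dcross_target:STRING=xk7_gnu \
> >     -DPARAVIEW_HOSTTOOLS_DIR:PATH=/path/to/tools/paraview/src/paraview-build \
> >     -DPYTHON_HOST_EXE:PATH=/path/to/tools/install/bin/python \
> >     -DPYTHON_HOST_LIBDIR:PATH=/path/to/tools/install/lib \
> >     -DBOOST_HOST_INCLUDEDIR:PATH=/path/to/tools/install/include \
> >     ../../ParaViewSuperbuild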
> >
> >
> >> Thanks,
> >>
> >> Mark
> >>
> >> *From:* David E DeMarle [mailto:dave.demarle at kitware.com]
> >> *Sent:* Thursday, August 29, 2013 1:41 PM
> >> *To:* Hong Yi
> >> *Cc:* Vanmoer, Mark W; paraview at paraview.org
> >>
> >> *Subject:* Re: [Paraview] Building on Titan using ParaViewSuperbuild
> >>
> >> On Thu, Aug 29, 2013 at 2:13 PM, Hong Yi <hongyi at renci.org> wrote:
> >>
> >> Hi David,
> >>
> >> I just started to try superbuild on Titan also. I don't see you set
> >> ENABLE_MPI to be true in your configure script. Could you confirm whether
> >> ENABLE_MPI needs to be set to TRUE in order for ParaView to run on Titan
> >> in parallel? Since my purpose is to link our
> >>
> >> The ENABLE_MPI flag at the Superbuild level is unrelated. It has a purpose
> >> only when CROSS_BUILD_STAGE=HOST, that is, when making ParaView binary
> >> installers for desktops from Superbuild.
> >>
> >> You shouldn't turn it on in the TOOLS or CROSS stages. Instead, let the
> >> CROSS stage use the system-installed MPI. It does that by turning
> >> PARAVIEW_USE_MPI=ON when it configures the ParaView sub-build. See
> >> CMake/crosscompile/xk7_gnu to see where it does that, and to see the
> >> other flags it uses.
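> >>
> >> If you want to double-check that the ParaView sub-build picked up MPI, one
> >> way is to look at its CMake cache after the CROSS stage has configured it;
> >> the cache path below is only a guess based on the superbuild layout, so
> >> adjust it to your cross build tree:
> >>
> >>   grep PARAVIEW_USE_MPI paraview/src/paraview-build/CMakeCache.txt
> >>   # expected: PARAVIEW_USE_MPI:BOOL=ON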
> >>
> >> simulation code (already built statically with CMake on Titan) to
> >> ParaView CoProcessing libraries (I am using version 3.98.1) for in-situ
> >> visualization on Titan, so in this case, do I have to set ENABLE_paraview
> >> to true, and do I need to enable OSMesa for ParaView to resort to
> >> off-screen rendering for in-situ visualization?
> >>
> >> The CROSS stage turns on Python, Mesa, and ParaView. Titan's accelerators
> >> don't really run X11, so Mesa is the only option for rendering there.
> >>
> >> Although I can build ParaView from source on Titan login nodes, I am not
> >> able to run it on compute nodes, so I am starting to try superbuild,
> >> hoping to be able to cross build ParaView libraries to run in-situ
> >> visualization on Titan.
> >>
> >> I've cross compiled Catalyst itself before, on a Blue Gene. I did it
> >> manually, before SuperBuild existed. I will see if I can dig up my config
> >> scripts. Cross compiling Catalyst should be more or less the same thing as
> >> cross compiling ParaView, but a bit faster and easier because there is
> >> less code involved.
> >>
> >> Thanks,
> >> Hong
> >> ------------------------------
> >>
> >> *From:* paraview-bounces at paraview.org [paraview-bounces at paraview.org]
> >> on behalf of David E DeMarle [dave.demarle at kitware.com]
> >> *Sent:* Thursday, August 29, 2013 1:21 PM
> >> *To:* Vanmoer, Mark W
> >> *Cc:* paraview at paraview.org
> >> *Subject:* Re: [Paraview] Building on Titan using ParaViewSuperbuild
> >>
> >>
> >> Your tools build is pointing to the compiler wrapper that you normally
> >> would use to make code for the back end.
> >>
> >> The tools build should just use plain old local gcc, since we only build
> >> things at that point that run on the login node.
> >>
> >> Try these setup scripts. I source configtools.sh to set up my environment
> >> before I build the compile tools, and configcross.sh before I cross
> >> compile ParaView.
> >>
> >> configtools.sh:
> >>
> >>   #use my own cmake, the system one is too old
> >>   setenv PATH /autofs/na4_proj/csc035/demarle/pvdev/titan/cmake-build/bin:${PATH}
> >>
> >>   #switch the compiler to compile code for the front end
> >>   module unload PrgEnv-pgi
> >>   module load gcc
> >>
> >>   #configure settings to build the compile tools only
> >>   cmake \
> >>     -DCROSS_BUILD_STAGE:STRING=TOOLS -Dcross_target:STRING=xk7_gnu \
> >>     -DCMAKE_BUILD_TYPE:STRING=Release \
> >>     -DBUILD_TESTING:BOOL=FALSE \
> >>     -DParaView_FROM_GIT:BOOL=OFF \
> >>     -DENABLE_paraview:BOOL=TRUE \
> >>     -DENABLE_boost:BOOL=TRUE \
> >>     -DENABLE_python:BOOL=TRUE \
> >>     -DENABLE_portfwd:BOOL=FALSE \
> >>     ../../ParaViewSuperbuild
> >>
> >> then make
> >>
> >> configcross.sh:
> >>
> >>   #use my own cmake, the system one is too old
> >>   setenv PATH /autofs/na4_proj/csc035/demarle/pvdev/titan/cmake-build/bin:${PATH}
> >>
> >>   #switch the compiler to compile code for the back end
> >>   module unload PrgEnv-pgi
> >>   module unload gcc
> >>   module load PrgEnv-gnu
> >>
> >>   #not sure why module load wasn't sufficient, but I ended up needing to
> >>   #force cmake to choose the right compiler
> >>   setenv CC /opt/cray/xt-asyncpe/5.17/bin/cc
> >>   setenv CXX /opt/cray/xt-asyncpe/5.17/bin/CC
> >>
> >>   #configure settings to cross compile python, mesa (implied), and paraview
> >>   cmake \
> >>     -DCROSS_BUILD_STAGE:STRING=CROSS -Dcross_target:STRING=xk7_gnu \
> >>     -DCMAKE_BUILD_TYPE:STRING=Release \
> >>     -DBUILD_TESTING:BOOL=TRUE \
> >>     -DParaView_FROM_GIT:BOOL=OFF \
> >>     -DENABLE_paraview:BOOL=TRUE \
> >>     -DENABLE_python:BOOL=TRUE \
> >>     ../../ParaViewSuperbuild
> >>
> >> then make again
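> >>
> >> Putting the two stages together, the overall workflow looks roughly like
> >> this (a sketch only; the directory names are placeholders, chosen to match
> >> the ../../ParaViewSuperbuild source path used in the scripts and the
> >> ../tools location that find_hosttools expects):
> >>
> >>   # assumed layout:
> >>   #   <top>/ParaViewSuperbuild   - source checkout
> >>   #   <top>/builds/tools         - TOOLS stage build directory
> >>   #   <top>/builds/cross         - CROSS stage build directory
> >>   # (the scripts use csh syntax, so run them from a csh/tcsh shell)
> >>   cd <top>/builds/tools
> >>   source /path/to/configtools.sh   # front-end compilers, TOOLS configure
> >>   make
> >>   cd ../cross
> >>   source /path/to/configcross.sh   # back-end compilers, CROSS configure
> >>   make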
> >>
> >> David E DeMarle
> >> Kitware, Inc.
> >> R&D Engineer
> >> 21 Corporate Drive
> >> Clifton Park, NY 12065-8662
> >> Phone: 518-881-4909
> >>
> >> On Tue, Aug 27, 2013 at 4:26 PM, Vanmoer, Mark W
> >> <mvanmoer at illinois.edu> wrote:
> >>
> >> Hi, I'm trying to follow the advice on building ParaView on Titan using
> >> the ParaViewSuperbuild method from an earlier discussion in June.
> >>
> >> When I run make in the "TOOLS" directory I get the following error:
> >>
> >>   [ 66%] Building CXX object Utilities/ProcessXML/CMakeFiles/kwProcessXML.dir/ProcessXML.cxx.o
> >>   Linking CXX executable ../../bin/vtkkwProcessXML-pv4.0
> >>   /usr/bin/ld: attempted static link of dynamic object `../../lib/libvtkCommonCore-pv4.0.so.1'
> >>   collect2: error: ld returned 1 exit status
> >>   make[6]: *** [bin/vtkkwProcessXML-pv4.0] Error 1
> >>   make[5]: *** [Utilities/ProcessXML/CMakeFiles/kwProcessXML.dir/all] Error 2
> >>   make[4]: *** [CMakeFiles/pvCompileTools.dir/rule] Error 2
> >>   make[3]: *** [pvCompileTools] Error 2
> >>   CMake Error at /ccs/home/vanmoer/builds/superbuild/tools-build/pv-paraview-build.cmake:26 (message):
> >>     Failed!!!
> >>   make[2]: *** [paraview/src/paraview-stamp/paraview-build] Error 1
> >>
> >> I don't see a BUILD_SHARED that I can toggle. All of the
> >> CMAKE_SHARED_LINKER_FLAGS* vars are empty, if those are related.
> >>
> >> Any suggestions?
> >>
> >> Thanks,
> >>
> >> Mark
> >>
> >>
> >
> > ------------------------------
> >
> > Message: 2
> > Date: Fri, 30 Aug 2013 10:24:37 +0200
> > From: Richard GRENON <richard.grenon at onera.fr>
> > Subject: Re: [Paraview] Performance of the CGNS Reader
> > To: Mickael Philit <mickey.phy at gmail.com>
> > Cc: "Angelini, Richard C \(Rick\) CIV USARMY ARL \(US\)"
> > <richard.c.angelini.civ at mail.mil>, "paraview at paraview.org"
> > <paraview at paraview.org>
> > Message-ID: <522056C5.3060202 at onera.fr>
> > Content-Type: text/plain; charset=ISO-8859-1; format=flowed
> >
> > Thank you for this answer, Mickael.
> >
> > My 1.36 GB CGNS dataset is built from structured meshes, so ParaView
> > should not 'eat' too much memory. I have checked that enabling
> > multi-core in ParaView does not change anything: PV always needs about
> > 15 minutes to load my dataset, the same loading time as without multi-core.
> >
> > I think that the loading time ratio of 15 min / 1 min for PV against
> > Tecplot remains too high, even if PV parses the file twice. If Tecplot
> > takes advantage of multi-core (I don't know), the loading time ratio
> > between PV and Tecplot should not exceed 8 when using 4 CPUs. A ratio of
> > 15, leading to 15 minutes of loading time or more for larger datasets, is
> > unacceptable for interactivity. So PV is unusable for large CGNS datasets
> > except in batch mode. I think that an effort to redesign the CGNS reader
> > would be welcome.
> >
> > Best regards.
> >
> > Richard
> >
> > On 29/08/2013 at 20:55, Mickael Philit wrote:
> >> Hello,
> >>
> >> First, the CGNS reader coming through the VisItBridge does not work in
> >> parallel; it's a plain serial reader.
> >> Second, there are limitations to the way the current CGNS reader does
> >> things:
> >> - At the beginning, it parses the whole file (this takes a lot of
> >> time) to get variable names, blocks, and so on, before actually reading
> >> the data. [I think that Tecplot is cleaner because it seems to read
> >> the whole CGNS file in one pass.]
> >> - Meshes are read into a temporary array and converted to a VTK vector
> >> of coordinates (thus extra memory manipulation).
> >> - For unstructured meshes, the conversion of cell connectivity from
> >> 'integer' to 'long' eats memory.
> >> The CGNS reader can be improved, but at the cost of redesigning some
> >> parts to fit better into ParaView and to go parallel.
> >>
> >> Mickael
> >>
> >>
> >> On 29/08/2013 16:50, Angelini, Richard C (Rick) CIV USARMY ARL (US)
> >> wrote:
> >>> As a follow-up that may be related: does the CGNS reader
> >>> through the VisItBridge work in parallel? I've loaded up a couple
> >>> of different CGNS datasets and then applied the ProcessIDScalars
> >>> filter, and it doesn't appear to be distributing the data - even for
> >>> multi-block CGNS files.
> >>>
> >>>
> >>> ________________________________
> >>> Rick Angelini
> >>>
> >>> USArmy Research Laboratory
> >>> CISD/HPC Architectures Team
> >>> Building 120 Cube 315
> >>> Phone: 410-278-6266
> >>>
> >>> ________________________________________
> >>> From: paraview-bounces at paraview.org [paraview-bounces at paraview.org]
> >>> on behalf of Richard GRENON [richard.grenon at onera.fr]
> >>> Sent: Thursday, August 29, 2013 10:38 AM
> >>> To: paraview at paraview.org
> >>> Subject: [Paraview] Performance of the CGNS Reader
> >>>
> >>> Hello.
> >>>
> >>> I am testing the CGNS reader of ParaView 4.0.1 64-bit running on a
> >>> Linux workstation with 4 CPUs and 5.8 GB of memory. ParaView was
> >>> installed from the binaries available on the download page.
> >>>
> >>> I am trying to load a 1.36 GB CGNS file that is available over
> >>> the network.
> >>>
> >>> While loading this file, the ParaView window is frozen and cannot be
> >>> refreshed, and I must check with the "ps" command on a terminal window
> >>> or with a system monitor whether PV is still running or really
> >>> frozen. A progress bar for all readers would be welcome in a future
> >>> release.
> >>>
> >>> Finally, the file does load, but it always takes about 15 minutes
> >>> (plus or minus 1 minute depending on the load on the network), while
> >>> Tecplot always loads the same file in less than 1 minute!
> >>>
> >>> How do you explain this poor performance of the CGNS reader? Can it be
> >>> improved, or am I missing something? Is there some ParaView option that
> >>> could reduce the loading time of large files?
> >>>
> >>> Best regards
> >>>
> >>> --
> >>> Richard GRENON
> >>> ONERA
> >>> Departement d'Aerodynamique Appliquee - DAAP/ACI
> >>> 8 rue des Vertugadins
> >>> 92190 MEUDON - FRANCE
> >>> phone : +33 1 46 73 42 17
> >>> fax : +33 1 46 73 41 46
> >>> mailto:Richard.Grenon at onera.fr
> >>> http://www.onera.fr
> >>>
> >>
> >
>