[Paraview] PV 3.10.1 - compile error w.r.t. MapReduceMPI
Paul Melis
paul.melis at sara.nl
Fri Aug 5 10:50:09 EDT 2011
Hi,
I just noticed my CMake is 2.8.2; would trying the latest version help here?
Paul
On 08/05/2011 04:47 PM, Paul Melis wrote:
> Hi David,
>
> On 08/05/2011 02:55 PM, David Partyka wrote:
>> Have you turned MPI On then Off and then back On again? I've seen it get
>> confused when this happens.
>
> Tried that; it doesn't help, unfortunately.
>
>> Can you try a fresh (delete your build tree)
>> build of 3.10.1 and turn everything on at once?
>
> A fresh build in a clean build directory (together with the cmake line
> below) doesn't solve the problem. Note that I'm not setting the OpenMPI
> header/library locations directly, but letting CMake detect them. I
> assume that's not a problem?
>
> Regards,
> Paul
>
>>
>> On Fri, Aug 5, 2011 at 8:15 AM, Paul Melis <paul.melis at sara.nl
>> <mailto:paul.melis at sara.nl>> wrote:
>>
>> Hi,
>>
>> I'm building PV 3.8.1 and PV 3.10.1 on exactly the same 64-bit Debian
>> 6.0 system, using the exact same CMake configuration lines:
>>
>> #!/bin/sh
>> cmake \
>> -DCMAKE_BUILD_TYPE=Release \
>> -DMANTA_BUILD=$HOME/c/manta-2439-build \
>> -DMANTA_SOURCE=$HOME/c/manta-2439 \
>> -DPARAVIEW_BUILD_PLUGIN_Manta=ON \
>> -DPARAVIEW_ENABLE_PYTHON=ON \
>> -DPARAVIEW_USE_MPI=ON \
>> <source-dir>
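If FindMPI keeps coming up empty for the mrmpi target, one thing worth trying is pinning the MPI locations explicitly instead of relying on autodetection. A minimal sketch, assuming Debian's OpenMPI layout under /usr/lib/openmpi (the paths are the ones the cache output below reports; the variable names are those of CMake 2.8's FindMPI module):

```shell
#!/bin/sh
# Sketch: same invocation as above, but handing FindMPI the compiler,
# header, and library locations directly rather than letting it probe.
# Paths assume Debian 6.0's OpenMPI packages under /usr/lib/openmpi.
cmake \
    -DCMAKE_BUILD_TYPE=Release \
    -DPARAVIEW_USE_MPI=ON \
    -DMPI_COMPILER=/usr/bin/mpic++ \
    -DMPI_INCLUDE_PATH="/usr/lib/openmpi/include;/usr/lib/openmpi/include/openmpi" \
    -DMPI_LIBRARY=/usr/lib/openmpi/lib/libmpi_cxx.so \
    <source-dir>
```

Running this in a clean build directory avoids stale cache entries left over from an earlier MPI on/off toggle.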
>>
>> The 3.8.1 build succeeds, but the 3.10.1 build fails because mpi.h is
>> not being found:
>>
>> [ 4%] Built target ProcessShader
>> [ 4%] Built target vtkMaterialLibraryConfiguredFiles
>> [ 5%] Built target vtkproj4
>> [ 5%] Built target lproj
>> [ 5%] Building CXX object
>> VTK/Utilities/mrmpi/src/CMakeFiles/MapReduceMPI.dir/mapreduce.cpp.o
>> /home/opti/c/ParaView-3.10.1/VTK/Utilities/mrmpi/src/mapreduce.cpp:14:17:
>> error:
>> mpi.h: No such file or directory
>> In file included from
>> /home/opti/c/ParaView-3.10.1/VTK/Utilities/mrmpi/src/mapreduce.cpp:22:
>> /home/opti/c/ParaView-3.10.1/VTK/Utilities/mrmpi/src/mapreduce.h:38:
>> error: field ‘MPI_Comm’ has incomplete type
>> /home/opti/c/ParaView-3.10.1/VTK/Utilities/mrmpi/src/mapreduce.h:81:
>> error: ‘MPI_Comm’ does not name a type
>> /home/opti/c/ParaView-3.10.1/VTK/Utilities/mrmpi/src/mapreduce.h:93:
>> error: ‘MPI_Comm’ does not name a type
>> In file included from
>> /home/opti/c/ParaView-3.10.1/VTK/Utilities/mrmpi/src/mapreduce.cpp:23:
>> /home/opti/c/ParaView-3.10.1/VTK/Utilities/mrmpi/src/keyvalue.h:36:
>> error: field ‘MPI_Comm’ has incomplete type
>>
>> The relevant CMake MPI variables seem to be set correctly:
>>
>> opti at optihd0:~/c/paraview-3101-release$ grep ^MPI_ CMakeCache.txt
>> MPI_COMPILER:FILEPATH=/usr/bin/mpic++
>> MPI_COMPILE_FLAGS:STRING=
>> MPI_EXTRA_LIBRARY:STRING=/usr/lib/openmpi/lib/libmpi.so;/usr/lib/openmpi/lib/libopen-rte.so;/usr/lib/openmpi/lib/libopen-pal.so;/usr/lib/libdl.so;/usr/lib/libnsl.so;/usr/lib/libutil.so;/usr/lib/libm.so;/usr/lib/libdl.so
>> MPI_INCLUDE_PATH:STRING=/usr/lib/openmpi/include;/usr/lib/openmpi/include/openmpi
>> MPI_LIBRARY:FILEPATH=/usr/lib/openmpi/lib/libmpi_cxx.so
>> MPI_LINK_FLAGS:STRING=-Wl,--export-dynamic
>> MPI_COMPILER-ADVANCED:INTERNAL=1
>> MPI_COMPILE_FLAGS-ADVANCED:INTERNAL=1
>> MPI_EXTRA_LIBRARY-ADVANCED:INTERNAL=0
>> MPI_INCLUDE_PATH-ADVANCED:INTERNAL=0
>> MPI_LIB:INTERNAL=MPI_LIB-NOTFOUND
>> MPI_LIBRARY-ADVANCED:INTERNAL=0
>> MPI_LINK_FLAGS-ADVANCED:INTERNAL=1
>> opti at optihd0:~/c/paraview-3101-release$ find /usr/lib -name mpi.h
>> /usr/lib/openmpi/include/mpi.h
>>
>> The MPI-related cache variables are almost the same between 3.8 and
>> 3.10, and the differences show nothing related to missing include
>> paths:
>>
>> opti at optihd0:~/c/paraview-3101-release$ grep MPI_ CMakeCache.txt >
>> mpi310.txt
>> opti at optihd0:~/c/paraview-3101-release$ grep MPI_
>> ../paraview-381-release/CMakeCache.txt > mpi38.txt
>> opti at optihd0:~/c/paraview-3101-release$ diff -u mpi38.txt mpi310.txt
>> --- mpi38.txt 2011-08-05 14:13:28.781521205 +0200
>> +++ mpi310.txt 2011-08-05 14:13:24.305022071 +0200
>> @@ -1,3 +1,4 @@
>> +IceTMPI_LIB_DEPENDS:STATIC=general;m;general;IceTCore;general;/usr/lib/openmpi/lib/libmpi_cxx.so;general;/usr/lib/openmpi/lib/libmpi.so;general;/usr/lib/openmpi/lib/libopen-rte.so;general;/usr/lib/openmpi/lib/libopen-pal.so;general;/usr/lib/libdl.so;general;/usr/lib/libnsl.so;general;/usr/lib/libutil.so;general;/usr/lib/libm.so;general;/usr/lib/libdl.so;
>> MPI_COMPILER:FILEPATH=/usr/bin/mpic++
>> MPI_COMPILE_FLAGS:STRING=
>> MPI_EXTRA_LIBRARY:STRING=/usr/lib/openmpi/lib/libmpi.so;/usr/lib/openmpi/lib/libopen-rte.so;/usr/lib/openmpi/lib/libopen-pal.so;/usr/lib/libdl.so;/usr/lib/libnsl.so;/usr/lib/libutil.so;/usr/lib/libm.so;/usr/lib/libdl.so
>> @@ -15,12 +16,8 @@
>> ICET_MPI_MAX_NUMPROCS-ADVANCED:INTERNAL=1
>> //This is set from VTK_MPI_MAX_NUMPROCS.
>> ICET_MPI_MAX_NUMPROCS:INTERNAL=2
>> -//ADVANCED property for variable: ICET_MPI_POSTFLAGS
>> -ICET_MPI_POSTFLAGS-ADVANCED:INTERNAL=1
>> //This is set from VTK_MPI_POSTFLAGS.
>> ICET_MPI_POSTFLAGS:INTERNAL=
>> -//ADVANCED property for variable: ICET_MPI_PREFLAGS
>> -ICET_MPI_PREFLAGS-ADVANCED:INTERNAL=1
>> //This is set from a combination of VTK_MPI_PREFLAGS
>> VTK_MPI_NUMPROC_FLAG
>> // VTK_MPI_MAX_NUMPROCS VTK_MPI_PREFLAGS.
>> ICET_MPI_PREFLAGS:INTERNAL=;-np;2;
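For what it's worth, the semicolon-separated MPI_INCLUDE_PATH cache entry is supposed to end up as -I flags on the MapReduceMPI compile line, so expanding it by hand shows what the failing compile should have received. A small self-contained sketch (the cache line is embedded here so the snippet stands alone; in a real build tree it would be grepped out of CMakeCache.txt):

```shell
# Sketch: expand a semicolon-separated MPI_INCLUDE_PATH cache entry into
# the -I flags the MapReduceMPI compile line should receive.
# The sample line mirrors the cache contents shown above.
cache_line='MPI_INCLUDE_PATH:STRING=/usr/lib/openmpi/include;/usr/lib/openmpi/include/openmpi'
# Strip everything up to the '=', split on ';', prefix each path with -I.
flags=$(printf '%s\n' "${cache_line#*=}" | tr ';' '\n' | sed 's/^/-I/' | tr '\n' ' ')
echo "$flags"
```

Comparing that output against the actual compile command (e.g. via a verbose rebuild of the mrmpi directory) would show whether the cache value simply never reaches the target's compile flags in 3.10.1.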
>>
>> Any clues?
>>
>> Regards,
>> Paul
>> _______________________________________________
>> Powered by www.kitware.com <http://www.kitware.com>
>>
>> Visit other Kitware open-source projects at
>> http://www.kitware.com/opensource/opensource.html
>>
>> Please keep messages on-topic and check the ParaView Wiki at:
>> http://paraview.org/Wiki/ParaView
>>
>> Follow this link to subscribe/unsubscribe:
>> http://www.paraview.org/mailman/listinfo/paraview
>>
>>
>