[Paraview] Running Paraview on an HPC machine with multiple processes?
Jake Gerard
jake.a.gerard at gmail.com
Wed Jul 12 15:33:49 EDT 2017
Andy,
I wasn't setting up the server manually or anything. I have just been connecting
through one of my preset configurations. The velocity data shows up fine, but
for some reason, when there is more than one processor working on the
visualization, ParaView can't read the pressure data.
Here is the error:
ERROR: In
/p/home/angel/PV/4.3.1/Build_4.3.1_osmesa/paraview/src/paraview/VTK/IO/Xdmf2/vtkXdmfHeavyData.cxx,
line 1128
vtkXdmfReader (0x104ae20): Failed to read attribute data
The setup has just been clicking the Connect icon, picking the system, and
choosing the number of nodes and processors per node. If it helps, I'm on
ParaView 4.3.1 64-bit. Sorry, I am pretty new to all of this stuff; I'm slowly
getting the hang of things.
Jake
On Wed, Jul 12, 2017 at 1:48 PM, Andy Smith <agsmith424 at gmail.com> wrote:
> Jake,
>
> If I start up the remote server in parallel using something like:
> mpiexec.hydra -n 8 pvserver -sp=11111
>
> and then connect from my workstation I am able to read an XDMF file like
> the one you posted using ParaView 5.4.0. Can you post how you are starting
> your server and the exact error that you receive?
>
> I've attached the .xmf file and a Python script to generate dummy data in
> a .h5 file that corresponds to your data structure.
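> The attached script isn't reproduced in the archive, but a minimal sketch of
> such a generator, assuming h5py and NumPy and the dataset names from your
> .xmf, could look like this:
>
> import h5py
> import numpy as np
>
> # Shape matches the Dimensions="91 19 19" declared in the XDMF file.
> shape = (91, 19, 19)
>
> with h5py.File("out.h5", "w") as f:
>     # Scalar dataset referenced by the "pressure" attribute.
>     f.create_dataset("/pres_group/presmag", data=np.random.rand(*shape))
>     # Component datasets joined by the JOIN($0, $1, $2) function item
>     # to form the "velocity" vector attribute.
>     f.create_dataset("/velo_group/x_velo", data=np.random.rand(*shape))
>     f.create_dataset("/velo_group/y_velo", data=np.random.rand(*shape))
>     f.create_dataset("/velo_group/z_velo", data=np.random.rand(*shape))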
>
> Note that I am using the xdmf2 reader for my testing. The xdmf3 reader
> fails to read the file (even locally in serial) with the following error:
>
> Type not one of accepted values: INTEGER in XdmfArrayType::New
> terminate called after throwing an instance of 'XdmfError'
> what(): Type not one of accepted values: INTEGER in XdmfArrayType::New
> Aborted
>
>
> -Andy
>
> On Wed, Jul 12, 2017 at 1:31 PM, Jake Gerard <jake.a.gerard at gmail.com>
> wrote:
>
>> Alessandro,
>>
>> ParaView says that it failed to read the attribute data. I just mean that
>> when I connect ParaView to the machine, the pressure data does not load if
>> I select more than one node or more than one process per node.
>>
>> Jake
>>
>> On Wed, Jul 12, 2017 at 11:00 AM, Alessandro De Maio <demaio.a at gmail.com>
>> wrote:
>>
>>> Hi Jake,
>>> when you talk about running in multi-processing, are you talking
>>> about the solver that produces the data or about running ParaView in
>>> MPI mode?
>>> What error do you get?
>>>
>>> Alessandro
>>>
>>> On Wed, Jul 12, 2017 at 5:28 PM, Jake Gerard <jake.a.gerard at gmail.com>
>>> wrote:
>>>
>>>> Any help here would be greatly appreciated.
>>>>
>>>> On Tue, Jul 11, 2017 at 2:37 PM, Jake Gerard <jake.a.gerard at gmail.com>
>>>> wrote:
>>>>
>>>>> Good Afternoon,
>>>>>
>>>>> I have been moving to an HDF5/XDMF system for analyzing big data from
>>>>> a computational fluid model. I finally got all the basic components working
>>>>> on my local machine but have run into problems when trying to run on an HPC
>>>>> system. Here is the XDMF file:
>>>>>
>>>>> <?xml version="1.0" ?>
>>>>> <!DOCTYPE Xdmf SYSTEM "Xdmf.dtd" []>
>>>>> <Xdmf xmlns:xi="http://www.w3.org/2003/XInclude" Version="2.1">
>>>>>   <Domain>
>>>>>     <Grid Name="my_Grid" GridType="Uniform">
>>>>>       <Topology TopologyType="3DCoRectMesh" Dimensions="91 19 19">
>>>>>       </Topology>
>>>>>       <Geometry GeometryType="Origin_DxDyDz">
>>>>>         <DataItem Dimensions="3" NumberType="Integer" Format="XML">
>>>>>           0 0 0
>>>>>         </DataItem>
>>>>>         <DataItem Dimensions="3" NumberType="Integer" Format="XML">
>>>>>           1 1 1
>>>>>         </DataItem>
>>>>>       </Geometry>
>>>>>       <Attribute Name="pressure" AttributeType="Scalar" Center="Node">
>>>>>         <DataItem Dimensions="91 19 19" NumberType="Float" Format="HDF">
>>>>>           out.h5:/pres_group/presmag
>>>>>         </DataItem>
>>>>>       </Attribute>
>>>>>       <Attribute Name="velocity" AttributeType="Vector" Center="Node">
>>>>>         <DataItem ItemType="Function" Function="JOIN($0, $1, $2)" Dimensions="91 19 19 3">
>>>>>           <DataItem Dimensions="91 19 19" NumberType="Float" Format="HDF">
>>>>>             out.h5:/velo_group/x_velo
>>>>>           </DataItem>
>>>>>           <DataItem Dimensions="91 19 19" NumberType="Float" Format="HDF">
>>>>>             out.h5:/velo_group/y_velo
>>>>>           </DataItem>
>>>>>           <DataItem Dimensions="91 19 19" NumberType="Float" Format="HDF">
>>>>>             out.h5:/velo_group/z_velo
>>>>>           </DataItem>
>>>>>         </DataItem>
>>>>>       </Attribute>
>>>>>     </Grid>
>>>>>   </Domain>
>>>>> </Xdmf>
>>>>>
>>>>> This has worked properly on my local machine. However, when I tried this
>>>>> on multiple processes, I got an error about failing to read the pressure
>>>>> data. The vector data for velocity was fine, but the pressure data could
>>>>> not be read. I narrowed the problem down to something regarding the
>>>>> number of processes, because the pressure data worked fine on the HPC
>>>>> machine if I ran it on only one process. Is there anything that sticks
>>>>> out that could be causing this problem? For instance, is there a
>>>>> different format for these files when they are run on multiple processes?
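>>>>>
>>>>> A quick sanity check on my end (sketched here with h5py, using the
>>>>> dataset names from the .xmf above) would be to confirm that each
>>>>> referenced dataset exists and has the declared 91 x 19 x 19 shape, in
>>>>> case something only shows up once the data is read in parallel:
>>>>>
>>>>> import h5py
>>>>>
>>>>> expected = (91, 19, 19)
>>>>> paths = ["/pres_group/presmag",
>>>>>          "/velo_group/x_velo",
>>>>>          "/velo_group/y_velo",
>>>>>          "/velo_group/z_velo"]
>>>>>
>>>>> # Print shape and dtype of each dataset referenced by the .xmf file and
>>>>> # flag anything that does not match the declared Dimensions.
>>>>> with h5py.File("out.h5", "r") as f:
>>>>>     for p in paths:
>>>>>         dset = f[p]
>>>>>         status = "OK" if dset.shape == expected else "MISMATCH"
>>>>>         print(p, dset.shape, dset.dtype, status)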
>>>>>
>>>>> Respectfully,
>>>>>
>>>>> Jacob Gerard
>>>>>