[Paraview] Running Paraview on an HPC machine with multiple processes?

Andy Smith agsmith424 at gmail.com
Wed Jul 12 14:48:00 EDT 2017


Jake,

If I start up the remote server in parallel using something like:
mpiexec.hydra -n 8 pvserver -sp=11111

and then connect from my workstation, I am able to read an XDMF file like
the one you posted using ParaView 5.4.0.  Can you post how you are starting
your server and the exact error that you receive?
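
(For reference, from the desktop client I make the connection via File >
Connect to the server's host and port; the same connection can be made
non-interactively with something like

paraview --server-url=cs://<hpc-host>:11111

where <hpc-host> is a placeholder for your login or compute node.)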

I've attached the .xmf file and a Python script to generate dummy data in a
.h5 file that corresponds to your data structure.
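
For anyone reading from the archive where attachments are stripped, a
minimal sketch of such a generator (assuming h5py and numpy; the attached
script may differ in detail) looks like:

import h5py
import numpy as np

# Dimensions must match the Dimensions attributes in the .xmf file.
dims = (91, 19, 19)

with h5py.File("out.h5", "w") as f:
    # Scalar pressure field read by the "pressure" Attribute.
    f.create_dataset("/pres_group/presmag",
                     data=np.random.rand(*dims).astype(np.float32))
    # Velocity components joined by the Function="JOIN($0, $1, $2)" item.
    for name in ("x_velo", "y_velo", "z_velo"):
        f.create_dataset("/velo_group/" + name,
                         data=np.random.rand(*dims).astype(np.float32))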

Note that I am using the xdmf2 reader for my testing.  The xdmf3 reader
fails to read the file (even locally in serial) with the following error:

Type not one of accepted values: INTEGER in XdmfArrayType::New
terminate called after throwing an instance of 'XdmfError'
  what():  Type not one of accepted values: INTEGER in XdmfArrayType::New
Aborted
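
If you do need the xdmf3 reader, the likely culprit is
NumberType="Integer" on the two Geometry DataItems: the XDMF spec spells
this type "Int", which xdmf2 tolerates but xdmf3 rejects. Untested against
your actual data, but changing those two items to

  <DataItem Dimensions="3" NumberType="Int" Format="XML">

should get past the XdmfArrayType::New error.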


-Andy

On Wed, Jul 12, 2017 at 1:31 PM, Jake Gerard <jake.a.gerard at gmail.com>
wrote:

> Alessandro,
>
> ParaView says that it failed to read the attribute data. I just mean that
> when I connect ParaView to the machine, the pressure data does not load if
> I select more than 1 node or more than 1 process per node.
>
> Jake
>
> On Wed, Jul 12, 2017 at 11:00 AM, Alessandro De Maio <demaio.a at gmail.com>
> wrote:
>
>> Hi Jake,
>>      when you talk about running in multi-processing, are you talking
>> about the solver that produces the data, or about running ParaView in
>> MPI mode?
>> What is the error you get?
>>
>> Alessandro
>>
>> On Wed, Jul 12, 2017 at 5:28 PM, Jake Gerard <jake.a.gerard at gmail.com>
>> wrote:
>>
>>> Any help here would be greatly appreciated.
>>>
>>> On Tue, Jul 11, 2017 at 2:37 PM, Jake Gerard <jake.a.gerard at gmail.com>
>>> wrote:
>>>
>>>> Good Afternoon,
>>>>
>>>> I have been moving to an HDF5/XDMF system for analyzing big data from a
>>>> computational fluid model. I finally got all the basic components working
>>>> on my local machine but have run into problems when trying to run on an HPC
>>>> system. Here is the XDMF file:
>>>>
>>>> <?xml version="1.0" ?>
>>>> <!DOCTYPE Xdmf SYSTEM "Xdmf.dtd" []>
>>>> <Xdmf xmlns:xi="http://www.w3.org/2003/XInclude" Version="2.1">
>>>>   <Domain>
>>>>     <Grid Name="my_Grid" GridType="Uniform">
>>>>       <Topology TopologyType="3DCoRectMesh" Dimensions="91 19 19">
>>>>       </Topology>
>>>>       <Geometry GeometryType="Origin_DxDyDz">
>>>>         <DataItem Dimensions="3" NumberType="Integer" Format="XML">
>>>>           0 0 0
>>>>         </DataItem>
>>>>         <DataItem Dimensions="3" NumberType="Integer" Format="XML">
>>>>           1 1 1
>>>>         </DataItem>
>>>>       </Geometry>
>>>>       <Attribute Name="pressure" AttributeType="Scalar" Center="Node">
>>>>         <DataItem Dimensions="91 19 19" NumberType="Float" Format="HDF">
>>>>           out.h5:/pres_group/presmag
>>>>         </DataItem>
>>>>       </Attribute>
>>>>       <Attribute Name="velocity" AttributeType="Vector" Center="Node">
>>>>         <DataItem ItemType="Function" Function="JOIN($0, $1, $2)" Dimensions="91 19 19 3">
>>>>           <DataItem Dimensions="91 19 19" NumberType="Float" Format="HDF">
>>>>             out.h5:/velo_group/x_velo
>>>>           </DataItem>
>>>>           <DataItem Dimensions="91 19 19" NumberType="Float" Format="HDF">
>>>>             out.h5:/velo_group/y_velo
>>>>           </DataItem>
>>>>           <DataItem Dimensions="91 19 19" NumberType="Float" Format="HDF">
>>>>             out.h5:/velo_group/z_velo
>>>>           </DataItem>
>>>>         </DataItem>
>>>>       </Attribute>
>>>>     </Grid>
>>>>   </Domain>
>>>> </Xdmf>
>>>>
>>>> This has worked properly on my local machine. However, I got an error
>>>> about failing to read the pressure data when I tried this on multiple
>>>> processes on the HPC system. The vector data for velocity was fine, but
>>>> the pressure data could not be read. I narrowed the problem down to the
>>>> number of processes, because the pressure data loads fine on the HPC
>>>> machine if I run it on only 1 process. Does anything stick out that
>>>> could be causing this problem? For instance, is there a different
>>>> format for these files when they are read by multiple processes?
>>>>
>>>> Respectfully,
>>>>
>>>> Jacob Gerard