[Paraview] Pvbatch not performing significantly better in parallel

Andy Bauer andy.bauer at kitware.com
Mon Apr 20 15:33:17 EDT 2015


Yes, please attach the state file and if it's small enough, one of the data
files.

On Sun, Apr 19, 2015 at 10:39 AM, Massimiliano Leoni <
leoni.massimiliano1 at gmail.com> wrote:

> Sure, I have a pressure scalar field with a contour filter, a velocity
> vector field with a clip filter and an annotated time.
>
> Do you want me to attach any file?
>
>
> On Sunday, 19 April 2015 10:30:47, Andy Bauer wrote:
> > Can you share your full pipeline? Without the state file it's tough to
> > see exactly what's going on.
> >
> > You'll need to do your own build of ParaView and enable the
> > spatio-temporal plugin if you want to use that.
> >
> >
> > On Sun, Apr 19, 2015 at 5:58 AM, Massimiliano Leoni <
> > leoni.massimiliano1 at gmail.com> wrote:
> > > Hi Andy,
> > >
> > > thanks for your reply. I think my cells are being correctly
> > > partitioned. I attach a screenshot of the grid when I color it by
> > > vtkProcessId.
> > >
> > > The spatio-temporal parallelism could do the trick, but the nightly
> > > builds are unavailable at the moment and I can't find the plugin in my
> > > current installations. I'll try again later.
> > >
> > > Anyway, I think I should get at least some benefit from pure spatial
> > > parallelism, but I am not seeing any.
> > > Is there any setting I might be missing?
> > >
> > > Best regards,
> > > Massimiliano
> > >
> > > On Saturday, 18 April 2015 10:52:10, Andy Bauer wrote:
> > > > Hi Massimiliano,
> > > >
> > > > I don't think the XML unstructured grid reader partitions the data,
> > > > so all of your cells are probably just ending up on process 0.
> > > >
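If the reader really does leave everything on process 0, one common workaround (an assumption on my part, not something confirmed in this thread) is to insert ParaView's D3 filter after the reader so the cells get repartitioned across the MPI ranks. A hedged sketch, with "data.pvd" as a placeholder path; it degrades gracefully when ParaView isn't importable:

```python
# Hedged sketch: repartition an unpartitioned dataset across ranks with
# ParaView's D3 filter before rendering. "data.pvd" is a placeholder.
def load_and_distribute(path):
    try:
        from paraview.simple import OpenDataFile, D3
    except ImportError:
        # ParaView isn't available in this environment (e.g. plain Python)
        return None
    reader = OpenDataFile(path)
    # D3 redistributes cells across all MPI ranks, so each pvbatch
    # process renders only its own piece.
    return D3(Input=reader)

source = load_and_distribute("data.pvd")
```

Downstream filters would then be connected to the D3 output instead of the reader.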
> > > > You may want to look at ParaView's spatio-temporal parallelism (
> > > > http://www.paraview.org/Wiki/Spatio-Temporal_Parallelism). Make sure
> > > > to use a time compartment size of 1 if you're just using the XML
> > > > unstructured grid reader.
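The time-compartment idea can be modeled in plain Python. This is a hypothetical sketch, assuming the ranks are split into groups of `compartment_size` and timesteps are dealt out to the groups round-robin; the plugin's actual scheduling may differ:

```python
# Hypothetical model of spatio-temporal parallelism: nprocs ranks are
# split into groups ("time compartments") of compartment_size ranks,
# and each group renders a disjoint subset of the timesteps in parallel.
def compartment_assignment(nprocs, compartment_size, ntimesteps):
    ngroups = nprocs // compartment_size
    # group g handles timesteps g, g + ngroups, g + 2*ngroups, ...
    return {g: list(range(g, ntimesteps, ngroups)) for g in range(ngroups)}

# With a compartment size of 1, all 16 ranks work on different
# timesteps at once, so frames should appear in blocks of 16.
print(compartment_assignment(16, 1, 32)[0])  # [0, 16]
```

With purely spatial parallelism the picture is the opposite: one compartment of 16 ranks renders every timestep together, one frame at a time.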
> > > >
> > > > Regards,
> > > > Andy
> > > >
> > > > On Sat, Apr 18, 2015 at 6:40 AM, Massimiliano Leoni <
> > > > leoni.massimiliano1 at gmail.com> wrote:
> > > > > Hi everybody,
> > > > >
> > > > > I am trying to run pvbatch in parallel to render an animation, with
> > > > > a very easy script that looks like
> > > > >
> > > > >     import sys
> > > > >     from paraview.simple import *
> > > > >
> > > > >     # read pvsm file from command line and load it
> > > > >     stateFile = sys.argv[1]
> > > > >     simulation = stateFile.split("/")[-1].split(".")[0]
> > > > >     servermanager.LoadState(stateFile)
> > > > >
> > > > >     # set active view and render animation
> > > > >     SetActiveView(GetRenderView())
> > > > >     WriteAnimation(simulation + ".jpg", magnification=2, quality=2)
> > > > >
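As an aside, the name-deriving line `stateFile.split("/")[-1].split(".")[0]` truncates base names containing dots and assumes "/" separators. A more robust stdlib sketch (the file name here is purely illustrative):

```python
import os

# Derive the animation base name from the state-file path.
# os.path handles platform separators, and splitext strips only the
# final extension, so "run.v2.pvsm" keeps its "run.v2" stem.
def simulation_name(state_file):
    return os.path.splitext(os.path.basename(state_file))[0]

print(simulation_name("/tmp/run.v2.pvsm"))  # run.v2
```

Dropping this in for the `simulation = ...` line leaves the rest of the script unchanged.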
> > > > > I compiled ParaView from sources, configuring with
> > > > >
> > > > > cmake -DPARAVIEW_BUILD_QT=OFF -DCMAKE_BUILD_TYPE=Release \
> > > > >       -DBUILD_TESTING=OFF -DPARAVIEW_ENABLE_PYTHON=ON \
> > > > >       -DPARAVIEW_USE_MPI=ON ..
> > > > >
> > > > > and then building all.
> > > > >
> > > > > I am doing a benchmark on 11GB of data distributed over many
> > > > > pvd/vtu files [written by an MPI application in parallel].
> > > > >
> > > > > I copied the data to a tmpfs folder to ensure the execution is not
> > > > > slowed down by disk access.
> > > > >
> > > > > Executing pvbatch on 1 or 16 processors doesn't really seem to
> > > > > change anything.
> > > > > In particular, I was expecting to see the frames appearing in blocks
> > > > > of 16 when running with MPI on 16 procs, but they always appear one
> > > > > at a time at a constant pace, which makes me suspect that the other
> > > > > processes aren't really contributing to the rendering.
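One way to check whether the extra ranks are alive at all is to print each process's rank from inside the pvbatch script. A hedged sketch, assuming ParaView's vtkProcessModule API (GetPartitionId / GetNumberOfLocalPartitions); it falls back to a serial answer when ParaView isn't importable:

```python
# Hedged diagnostic sketch: report which MPI rank this pvbatch process
# is. Under "mpirun -np 16 pvbatch script.py" every rank should print a
# distinct id; seeing only "process 0 of 1" would confirm a serial run.
def rank_and_size():
    try:
        from paraview import servermanager
        pm = servermanager.vtkProcessModule.GetProcessModule()
        return pm.GetPartitionId(), pm.GetNumberOfLocalPartitions()
    except ImportError:
        return 0, 1  # plain Python / ParaView unavailable: serial fallback

rank, size = rank_and_size()
print("process %d of %d" % (rank, size))
```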
> > > > >
> > > > >
> > > > >
> > > > > What could I be doing wrong?
> > > > > Any suggestion is highly appreciated.
> > > > >
> > > > > Best regards,
> > > > >
> > > > > Massimiliano
> > > > >