[Paraview] PV 3.12.0 coprocessing problem

Andy Bauer andy.bauer at kitware.com
Wed Dec 21 15:58:56 EST 2011


I'm attaching a couple of files that you can use to test whether you have the
proper fix on your branch.  Run the python script with "mpirun -np 8
bin/pvbatch -sym parallelpythontest.py".  If you get the same results, you
don't have the fix for the image issue yet.  Note that
parallelpythontest2.png isn't colored properly while parallelpythontest.png
is.
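
For anyone who doesn't want to unpack the attachment, the test amounts to a
pvbatch script along the lines of the sketch below.  This is only an
illustration: the attached parallelpythontest.py is the authoritative
version, and the Wavelet/Contour source, the RTData array, and the output
filenames are stand-ins rather than the actual contents.

# Minimal sketch of a parallel pvbatch rendering test (illustrative only;
# the attached parallelpythontest.py is the real test).  Run with e.g.
#   mpirun -np 8 bin/pvbatch -sym thisscript.py
from paraview.simple import *

# Small source plus a contour, so that some ranks can end up with no
# points or cells after filtering -- the case the fix has to handle.
wavelet = Wavelet()
contour = Contour(Input=wavelet)
contour.ContourBy = ['POINTS', 'RTData']
contour.Isosurfaces = [157.0]

view = CreateRenderView()
rep = Show(contour, view)

# Color by a point array (PV 3.x-era coloring API) so that a coloring
# problem is visible in the saved images.
rep.ColorArrayName = 'RTData'
rep.ColorAttributeType = 'POINT_DATA'
rep.LookupTable = GetLookupTableForArray('RTData', 1)

Render(view)

# The real test writes two images; the same view is written twice here
# simply to exercise WriteImage more than once.  With the bug, one image
# comes out uncolored or the run hangs inside WriteImage on some ranks.
WriteImage('parallelpythontest.png', view)
WriteImage('parallelpythontest2.png', view)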

Andy

On Wed, Dec 21, 2011 at 3:48 PM, Biddiscombe, John A. <biddisco at cscs.ch> wrote:

> I’ll give it a try using master; a very simple python script is attached.
>
> The original contained a lot more filters; this one has most of them
> stripped out and just a contour left. And yes, it is possible that some
> processes have no points.
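>
> Schematically it is just the usual coprocessing callbacks with a single
> Contour left in, roughly like the sketch below.  The attached file is the
> authoritative one; the CreateProducer helper and the 'scalars' array name
> here are placeholders, not the real names.
>
> # Schematic shape of a stripped-down coprocessing script (illustrative
> # only -- see the attached file for the real thing).
> from paraview.simple import *
>
> def RequestDataDescription(datadescription):
>     # Ask the adaptor for the full mesh and all fields on the "input" grid.
>     idd = datadescription.GetInputDescriptionByName("input")
>     idd.AllFieldsOn()
>     idd.GenerateMeshOn()
>
> def DoCoProcessing(datadescription):
>     timestep = datadescription.GetTimeStep()
>
>     # Placeholder for however the script wraps the simulation grid in a
>     # pipeline source (e.g. a trivial-producer helper).
>     grid = CreateProducer(datadescription, "input")
>
>     contour = Contour(Input=grid)
>     contour.ContourBy = ['POINTS', 'scalars']   # placeholder array name
>     contour.Isosurfaces = [0.5]
>     # On some ranks the isosurface can be empty: zero points/cells there.
>
>     view = CreateRenderView()
>     Show(contour, view)
>     Render(view)
>     WriteImage('image_%06d.png' % timestep, view)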
>
> JB
>
> From: Andy Bauer [mailto:andy.bauer at kitware.com]
> Sent: 21 December 2011 19:10
> To: Biddiscombe, John A.
> Cc: paraview at paraview.org
> Subject: Re: [Paraview] PV 3.12.0 coprocessing problem
>
> Hi John,
>
> There were a couple of issues when saving images.  One was for saving
> charts and maybe 2D views.  The other one was for when some processes
> didn't have any points or cells.  Looking at your stack traces I don't
> think it's the latter, since that would fail in the python script and give
> a warning there.  Any chance you could test with ParaView's current
> master branch?
>
> Are you using a python script to drive the coprocessing?  If yes, can you
> share it?
>
> Andy
>
> On Wed, Dec 21, 2011 at 12:15 PM, Biddiscombe, John A. <biddisco at cscs.ch> wrote:
>
> I'm getting lock-ups when saving images using coprocessing. It looks a
> lot like a bug that was fixed many moons ago, but maybe the fix got lost
> in a merge ...
>
> One process makes it here and waits for MPI traffic:
>
> >  vtkParallel.dll!vtkMPICommunicatorReduceData(const void * sendBuffer=0x000000000012c318, void * recvBuffer=0x000000000012c378, __int64 length=3, int type=11, int operation=1476395010, int destProcessId=0, int * comm=0x0000000005d77670)  Line 317  C++
>    vtkParallel.dll!vtkMPICommunicator::ReduceVoidArray(const void * sendBuffer=0x000000000012c318, void * recvBuffer=0x000000000012c378, __int64 length=3, int type=11, int operation=1, int destProcessId=0)  Line 1422 + 0x4c bytes  C++
>    vtkParallel.dll!vtkCommunicator::Reduce(const double * sendBuffer=0x000000000012c318, double * recvBuffer=0x000000000012c378, __int64 length=3, int operation=1, int destProcessId=0)  Line 633  C++
>    vtkParallel.dll!vtkMultiProcessController::Reduce(const double * sendBuffer=0x000000000012c318, double * recvBuffer=0x000000000012c378, __int64 length=3, int operation=1, int destProcessId=0)  Line 811  C++
>    vtkPVClientServerCore.dll!vtkPVSynchronizedRenderWindows::SynchronizeBounds(double * bounds=0x000000000d18a8a8)  Line 1381  C++
>    vtkPVClientServerCore.dll!vtkPVRenderView::GatherBoundsInformation(bool using_distributed_rendering=true)  Line 598  C++
>    vtkPVClientServerCore.dll!vtkPVRenderView::Render(bool interactive=false, bool skip_rendering=false)  Line 882  C++
>    vtkPVClientServerCore.dll!vtkPVRenderView::StillRender()  Line 745  C++
>    vtkPVClientServerCoreCS.dll!vtkPVRenderViewCommand(vtkClientServerInterpreter * arlu=0x0000000005d6be10, vtkObjectBase * ob=0x000000000d18a770, const char * method=0x000000000e1f7ee9, const vtkClientServerStream & msg={...}, vtkClientServerStream & resultStream={...})  Line 258  C++
>    vtkClientServer.dll!vtkClientServerInterpreter::ProcessCommandInvoke(const vtkClientServerStream & css={...}, int midx=0)  Line 379 + 0x2f bytes  C++
>    vtkClientServer.dll!vtkClientServerInterpreter::ProcessOneMessage(const vtkClientServerStream & css={...}, int message=0)  Line 214 + 0x1d bytes  C++
>    vtkClientServer.dll!vtkClientServerInterpreter::ProcessStream(const vtkClientServerStream & css={...})  Line 183 + 0x14 bytes  C++
>    vtkPVServerImplementation.dll!vtkPVSessionCore::ExecuteStreamInternal(const vtkClientServerStream & stream={...}, bool ignore_errors=false)  Line 636  C++
>    vtkPVServerImplementation.dll!vtkPVSessionCore::ExecuteStream(unsigned int location=21, const vtkClientServerStream & stream={...}, bool ignore_errors=false)  Line 606  C++
>    vtkPVServerImplementation.dll!vtkPVSessionBase::ExecuteStream(unsigned int location=21, const vtkClientServerStream & stream={...}, bool ignore_errors=false)  Line 157  C++
>    vtkPVServerManager.dll!vtkSMProxy::ExecuteStream(const vtkClientServerStream & stream={...}, bool ignore_errors=false, unsigned int location=21)  Line 2092  C++
>    vtkPVServerManager.dll!vtkSMViewProxy::StillRender()  Line 137 + 0x18 bytes  C++
>    vtkPVServerManager.dll!`anonymous namespace'::vtkRenderHelper::EventuallyRender()  Line 86  C++
>    vtkPVVTKExtensions.dll!vtkPVGenericRenderWindowInteractor::Render()  Line 302  C++
>    vtkRendering.dll!vtkRenderWindowInteractor::Initialize()  Line 632  C++
>    vtkRendering.dll!vtkRenderWindowInteractor::ReInitialize()  Line 76 + 0x13 bytes  C++
>    vtkRendering.dll!vtkWin32OpenGLRenderWindow::SetOffScreenRendering(int offscreen=0)  Line 1268  C++
>    vtkPVServerManager.dll!vtkSMRenderViewProxy::CaptureWindowInternal(int magnification=1)  Line 875  C++
>    vtkPVServerManager.dll!vtkSMViewProxy::CaptureWindow(int magnification=1)  Line 268 + 0x20 bytes  C++
>    vtkPVServerManager.dll!vtkSMViewProxy::WriteImage(const char * filename=0x000000000b619f50, const char * writerName=0x000000000ac750e0, int magnification=1)  Line 307 + 0x11 bytes  C++
>    vtkPVServerManagerPythonD.dll!PyvtkSMViewProxy_WriteImage(_object * self=0x000000000b62f6d8, _object * args=0x000000000b62b438)  Line 367 + 0x1f bytes  C++
>
> ------------------------------
> but the other N-1 processes end up here, waiting:
>
> >  vtkParallel.dll!vtkMPICommunicator::BroadcastVoidArray(void * data=0x000000000012dd24, __int64 length=1, int type=6, int root=0)  Line 1159 + 0x31 bytes  C++
>    vtkParallel.dll!vtkCommunicator::Broadcast(int * data=0x000000000012dd24, __int64 length=1, int srcProcessId=0)  Line 256  C++
>    vtkParallel.dll!vtkMultiProcessController::Broadcast(int * data=0x000000000012dd24, __int64 length=1, int srcProcessId=0)  Line 402  C++
>    vtkPVServerManager.dll!vtkSMUtilities::SaveImageOnProcessZero(vtkImageData * image=0x000000000e2c79c0, const char * filename=0x000000000b729f50, const char * writerName=0x000000000ac850e0)  Line 139  C++
>    vtkPVServerManager.dll!vtkSMViewProxy::WriteImage(const char * filename=0x000000000b729f50, const char * writerName=0x000000000ac850e0, int magnification=1)  Line 311 + 0x1f bytes  C++
>    vtkPVServerManagerPythonD.dll!PyvtkSMViewProxy_WriteImage(_object * self=0x000000000b73f6d8, _object * args=0x000000000b73b438)  Line 367 + 0x1f bytes  C++
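>
> In other words, rank 0 is still inside a Reduce (gathering bounds for a
> render) while the other ranks have already entered the Broadcast inside
> SaveImageOnProcessZero -- a mismatched collective. A toy illustration of
> that hang pattern, using mpi4py purely for brevity (ParaView itself goes
> through vtkMPICommunicator):
>
> # Toy reproduction of the hang seen in the traces (illustration only).
> from mpi4py import MPI
>
> comm = MPI.COMM_WORLD
> if comm.Get_rank() == 0:
>     # Rank 0 thinks everyone is still rendering: it enters a Reduce,
>     # cf. vtkPVSynchronizedRenderWindows::SynchronizeBounds.
>     comm.reduce(0, op=MPI.SUM, root=0)
> else:
>     # The other ranks have moved on to saving the image: they enter a
>     # Broadcast, cf. vtkSMUtilities::SaveImageOnProcessZero.
>     comm.bcast(None, root=0)
> # With the collectives mismatched like this, both sides typically wait
> # forever for partners that never arrive.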
>
> There was a problem with SaveImageOnProcessZero which caused exactly
> this some time ago. I'm using a branch of mine derived from the v3.12.0
> tag, but I cherry-picked Andy's patch from kitware/master (or wherever it
> was mentioned a few days ago). Are there other fixes on master/next that I
> might be needing?
>
> Any ideas? The pipeline is a contour of some image data on N processes.
> Pretty simple.
>
> Any help appreciated.
>
> thanks
>
> JB
>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: files.tgz
Type: application/x-gzip
Size: 19809 bytes
Desc: not available
URL: <http://www.paraview.org/pipermail/paraview/attachments/20111221/31f44d6b/attachment-0001.bin>

