[Paraview] capability of ParaView, Catalyst in distributed computing environment ...

Andy Bauer andy.bauer at kitware.com
Mon Jun 20 14:03:56 EDT 2016


Hi,

Glad to hear that this is working for you and thanks for sharing how you
did it! This is definitely a corner case that few Catalyst users/developers
will ever care about so I'm glad that the Catalyst API is flexible enough
to handle this.

Best,
Andy

On Sun, Jun 19, 2016 at 4:07 AM, <u.utku.turuncoglu at be.itu.edu.tr> wrote:

> Hi Andy,
>
> I used the first approach and fixed the issue with the following customized
> coprocessorinitializewithpython function. In this case, I converted the
> communicator handle coming from Fortran to a C MPI_Comm using the
> MPI_Comm_f2c call. Now the code works without any problem. Thanks for your
> kind help.
>
> extern "C" void my_coprocessorinitializewithpython_(int *fcomm,
>     const char* pythonScriptName, const char strarr[][255], int *size) {
>   if (pythonScriptName != NULL) {
>     if (!g_coprocessor) {
>       g_coprocessor = vtkCPProcessor::New();
>       MPI_Comm handle = MPI_Comm_f2c(*fcomm);
>       vtkMPICommunicatorOpaqueComm *Comm =
>           new vtkMPICommunicatorOpaqueComm(&handle);
>       g_coprocessor->Initialize(*Comm);
>       vtkSmartPointer<vtkCPPythonScriptPipeline> pipeline =
>           vtkSmartPointer<vtkCPPythonScriptPipeline>::New();
>       pipeline->Initialize(pythonScriptName);
>       g_coprocessor->AddPipeline(pipeline);
>       //pipeline->FastDelete();
>     }
>
>     if (!g_coprocessorData) {
>       g_coprocessorData = vtkCPDataDescription::New();
>       // must be input port for all model components and for all dimensions
>       for (int i = 0; i < *size; i++) {
>         g_coprocessorData->AddInput(strarr[i]);
>         std::cout << "adding input port [" << i << "] = " << strarr[i]
>                   << std::endl;
>       }
>     }
>   }
> }
>
> Regards,
>
> --ufuk
>
> > Hi Ufuk,
> >
> > I can think of two potential fixes:
> >
> >    - Use the vtkCPProcessor::Initialize(vtkMPICommunicatorOpaqueComm& comm)
> >      method to initialize each process with the proper MPI communicator.
> >      Note that vtkMPICommunicatorOpaqueComm is defined in
> >      vtkMPICommunicator.cxx. A similar example is available in the
> >      Examples/Catalyst/MPISubCommunicatorExample directory.
> >    - Call vtkCPProcessor::Initialize() on all processes with your global
> >      communicator, then create a vtkMPIController partitioned the way you
> >      want and set that to be the "global" communicator through
> >      vtkMPIController::SetGlobalController() (see the sketch after this
> >      list).
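> >
> > A minimal sketch of that second option (using your 12-rank layout; the
> > names and the color value are illustrative, not a drop-in implementation):
> >
> >   // #include <mpi.h>, <vtkMPIController.h>, <vtkMultiProcessController.h>
> >   vtkMPIController* globalController = vtkMPIController::New();
> >   globalController->Initialize(NULL, NULL, 1); // MPI initialized elsewhere
> >   int rank;
> >   MPI_Comm_rank(MPI_COMM_WORLD, &rank);
> >   int color = (rank >= 8) ? 1 : 0; // e.g. ranks 8-11 are the adaptor group
> >   // partition the controller, analogous to MPI_Comm_split
> >   vtkMultiProcessController* adaptorController =
> >       globalController->PartitionController(color, 0);
> >   // make the partitioned controller the "global" one so later collective
> >   // VTK/Catalyst operations run only within that group
> >   vtkMultiProcessController::SetGlobalController(adaptorController);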
> >
> > Please let us know if either of these methods work for you.
> >
> > Also, what code are you working on, and is it publicly available? If you
> > show your implementation I may have some in-depth suggestions for
> > improvements.
> >
> > Best,
> >
> > Andy
> >
> >
> >
> > On Fri, Jun 17, 2016 at 4:17 AM, <u.utku.turuncoglu at be.itu.edu.tr> wrote:
> >
> >> Hi All,
> >>
> >> I have been working on the issue recently and I am very close to having
> >> prototype code, but I ran into some difficulties initializing the
> >> co-processing component with the coprocessorinitializewithpython call. In
> >> my case, the two model components and the adaptor each have their own
> >> group of processes (within MPI_COMM_WORLD). For example, MPI ranks 0, 1,
> >> 2, 3 are used by the 1st model, ranks 4, 5, 6, 7 by the 2nd model, and
> >> ranks 8, 9, 10, 11 by the adaptor. The code basically handles transferring
> >> the grid information and data to the adaptor. The problem is that if I
> >> call my custom coprocessorinitializewithpython in the adaptor (only on
> >> ranks 8, 9, 10, 11), it hangs at the g_coprocessor->Initialize() step (see
> >> the code at the end of this mail), but if I call
> >> coprocessorinitializewithpython in the main code that uses all available
> >> processors (ranks 0 to 11), it runs without any problem. It seems that
> >> there is a restriction on the ParaView side (especially in
> >> vtkCPProcessor::Initialize(), which can be found in
> >> CoProcessing/Catalyst/vtkCPProcessor.cxx), but I am not sure. Do you have
> >> any suggestions about that? Do you think it is possible to fix easily? Of
> >> course the adaptor code could use all the processors, but it is better for
> >> the adaptor to have its own dedicated resources, which might have GPU
> >> support on those specific servers or processors. I am relatively new to
> >> VTK, so it might be difficult for me to fix this myself, and I would need
> >> your guidance to get started.
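> >>
> >> For reference, the adaptor group in this layout is what I would split off
> >> from MPI_COMM_WORLD; a rough sketch of that split (variable names and the
> >> color value are just illustrative) is:
> >>
> >>   MPI_Comm adaptorComm;
> >>   int rank;
> >>   MPI_Comm_rank(MPI_COMM_WORLD, &rank);
> >>   // ranks 8-11 form the adaptor group, ranks 0-7 the two model groups
> >>   int color = (rank >= 8) ? 1 : 0;
> >>   MPI_Comm_split(MPI_COMM_WORLD, color, rank, &adaptorComm);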
> >>
> >> Best Regards,
> >>
> >> --ufuk
> >>
> >> vtkCPProcessor* g_coprocessor;
> >>
> >> extern "C" void my_coprocessorinitializewithpython_(
> >>     const char* pythonScriptName, const char strarr[][255], int *size) {
> >>   if (pythonScriptName != NULL) {
> >>     if (!g_coprocessor) {
> >>       g_coprocessor = vtkCPProcessor::New();
> >>       g_coprocessor->Initialize();
> >>       vtkSmartPointer<vtkCPPythonScriptPipeline> pipeline =
> >>           vtkSmartPointer<vtkCPPythonScriptPipeline>::New();
> >>       pipeline->Initialize(pythonScriptName);
> >>       g_coprocessor->AddPipeline(pipeline);
> >>       //pipeline->FastDelete();
> >>     }
> >>
> >>     if (!g_coprocessorData) {
> >>       g_coprocessorData = vtkCPDataDescription::New();
> >>       // must be input port for all model components and for all dimensions
> >>       for (int i = 0; i < *size; i++) {
> >>         g_coprocessorData->AddInput(strarr[i]);
> >>         std::cout << "adding input port [" << i << "] = " << strarr[i]
> >>                   << std::endl;
> >>       }
> >>     }
> >>   }
> >> }