<div dir="ltr">FYI: pvserver will likely be run in a separate MPI job if you're doing a Live connection.<br></div><div class="gmail_extra"><br><div class="gmail_quote">On Sat, Oct 28, 2017 at 11:05 AM, Andy Bauer <span dir="ltr"><<a href="mailto:andy.bauer@kitware.com" target="_blank">andy.bauer@kitware.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div><div><div>Hi,<br><br></div>Catalyst by default uses MPI_COMM_WORLD of the existing MPI library that the simulation code is linked with. You can use another MPI communicator as well. An example of that is in the Examples/Catalyst/<wbr>MPISubCommunicatorExample source directory.<br><br></div>Best,<br></div>Andy<br></div><div class="gmail_extra"><br><div class="gmail_quote"><div><div class="h5">On Sat, Oct 28, 2017 at 7:50 AM, Kolja Petersen <span dir="ltr"><<a href="mailto:petersenkolja@gmail.com" target="_blank">petersenkolja@gmail.com</a>></span> wrote:<br></div></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div><div class="h5"><div dir="ltr">Hello,<br>I am trying to understand a Catalyst implementation detail.<br><br>Because parallel Catalyst may transfer huge data to a parallel pvserver, I thought the Catalyst processes would have themselves added to the pvserver's MPI communicator. However, MPI_Comm_spawn() is the only function that I know of for this task, and I find "MPI_Comm_spawn" nowhere in the code (searched case insensitive).<br><br>I thought that the standard Catalyst TCP port 22222 was only used for control messages between Catalyst and pvserver, and data exchange would go via MPI. But apparently there is no MPI connection between Catalyst and pvserver, and all data are sent via TCP:22222, which could explain observed network bottlenecks.<br><br>Can somebody clarify this implementation detail?<br>Thanks<span class="m_-3131324059908449437HOEnZb"><font color="#888888"><br>Kolja<br></font></span></div>