<html xmlns:o="urn:schemas-microsoft-com:office:office" xmlns:w="urn:schemas-microsoft-com:office:word" xmlns:m="http://schemas.microsoft.com/office/2004/12/omml" xmlns="http://www.w3.org/TR/REC-html40"><head><meta name=Title content=""><meta name=Keywords content=""><meta http-equiv=Content-Type content="text/html; charset=utf-8"><meta name=Generator content="Microsoft Word 15 (filtered medium)"><style><!--
/* Font Definitions */
@font-face
{font-family:Arial;
panose-1:2 11 6 4 2 2 2 2 2 4;}
@font-face
{font-family:"Courier New";
panose-1:2 7 3 9 2 2 5 2 4 4;}
@font-face
{font-family:"Cambria Math";
panose-1:2 4 5 3 5 4 6 3 2 4;}
@font-face
{font-family:Calibri;
panose-1:2 15 5 2 2 2 4 3 2 4;}
/* Style Definitions */
p.MsoNormal, li.MsoNormal, div.MsoNormal
{margin:0in;
margin-bottom:.0001pt;
font-size:12.0pt;
font-family:Calibri;}
a:link, span.MsoHyperlink
{mso-style-priority:99;
color:#0563C1;
text-decoration:underline;}
a:visited, span.MsoHyperlinkFollowed
{mso-style-priority:99;
color:#954F72;
text-decoration:underline;}
pre
{mso-style-priority:99;
mso-style-link:"HTML Preformatted Char";
margin:0in;
margin-bottom:.0001pt;
font-size:10.0pt;
font-family:"Courier New";}
p.MsoListParagraph, li.MsoListParagraph, div.MsoListParagraph
{mso-style-priority:34;
margin-top:0in;
margin-right:0in;
margin-bottom:0in;
margin-left:.5in;
margin-bottom:.0001pt;
font-size:12.0pt;
font-family:Calibri;}
span.EmailStyle18
{mso-style-type:personal;
font-family:Calibri;
color:windowtext;}
span.HTMLPreformattedChar
{mso-style-name:"HTML Preformatted Char";
mso-style-priority:99;
mso-style-link:"HTML Preformatted";
font-family:Courier;}
span.EmailStyle21
{mso-style-type:personal-reply;
font-family:Calibri;
color:windowtext;}
span.msoIns
{mso-style-type:export-only;
mso-style-name:"";
text-decoration:underline;
color:teal;}
.MsoChpDefault
{mso-style-type:export-only;
font-size:10.0pt;}
@page WordSection1
{size:8.5in 11.0in;
margin:1.0in 1.0in 1.0in 1.0in;}
div.WordSection1
{page:WordSection1;}
/* List Definitions */
@list l0
{mso-list-id:66922668;
mso-list-type:hybrid;
mso-list-template-ids:1339055108 67698705 67698713 67698715 67698703 67698713 67698715 67698703 67698713 67698715;}
@list l0:level1
{mso-level-text:"%1\)";
mso-level-tab-stop:none;
mso-level-number-position:left;
text-indent:-.25in;}
@list l0:level2
{mso-level-number-format:alpha-lower;
mso-level-tab-stop:none;
mso-level-number-position:left;
text-indent:-.25in;}
@list l0:level3
{mso-level-number-format:roman-lower;
mso-level-tab-stop:none;
mso-level-number-position:right;
text-indent:-9.0pt;}
@list l0:level4
{mso-level-tab-stop:none;
mso-level-number-position:left;
text-indent:-.25in;}
@list l0:level5
{mso-level-number-format:alpha-lower;
mso-level-tab-stop:none;
mso-level-number-position:left;
text-indent:-.25in;}
@list l0:level6
{mso-level-number-format:roman-lower;
mso-level-tab-stop:none;
mso-level-number-position:right;
text-indent:-9.0pt;}
@list l0:level7
{mso-level-tab-stop:none;
mso-level-number-position:left;
text-indent:-.25in;}
@list l0:level8
{mso-level-number-format:alpha-lower;
mso-level-tab-stop:none;
mso-level-number-position:left;
text-indent:-.25in;}
@list l0:level9
{mso-level-number-format:roman-lower;
mso-level-tab-stop:none;
mso-level-number-position:right;
text-indent:-9.0pt;}
ol
{margin-bottom:0in;}
ul
{margin-bottom:0in;}
--></style></head><body bgcolor=white lang=EN-US link="#0563C1" vlink="#954F72"><div class=WordSection1><p class=MsoNormal><span style='font-size:11.0pt'>Thanks for your reply.<o:p></o:p></span></p><p class=MsoNormal><span style='font-size:11.0pt'>The correct syntax to run on multiple hosts is: mpiexec -hosts N localhost M remotehost 1 appName<o:p></o:p></span></p><p class=MsoNormal><span style='font-size:11.0pt'>I tested it with hostname and it worked; it printed the expected output.<o:p></o:p></span></p><p class=MsoNormal><span style='font-size:11.0pt'><o:p> </o:p></span></p><p class=MsoNormal><span style='font-size:11.0pt'>However, when I execute the same command with pvserver, it does not work. I made sure the paraview/pvserver installation is in the same location on both hosts, and I am providing the full path to pvserver.<o:p></o:p></span></p><p class=MsoNormal><span style='font-size:11.0pt'><o:p> </o:p></span></p><p class=MsoNormal><span style='font-size:11.0pt'>I would like to make sure that I have understood the concept correctly:<o:p></o:p></span></p><p class=MsoNormal><span style='font-size:11.0pt'>If I launch parallel pvservers with MPI from host1 (across host1, host2, hostN), the clients should connect to host1. 
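<o:p></o:p></span></p><p class=MsoNormal><span style='font-size:11.0pt'>For reference, here is a sketch of the whole sequence as I understand it (host names and the install path below are placeholders, and this assumes an MPICH-style mpiexec with passwordless ssh between the hosts):<o:p></o:p></span></p><pre># 1) Sanity check: should print both host names<o:p></o:p></pre><pre>mpiexec -hosts 2 host1 1 host2 1 hostname<o:p></o:p></pre><pre># 2) Launch pvserver across both hosts (same full path on each)<o:p></o:p></pre><pre>mpiexec -hosts 2 host1 1 host2 1 /opt/paraview/bin/pvserver<o:p></o:p></pre><pre># 3) In the ParaView client, connect to host1:11111 (rank 0 listens on port 11111 by default)<o:p></o:p></pre><pre># Alternatively, a reverse connection makes pvserver connect back to a waiting client:<o:p></o:p></pre><pre>mpiexec -hosts 2 host1 1 host2 1 /opt/paraview/bin/pvserver --reverse-connection --client-host=my_workstation<o:p></o:p></pre><p class=MsoNormal><span style='font-size:11.0pt'>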
<o:p></o:p></span></p><p class=MsoNormal><span style='font-size:11.0pt'><o:p> </o:p></span></p><p class=MsoNormal><span style='font-size:11.0pt'>Can you please confirm?<o:p></o:p></span></p><p class=MsoNormal><span style='font-size:11.0pt'><o:p> </o:p></span></p><div style='border:none;border-top:solid #B5C4DF 1.0pt;padding:3.0pt 0in 0in 0in'><p class=MsoNormal><b><span style='color:black'>From: </span></b><span style='color:black'>Patrick Begou <Patrick.Begou@legi.grenoble-inp.fr><br><b>Date: </b>Tuesday, July 25, 2017 at 5:45 PM<br><b>To: </b>Mariam <mbahameish@gmail.com>, <paraview@paraview.org><br><b>Subject: </b>Re: [Paraview] MPI on multiple nodes<o:p></o:p></span></p></div><div><p class=MsoNormal><span style='font-family:"Times New Roman"'><o:p> </o:p></span></p></div><p class=MsoNormal>I think the syntax for mpiexec is:<br><br>mpiexec -np 8 --host host_one,host_two pvserver<br><br>This would launch 4 occurrences of pvserver on host_one and 4 on host_two.<br><br>1) First, check that MPI is running:<br><br>mpiexec -np 2 --host host_one,host_two hostname<br><br>This should print the names of the 2 hosts.<br><br>If it does not work, check that you can ssh from one host to the other without a password.<br><br>2) You need the same paraview/pvserver install on the two hosts, in the same location (not required, but it makes launching easier).<br><br>3) Maybe use a reverse connection from the parallel pvserver to the ParaView client when you launch pvserver.<br><br>Patrick<br><br><br>Mariam wrote: <o:p></o:p></p><blockquote style='margin-top:5.0pt;margin-bottom:5.0pt'><p class=MsoNormal><span style='font-size:11.0pt'>Hi, </span><o:p></o:p></p><p class=MsoNormal><span style='font-size:11.0pt'>I would like to ask about running pvserver in parallel with MPI on multiple hosts. 
Is that possible?</span><o:p></o:p></p><p class=MsoNormal><span style='font-size:11.0pt'> </span><o:p></o:p></p><p class=MsoNormal><span style='font-size:11.0pt'>The following commands were tested successfully:</span><o:p></o:p></p><p class=MsoListParagraph style='text-indent:-.25in;mso-list:l0 level1 lfo2'><![if !supportLists]><span style='mso-list:Ignore'>1)<span style='font:7.0pt "Times New Roman"'> </span></span><![endif]><span dir=LTR></span><span style='font-size:11.0pt'>Running MPI on localhost:</span><o:p></o:p></p><p class=MsoListParagraph><span style='font-size:11.0pt'>mpiexec -n M localhost pvserver</span><o:p></o:p></p><p class=MsoListParagraph style='text-indent:-.25in;mso-list:l0 level1 lfo2'><![if !supportLists]><span style='mso-list:Ignore'>2)<span style='font:7.0pt "Times New Roman"'> </span></span><![endif]><span dir=LTR></span><span style='font-size:11.0pt'>Running MPI on remotehost:</span><o:p></o:p></p><p class=MsoListParagraph><span style='font-size:11.0pt'>mpiexec -host remotehost -n M pvserver</span><o:p></o:p></p><p class=MsoNormal><span style='font-size:11.0pt'> </span><o:p></o:p></p><p class=MsoNormal><span style='font-size:11.0pt'>But I was not able to run: mpiexec -hosts 2 localhost 1 remotehost 1 pvserver</span><o:p></o:p></p><p class=MsoNormal><span style='font-size:11.0pt'>I ran the following to test MPI on multiple hosts and it worked: mpiexec -hosts 2 localhost 1 remotehost 1 hostname</span><o:p></o:p></p><p class=MsoNormal><span style='font-size:11.0pt'> </span><o:p></o:p></p><p class=MsoNormal><span style='font-size:11.0pt'>Is there any pre-configuration that should be done to pvserver? 
I want the clients to connect to the server (the master server), and the jobs to be distributed across multiple nodes.</span><o:p></o:p></p><p class=MsoNormal><span style='font-size:11.0pt'> </span><o:p></o:p></p><p class=MsoNormal><span style='font-size:11.0pt'>Regards,</span><o:p></o:p></p><p class=MsoNormal><span style='font-size:11.0pt'>Mariam</span><o:p></o:p></p></blockquote><p class=MsoNormal><span style='font-family:"Times New Roman"'><br><br><br><o:p></o:p></span></p><pre>-- <o:p></o:p></pre><pre>===================================================================<o:p></o:p></pre><pre>| Equipe M.O.S.T. 
| |<o:p></o:p></pre><pre>| Patrick BEGOU | <a href="mailto:Patrick.Begou@grenoble-inp.fr">mailto:Patrick.Begou@grenoble-inp.fr</a> |<o:p></o:p></pre><pre>| LEGI | |<o:p></o:p></pre><pre>| BP 53 X | Tel 04 76 82 51 35 |<o:p></o:p></pre><pre>| 38041 GRENOBLE CEDEX | Fax 04 76 82 52 71 |<o:p></o:p></pre><pre>===================================================================<o:p></o:p></pre></div></body></html>