[Paraview] How to debug Paraview

Weiguang Guan guanw at rhpcs.mcmaster.ca
Wed Mar 12 10:22:16 EDT 2008


Wow, just like magic, your recipe solves the problem. Thank you so much.

Weiguang

On Wed, 12 Mar 2008, Moreland, Kenneth wrote:

> Actually, I think the answer is no.  If you want to see the image in the GUI window, then the answer is no.  Get rid of the -tdx and -tdy flags.
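[Editor's note: applied to the appfile quoted later in this thread, Ken's fix would mean dropping the tile-display flags from each pvserver line. A sketch of one corrected entry (the long PATH=... setting from the original appfile is elided here for readability only):

```
-h ic-rb1 -np 1 -e DISPLAY=:0 pvserver -rc --client-host=ic-rb18
```

The other three host entries would change the same way.]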
>
> -Ken
>
>> -----Original Message-----
>> From: Weiguang Guan [mailto:guanw at rhpcs.mcmaster.ca]
>> Sent: Wednesday, March 12, 2008 7:54 AM
>> To: Moreland, Kenneth
>> Cc: paraview at paraview.org
>> Subject: RE: [Paraview] How to debug Paraview
>>
>> I think the answer to your question is YES. I am running Paraview on HP's
>> SVA cluster. The paraview client (paraview) is running on one of the
>> display nodes of the cluster.
>>
>> I launch paraview through a script called sva_paraview.sh, which does the
>> job scheduling, allocates nodes, and starts X servers, in addition to
>> mpirun-ing the pvservers.
>>
>> Weiguang
>>
>> On Wed, 12 Mar 2008, Moreland, Kenneth wrote:
>>
>>> Why do you have the -tdx and -tdy flags?  Are you driving a remote
>>> display?
>>>
>>> -Ken
>>>
>>>> -----Original Message-----
>>>> From: Weiguang Guan [mailto:guanw at rhpcs.mcmaster.ca]
>>>> Sent: Wednesday, March 12, 2008 7:22 AM
>>>> To: Moreland, Kenneth
>>>> Cc: paraview at paraview.org
>>>> Subject: Re: [Paraview] How to debug Paraview
>>>>
>>>> Hi Kenneth,
>>>>
>>>> I appreciate your time very much. I attach two screen shots, generated
>>>> using 1 and 4 pvservers respectively.
>>>>
>>>> What I did is:
>>>> (1) On node ic-rb18, "paraview --server=ic-rb1" is issued;
>>>> (2) mpirun -f abc, where file abc has the contents shown below
>>>> -h ic-rb1 -np 1 -e PATH=.:/home/guanw/visualization/software/install/bin:/opt/sva/chromium/bin/Linux:/opt/sharcnet/paraview/bin:/opt/sharcnet/compile/bin:/opt/hptc/bin:/opt/hptc/lsf/top/6.2/linux2.6-glibc2.3-x86_64-slurm/etc:/opt/hptc/lsf/top/6.2/linux2.6-glibc2.3-x86_64-slurm/bin:/usr/kerberos/bin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/X11R6/bin:/opt/hpmpi/bin:/opt/sharcnet/pathscale/current/bin:/opt/sharcnet/pgi/pgi-6.1/linux86-64/6.1/bin:/opt/sharcnet/sq/bin:/opt/sva/bin: -e DISPLAY=:0 pvserver -rc --client-host=ic-rb18 -tdx=1 -tdy=1
>>>> -h ic-rb10 -np 1 -e PATH=.:/home/guanw/visualization/software/install/bin:/opt/sva/chromium/bin/Linux:/opt/sharcnet/paraview/bin:/opt/sharcnet/compile/bin:/opt/hptc/bin:/opt/hptc/lsf/top/6.2/linux2.6-glibc2.3-x86_64-slurm/etc:/opt/hptc/lsf/top/6.2/linux2.6-glibc2.3-x86_64-slurm/bin:/usr/kerberos/bin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/X11R6/bin:/opt/hpmpi/bin:/opt/sharcnet/pathscale/current/bin:/opt/sharcnet/pgi/pgi-6.1/linux86-64/6.1/bin:/opt/sharcnet/sq/bin:/opt/sva/bin: -e DISPLAY=:0 pvserver -rc --client-host=ic-rb18 -tdx=1 -tdy=1
>>>> -h ic-rb11 -np 1 -e PATH=.:/home/guanw/visualization/software/install/bin:/opt/sva/chromium/bin/Linux:/opt/sharcnet/paraview/bin:/opt/sharcnet/compile/bin:/opt/hptc/bin:/opt/hptc/lsf/top/6.2/linux2.6-glibc2.3-x86_64-slurm/etc:/opt/hptc/lsf/top/6.2/linux2.6-glibc2.3-x86_64-slurm/bin:/usr/kerberos/bin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/X11R6/bin:/opt/hpmpi/bin:/opt/sharcnet/pathscale/current/bin:/opt/sharcnet/pgi/pgi-6.1/linux86-64/6.1/bin:/opt/sharcnet/sq/bin:/opt/sva/bin: -e DISPLAY=:0 pvserver -rc --client-host=ic-rb18 -tdx=1 -tdy=1
>>>> -h ic-rb12 -np 1 -e PATH=.:/home/guanw/visualization/software/install/bin:/opt/sva/chromium/bin/Linux:/opt/sharcnet/paraview/bin:/opt/sharcnet/compile/bin:/opt/hptc/bin:/opt/hptc/lsf/top/6.2/linux2.6-glibc2.3-x86_64-slurm/etc:/opt/hptc/lsf/top/6.2/linux2.6-glibc2.3-x86_64-slurm/bin:/usr/kerberos/bin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/X11R6/bin:/opt/hpmpi/bin:/opt/sharcnet/pathscale/current/bin:/opt/sharcnet/pgi/pgi-6.1/linux86-64/6.1/bin:/opt/sharcnet/sq/bin:/opt/sva/bin: -e DISPLAY=:0 pvserver -rc --client-host=ic-rb18 -tdx=1 -tdy=1
>>>>
>>>> I notice that in step (1), paraview --server=ic-rb1, there is a redundancy:
>>>> I pass --server to paraview even though I already use --client-host for
>>>> pvserver. But if I don't use --server for paraview, the connection cannot
>>>> be established.
>>>>
>>>> Weiguang
>>>>
>>>> On Tue, 11 Mar 2008, Moreland, Kenneth wrote:
>>>>
>>>>> Can you post a screen shot?  That might help identify the problem.
>>>>>
>>>>> -Ken
>>>>>
>>>>>
>>>>> On 3/11/08 3:59 PM, "Weiguang Guan" <guanw at rhpcs.mcmaster.ca> wrote:
>>>>>
>>>>> I have tried running paraview outside of mpirun. Although it doesn't solve
>>>>> the problem, at least it gets rid of the error message "MPI Application
>>>>> rank 0 exited before MPI_Init() with status 0".
>>>>>
>>>>> Weiguang
>>>>>
>>>>> On Tue, 11 Mar 2008, Moreland, Kenneth wrote:
>>>>>
>>>>>> Are you running the paraview application in mpirun?  You shouldn't.
>>>>>> The server, pvserver, is an MPI application, but the client, paraview,
>>>>>> is not.
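[Editor's note: Ken's point can be sketched as a launch recipe. The hostnames and process count below are illustrative, not taken from the thread:

```shell
#!/bin/sh
# pvserver is MPI-parallel, so it goes under mpirun;
# the paraview GUI client is serial, so it is started directly.
# The commands are only echoed here, not executed.
SERVER_CMD="mpirun -np 4 pvserver"
CLIENT_CMD="paraview --server=ic-rb1"
echo "$SERVER_CMD"
echo "$CLIENT_CMD"
```

Only the server side gets the MPI launcher; wrapping the client in mpirun produces the MPI_Init warning seen later in this thread.]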
>>>>>>
>>>>>> -Ken
>>>>>>
>>>>>>
>>>>>> On 3/11/08 1:11 PM, "Weiguang Guan" <guanw at rhpcs.mcmaster.ca> wrote:
>>>>>>
>>>>>> Hi everyone,
>>>>>>
>>>>>> I have run into a weird problem with Paraview and cannot get a solution
>>>>>> from the mailing list. Does anyone know how to debug Paraview? Does
>>>>>> Paraview spit out a log file?
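[Editor's note: the thread never answers the log-file question. One generic way to debug an MPI server process, not specific to ParaView, is to run a rank inside gdb in its own xterm and then connect the client as usual; this is a sketch, with the client host taken from later in the thread:

```shell
#!/bin/sh
# Run a single pvserver rank under gdb inside an xterm.
# The command is only echoed here, not executed.
DEBUG_CMD="mpirun -np 1 xterm -e gdb --args pvserver -rc --client-host=ic-rb18"
echo "$DEBUG_CMD"
```

With more ranks, each rank gets its own xterm/gdb pair, which makes per-rank breakpoints possible.]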
>>>>>>
>>>>>> The problem I have is missing polygons in the rendering when paraview runs
>>>>>> in client/server mode. This problem happens even if I run only one pvserver.
>>>>>>
>>>>>> No abnormal message is seen until we exit Paraview: "MPI Application
>>>>>> rank 0 exited before MPI_Init() with status 0"
>>>>>>
>>>>>> Weiguang
>>>>>> _______________________________________________
>>>>>> ParaView mailing list
>>>>>> ParaView at paraview.org
>>>>>> http://www.paraview.org/mailman/listinfo/paraview
