[Paraview] plugin mismatch?

pratik pratik.mallya at gmail.com
Fri Apr 29 14:35:53 EDT 2011


I was just reading the ParaView tutorial. In the section that demonstrates the D3 filter, it seems that it is the missing ghost cells that cause the fractured visualization. Could this also be the cause of the fractured glyphs? If so, why isn't D3 able to solve the problem here?

pratik
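To illustrate the ghost-cell point: here is a toy Python sketch (hypothetical code, not the ParaView/VTK API) of why a filter that needs cell neighbourhoods "fractures" at partition boundaries unless each partition carries a layer of ghost points.

```python
# Toy sketch (not ParaView API): a 1-D mesh of points is split across
# ranks; a filter that needs each cell's two endpoints loses the cells
# that straddle a partition boundary unless each partition also carries
# one layer of ghost points borrowed from its neighbours.

def partition(points, nranks, nghost=0):
    """Split `points` into `nranks` contiguous pieces, each padded with
    `nghost` extra points from the neighbouring pieces."""
    size = len(points) // nranks
    pieces = []
    for r in range(nranks):
        lo = max(0, r * size - nghost)
        hi = min(len(points), (r + 1) * size + nghost)
        pieces.append(points[lo:hi])
    return pieces

def cells(piece):
    """Cells a rank can build locally: consecutive point pairs."""
    return [(piece[i], piece[i + 1]) for i in range(len(piece) - 1)]

def total(pieces):
    return sum(len(cells(p)) for p in pieces)

points = list(range(8))   # the full mesh has 7 cells

# Without ghost points, the boundary cell (3,4) belongs to no rank:
assert total(partition(points, 2, nghost=0)) == 6   # one cell lost
# With one ghost layer, every cell is built somewhere (the boundary
# cell is built twice, once per side):
assert total(partition(points, 2, nghost=1)) == 8
```

The "lost" boundary cells are exactly the kind of gap that shows up visually as fractures between the pieces each process renders.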
On Friday 29 April 2011 12:34 PM, pratik wrote:
> Ok, one last mail before I go into the code of the plugin:
> 1) Attached is an image which somewhat clearly shows what I was talking 
> about in the previous mail. This kind of "division of the plane" does 
> not occur when I run ParaView locally.
> 2) Using the D3 filter with "duplicate cells" *and* "minimal memory" 
> checked gives the right tensors for "tensors8.vtk", but not for larger 
> data. This option does keep every MPI-spawned process from being 
> starved of data, i.e. the data seems to be distributed properly. Yet 
> the broken glyphs suggest that something is preventing proper 
> reassembly of the tensor glyphs. Also, the tensors I generated were 
> read from an HDF5 file, whereas "tensors8.vtk" is a different 
> format... this may or may not be a factor.
>
> pratik
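The "duplicate cells" behaviour mentioned above can be sketched in plain Python (an illustration only, not D3's implementation): the difference between assigning each boundary-straddling cell to a single partition and duplicating it into every partition it touches.

```python
# Illustrative sketch (not vtkDistributedDataFilter/D3 code): cells on a
# 1-D line split at x = 0.5, under two boundary policies.

CELLS = [(0.0, 0.2), (0.2, 0.45), (0.45, 0.55), (0.55, 0.8), (0.8, 1.0)]
CUT = 0.5

def assign_uniquely(cells):
    # Each cell goes to the single side holding its centroid.
    left  = [c for c in cells if (c[0] + c[1]) / 2 <  CUT]
    right = [c for c in cells if (c[0] + c[1]) / 2 >= CUT]
    return left, right

def duplicate_cells(cells):
    # A cell straddling the cut lands in BOTH partitions, so neither
    # side sees a hole at the boundary.
    left  = [c for c in cells if c[0] < CUT]
    right = [c for c in cells if c[1] > CUT]
    return left, right

l, r = assign_uniquely(CELLS)
assert len(l) + len(r) == 5                         # every cell owned once
dl, dr = duplicate_cells(CELLS)
assert (0.45, 0.55) in dl and (0.45, 0.55) in dr    # boundary cell on both sides
assert len(dl) + len(dr) == 6
```

With duplication, a per-piece filter like the tensor glyph never sees a hole at the cut, which fits the observation that this setting fixed the small dataset.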
>
> On Friday 29 April 2011 03:40 AM, pratik wrote:
>> I also had a small doubt... when I initially read in my data, I can 
>> see the plane, coloured by a scalar. When I zoom in and out, I can 
>> clearly see the plane "divided" into 8 equal parts before it renders 
>> completely about a second later. I am assuming that this is the way 
>> in which ParaView breaks the data up between the various processes (I 
>> can observe this even before I use the D3 filter). If that is the 
>> case, then there should be no need for D3... since the data is 
>> already distributed.
>> Please do correct me if I am wrong here.
>>
>> pratik
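The 8-way division described above is consistent with the server splitting a structured extent evenly among its 8 processes. A rough sketch of such an even split (plain Python; the function is hypothetical, not vtkExtentTranslator's actual algorithm):

```python
# Hypothetical sketch: splitting a 256-row structured grid evenly among
# 8 processes along one axis, which would explain the plane appearing
# "divided into 8 equal parts" while it renders.

def split_extent(nrows, nprocs):
    """Return (start_row, end_row) half-open ranges, one per process."""
    base, extra = divmod(nrows, nprocs)
    pieces, start = [], 0
    for p in range(nprocs):
        size = base + (1 if p < extra else 0)  # spread any remainder
        pieces.append((start, start + size))
        start += size
    return pieces

pieces = split_extent(256, 8)
assert len(pieces) == 8
assert all(b - a == 32 for a, b in pieces)   # 8 equal 32-row slabs
assert pieces[0] == (0, 32) and pieces[-1] == (224, 256)
```

Note this only shows that a reader can hand each process its own slab of a structured grid; it says nothing about the ghost/boundary cells between slabs, which is where D3's options still matter.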
>> On Friday 29 April 2011 03:10 AM, pratik wrote:
>>> Ok. So it does seem like data is being distributed properly by the 
>>> D3 filter, because on fresh runs none of the spawned processes is 
>>> starved of memory... however, the glyphs that I get for large 
>>> datasets are still broken... now I really don't know why... perhaps 
>>> the data is not being reassembled properly?
>>>
>>> pratik
>>> On Thursday 28 April 2011 10:47 PM, pratik wrote:
>>>> Also, it turns out that for the larger dataset that I am 
>>>> visualizing (a 256x256 grid of tensors), even the previous 
>>>> technique does not give unbroken glyphs...
>>>> On Thursday 28 April 2011 10:35 PM, pratik wrote:
>>>>> I spoke too soon... the MPI log shows that some processes are 
>>>>> still being starved of data:
>>>>> MPI: libxmpi.so 'SGI MPT 1.23  03/28/09 11:45:59'
>>>>> MPI: libmpi.so  'SGI MPT 1.23  03/28/09 11:43:39'
>>>>>     MPI Environmental Settings
>>>>> MPI: MPI_DSM_DISTRIBUTE (default: not set) : 1
>>>>> MPI: The default size of the mapped stack area is 780 MBytes.  The current stack limit (unlimited) is greater than this size.  To specify a new size (in bytes) of the mapped stack, set the MPI_MAPPED_STACK_SIZE environment variable.
>>>>>
>>>>> Host: r2i2n2, CPUs: 16, TotPhysMem 3048401 pages, PhysMemPerCPU 190525 pages
>>>>> Memmap mmap size: 4621264 pages (18928697344 bytes) mapped ranks 8
>>>>> Memmap_init complete. shm base=0x2b129e733000, sym_static=1
>>>>> RANK:0 sbrk        base=0x64f000, pagesize=0x1000 (4096)
>>>>> RANK:0 static/heap base=0x506000, top=0x2ed43000, len=190525 pages
>>>>> RANK:0 stack       base=0x7fff1d7b2000, top=0x7fff4bfef000, len=190525 pages
>>>>> RANK:0 mpibuffer   base=0x0, top=0x0, len=0 pages
>>>>> RANK:0 symheap     base=0x2b126e72b000, top=0x2b129e72b000, len=196608 pages
>>>>> [RANK:1 through RANK:7 report the same layout as RANK:0]
>>>>> Connected to client
>>>>> Process id: 2 >> ERROR: In /home/pratikm/source/ParaView/ParaView-3.10.1/VTK/Graphics/vtkTensorGlyph.cxx, line 131
>>>>> vtkTensorGlyph2 (0x4706a70): No data to glyph!
>>>>>
>>>>> Process id: 3 >> ERROR: In /home/pratikm/source/ParaView/ParaView-3.10.1/VTK/Graphics/vtkTensorGlyph.cxx, line 131
>>>>> vtkTensorGlyph2 (0x47029a0): No data to glyph!
>>>>>
>>>>>
>>>>> pratik
>>>>>
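For context on the "No data to glyph!" errors above: a plain-Python mimic (not VTK's actual code) of the kind of early guard that fires when a process's piece of the input contains no points, which is what ranks 2 and 3 hit.

```python
# Mimic of the guard behind "No data to glyph!" (hypothetical Python,
# standing in for the check near line 131 of vtkTensorGlyph.cxx): the
# filter bails out when its piece of the input has zero points.

class NoDataToGlyph(RuntimeError):
    pass

def tensor_glyph(points, tensors):
    if len(points) < 1:
        raise NoDataToGlyph("No data to glyph!")
    # ... the real filter would scale/orient a source glyph per tensor ...
    return [(p, t) for p, t in zip(points, tensors)]

# A rank that received data succeeds; a starved rank raises the error.
assert tensor_glyph([(0, 0, 0)], [[1, 0, 0, 0, 1, 0, 0, 0, 1]])
try:
    tensor_glyph([], [])
    raised = False
except NoDataToGlyph:
    raised = True
assert raised
```

So the error itself is just a symptom: those ranks were handed an empty piece, pointing back at how the data was distributed rather than at the glyphing code.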
>>>>> On Thursday 28 April 2011 10:30 PM, pratik wrote:
>>>>>> But hey, here's something... I checked "use minimal memory" and 
>>>>>> it WORKED!
>>>>>> I wonder how that happened? Anyway, I'm glad that it worked :)
>>>>>> Thanks Sven and Utkarsh for all the help!
>>>>>>
>>>>>> pratik
>>>>>> On Thursday 28 April 2011 10:27 PM, pratik wrote:
>>>>>>> Just tried it... but I am getting the same result :( .
>>>>>>> I first loaded the data, then applied the D3 filter, and then 
>>>>>>> applied the tensor glyph... am I doing something wrong here?
>>>>>>>
>>>>>>> pratik
>>>>>>> On Thursday 28 April 2011 10:20 PM, pratik wrote:
>>>>>>>> Thanks Sven! I'll try that out and post the results.
>>>>>>>> Btw, that would be a nice thing to add to the wiki, if it isn't 
>>>>>>>> there already :)
>>>>>>>>
>>>>>>>> pratik
>>>>>>>> On Thursday 28 April 2011 10:17 PM, Sven Buijssen wrote:
>>>>>>>>> Hi,
>>>>>>>>>
>>>>>>>>> Being the one who uploaded the plugin (having added mere 
>>>>>>>>> frosting to the servermanager XML that got posted here in 
>>>>>>>>> 2009), I'd like to add the following:
>>>>>>>>>
>>>>>>>>> The plugin is merely a convenient way to use vtkTensorGlyph.cxx 
>>>>>>>>> from within ParaView; it does not add any additional features. 
>>>>>>>>> As such it does not provide any magic to support 
>>>>>>>>> multi-core/multiple pvserver processes.
>>>>>>>>>
>>>>>>>>> When using the input file tensors8.vtk shipped with the plugin 
>>>>>>>>> in a multi-core environment, you have to use the D3 filter 
>>>>>>>>> first to distribute the data. You will notice that the glyphs 
>>>>>>>>> are already wrong even for -np 2.
>>>>>>>>>
>>>>>>>>> Sven
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> pratik wrote, On 04/28/11 18:41:
>>>>>>>>>> I am quite sure now that the issue has to do with some of the 
>>>>>>>>>> processes spawned by MPI not being able to do their jobs; I 
>>>>>>>>>> already posted the verbose output of the run previously, and I 
>>>>>>>>>> will now attach an image that clearly shows that parts of the 
>>>>>>>>>> glyphs are not being rendered. Can any of the ParaView 
>>>>>>>>>> developers working with MPI please have a look at why this 
>>>>>>>>>> seems to be happening?
>>>>>>>>>> I am now looking through the code of the plugin to see if 
>>>>>>>>>> that is where the problem is... If I do find out, I'll 
>>>>>>>>>> definitely post it here.
>>>>>>>>>>
>>>>>>>>>> Any help on this would be really appreciated.
>>>>>>>>>>
>>>>>>>>>> pratik
>>>>>>>>>> On Thursday 28 April 2011 06:37 PM, pratik wrote:
>>>>>>>>>>> Great idea. Also, thanks very much for helping me, Utkarsh.
>>>>>>>>>>>
>>>>>>>>>>> Analysis:
>>>>>>>>>>> 1) works perfectly on the client (builtin mode)
>>>>>>>>>>> 2) started pvserver on the head node with -np 1 (for MPI): no 
>>>>>>>>>>> problem
>>>>>>>>>>> 3) works on the head node up to -np 3... at -np 4 the glyphs 
>>>>>>>>>>> become fractured.
>>>>>>>>>>>
>>>>>>>>>>> Now, the other filters that I have used have worked 
>>>>>>>>>>> perfectly... this may be something to do with the plugin, as 
>>>>>>>>>>> you suggested.
>>>>>>>>>>>
>>>>>>>>>>> pratik
>>>>>>>>>>>
>>>>>>>>>>> On Thursday 28 April 2011 06:19 PM, Utkarsh Ayachit wrote:
>>>>>>>>>>>> Ok, first things first. This is a community-contributed 
>>>>>>>>>>>> plugin, so I have no idea what it does or doesn't do. To 
>>>>>>>>>>>> identify the issue, I always tend to start simple. Run 
>>>>>>>>>>>> without connecting to the server at all: in builtin mode, 
>>>>>>>>>>>> does the plugin work? Next, try with 1 pvserver process 
>>>>>>>>>>>> without MPI. Then 2 pvserver processes, and so on.
>>>>>>>>>>>>
>>>>>>>>>>>> Utkarsh
>>>>>>>>>>>>
>>>>>>>>>>>> On Thu, Apr 28, 2011 at 8:46 AM, 
>>>>>>>>>>>> pratik<pratik.mallya at gmail.com>   wrote:
>>>>>>>>>>>>> I ran mpirun with the verbose option on the server and got 
>>>>>>>>>>>>> this as output:
>>>>>>>>>>>>> MPI: libxmpi.so 'SGI MPT 1.23  03/28/09 11:45:59'
>>>>>>>>>>>>> MPI: libmpi.so  'SGI MPT 1.23  03/28/09 11:43:39'
>>>>>>>>>>>>>      MPI Environmental Settings
>>>>>>>>>>>>> MPI: MPI_DSM_DISTRIBUTE (default: not set) : 1
>>>>>>>>>>>>> MPI: The default size of the mapped stack area is 780 MBytes.  The current stack limit (unlimited) is greater than this size.  To specify a new size (in bytes) of the mapped stack, set the MPI_MAPPED_STACK_SIZE environment variable.
>>>>>>>>>>>>>
>>>>>>>>>>>>> Host: r2i1n0, CPUs: 16, TotPhysMem 3048401 pages, PhysMemPerCPU 190525 pages
>>>>>>>>>>>>> Memmap mmap size: 4621264 pages (18928697344 bytes) mapped ranks 8
>>>>>>>>>>>>> Memmap_init complete. shm base=0x2b70b120b000, sym_static=1
>>>>>>>>>>>>> RANK:0 sbrk        base=0x64f000, pagesize=0x1000 (4096)
>>>>>>>>>>>>> RANK:0 static/heap base=0x506000, top=0x2ed43000, len=190525 pages
>>>>>>>>>>>>> RANK:0 stack       base=0x7fff0acd9000, top=0x7fff39516000, len=190525 pages
>>>>>>>>>>>>> RANK:0 mpibuffer   base=0x0, top=0x0, len=0 pages
>>>>>>>>>>>>> RANK:0 symheap     base=0x2b7081203000, top=0x2b70b1203000, len=196608 pages
>>>>>>>>>>>>> [RANK:1 through RANK:7 report the same layout as RANK:0]
>>>>>>>>>>>>> Connected to client
>>>>>>>>>>>>> Process id: 3 >> ERROR: In /home/pratikm/source/ParaView/ParaView-3.10.1/VTK/Graphics/vtkTensorGlyph.cxx, line 131
>>>>>>>>>>>>> vtkTensorGlyph2 (0x4656cd0): No data to glyph!
>>>>>>>>>>>>>
>>>>>>>>>>>>> Process id: 7 >> ERROR: In /home/pratikm/source/ParaView/ParaView-3.10.1/VTK/Graphics/vtkTensorGlyph.cxx, line 131
>>>>>>>>>>>>> vtkTensorGlyph2 (0x4651510): No data to glyph!
>>>>>>>>>>>>>
>>>>>>>>>>>>> Client connection closed.
>>>>>>>>>>>>>
>>>>>>>>>>>>> How can there be no data to glyph? Is there some 
>>>>>>>>>>>>> communication problem causing this? I think that the two 
>>>>>>>>>>>>> processes mentioned here may not be doing their part of the 
>>>>>>>>>>>>> processing... what do you think?
>>>>>>>>>>>>>
>>>>>>>>>>>>> pratik
>>>>>>>>>>>>> On Thursday 28 April 2011 06:09 PM, pratik wrote:
>>>>>>>>>>>>>> Yes... it is still broken :( :(
>>>>>>>>>>>>>> Have you ever had the situation where ParaView at both 
>>>>>>>>>>>>>> ends was compiled with different compilers? Is it an 
>>>>>>>>>>>>>> X-related problem?
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> pratik
>>>>>>>>>>>>>> On Thursday 28 April 2011 06:01 PM, Utkarsh Ayachit wrote:
>>>>>>>>>>>>>>> Not sure; I was able to play with this plugin just fine. 
>>>>>>>>>>>>>>> Try this:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Go to the "Edit | Settings" dialog. Open up the "Render 
>>>>>>>>>>>>>>> View" pages and go to the "Server" page. There, uncheck 
>>>>>>>>>>>>>>> the remote-render threshold. Are things still broken?
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Utkarsh
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Thu, Apr 28, 2011 at 8:21 AM, 
>>>>>>>>>>>>>>> pratik<pratik.mallya at gmail.com>     wrote:
>>>>>>>>>>>>>>>> Sure... here it is.
>>>>>>>>>>>>>>>> This is a test file from the plugin folder called 
>>>>>>>>>>>>>>>> "tensors8.vtk". The glyphs are fractured... that is the 
>>>>>>>>>>>>>>>> problem. I tried it on the client only and it works fine.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> pratik
>>>>>>>>>>>>>>>> On Thursday 28 April 2011 05:41 PM, Utkarsh Ayachit wrote:
>>>>>>>>>>>>>>>>> Can you post an image?
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Utkarsh
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> On Thu, Apr 28, 2011 at 8:08 AM, 
>>>>>>>>>>>>>>>>> pratik<pratik.mallya at gmail.com>
>>>>>>>>>>>>>>>>>    wrote:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> I did just as you told me, and the two plugins loaded 
>>>>>>>>>>>>>>>>>> properly. However, the plugin is still not working 
>>>>>>>>>>>>>>>>>> correctly; the tensors are appearing fractured... What 
>>>>>>>>>>>>>>>>>> may be the source of the error now?
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> pratik
>>>>>>>>>>>>>>>>>> On Wednesday 27 April 2011 05:54 PM, Utkarsh Ayachit 
>>>>>>>>>>>>>>>>>> wrote:
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> Yes, and then load the right plugin on the right 
>>>>>>>>>>>>>>>>>>> process from the "Manage Plugins" dialog. From the 
>>>>>>>>>>>>>>>>>>> error, it looks like you are trying to load the 
>>>>>>>>>>>>>>>>>>> icc-built plugin on the gcc-built client.
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> Utkarsh
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> On Wed, Apr 27, 2011 at 6:59 AM, 
>>>>>>>>>>>>>>>>>>> pratik<pratik.mallya at gmail.com>
>>>>>>>>>>>>>>>>>>>    wrote:
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> Hi Utkarsh,
>>>>>>>>>>>>>>>>>>>> I compiled the plugin on BOTH the server and my 
>>>>>>>>>>>>>>>>>>>> laptop, using the *respective* configurations (i.e. 
>>>>>>>>>>>>>>>>>>>> the server plugin with the server ParaView settings 
>>>>>>>>>>>>>>>>>>>> (icc compiler), the client plugin with the client 
>>>>>>>>>>>>>>>>>>>> ParaView settings (gcc compiler)). As you had asked 
>>>>>>>>>>>>>>>>>>>> earlier, I did compile ParaView on the cluster from 
>>>>>>>>>>>>>>>>>>>> source (which made the ParaViewConfig.cmake file 
>>>>>>>>>>>>>>>>>>>> available) and used it to compile the plugin on the 
>>>>>>>>>>>>>>>>>>>> server.
>>>>>>>>>>>>>>>>>>>> Is this what you are asking for?
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> pratik
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> On Wednesday 27 April 2011 04:18 PM, Utkarsh 
>>>>>>>>>>>>>>>>>>>> Ayachit wrote:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> You need plugins compiled with both compilers: load 
>>>>>>>>>>>>>>>>>>>>> the icc plugin on the server and the gcc plugin on 
>>>>>>>>>>>>>>>>>>>>> the client. You cannot interchange them or load the 
>>>>>>>>>>>>>>>>>>>>> same one on both sides.
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> Utkarsh
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> Sent from my iPad
>>>>>>>>>>>>>>>>>>>>>
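The compiler-tagged compatibility check described above can be sketched in plain Python. The signature format is taken from the error message quoted at the bottom of this thread; the functions themselves are hypothetical, not ParaView's actual loader code.

```python
# Sketch of the check behind "Load Error, Mismatch in version": the
# loader compares a compiler-tagged signature string between the host
# (client or server) and the plugin. Hypothetical code.

def signature(compiler, version):
    return "paraviewplugin|%s|%s" % (compiler, version)

def can_load(host_sig, plugin_sig):
    # Load only if compiler AND version tags match exactly.
    return host_sig == plugin_sig

gcc_client = signature("GNU", "3.10")
icc_server = signature("Intel", "3.10")

assert can_load(icc_server, signature("Intel", "3.10"))  # icc plugin on icc server: OK
assert can_load(gcc_client, signature("GNU", "3.10"))    # gcc plugin on gcc client: OK
assert not can_load(gcc_client, signature("Intel", "3.10"))  # the reported mismatch
```

This is why the same plugin binary cannot serve both sides when the client and server ParaView builds used different compilers.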
>>>>>>>>>>>>>>>>>>>>> On Apr 27, 2011, at 12:03 AM, 
>>>>>>>>>>>>>>>>>>>>> pratik<pratik.mallya at gmail.com>
>>>>>>>>>>>>>>>>>>>>>    wrote:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> I compiled ParaView on my laptop using gcc but on 
>>>>>>>>>>>>>>>>>>>>>> the cluster using icc. When I try to use the 
>>>>>>>>>>>>>>>>>>>>>> TensorglyphFilter plugin, the following error 
>>>>>>>>>>>>>>>>>>>>>> shows:
>>>>>>>>>>>>>>>>>>>>>> Load Error, Mismatch in version:
>>>>>>>>>>>>>>>>>>>>>> ParaView signature: paraviewplugin|GNU|3.10
>>>>>>>>>>>>>>>>>>>>>> Plugin signature: paraviewplugin|Intel|3.10
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> I also tried compiling both plugins using the 
>>>>>>>>>>>>>>>>>>>>>> gcc/g++ compiler, but the same error persists. 
>>>>>>>>>>>>>>>>>>>>>> What may be the problem?
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> pratik
>>>>>>>>>>>>>>>>>>>>>> _______________________________________________
>>>>>>>>>>>>>>>>>>>>>> Powered by www.kitware.com
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> Visit other Kitware open-source projects at
>>>>>>>>>>>>>>>>>>>>>> http://www.kitware.com/opensource/opensource.html
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> Please keep messages on-topic and check the 
>>>>>>>>>>>>>>>>>>>>>> ParaView Wiki at:
>>>>>>>>>>>>>>>>>>>>>> http://paraview.org/Wiki/ParaView
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> Follow this link to subscribe/unsubscribe:
>>>>>>>>>>>>>>>>>>>>>> http://www.paraview.org/mailman/listinfo/paraview
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>


