[vtkusers] large data sets and memory

Jon Johansson jon.johansson at ualberta.ca
Fri Nov 23 13:39:14 EST 2007


Hi Suzanne,

A graphics card won't give your computer more usable RAM - the 
memory is included on the video card to speed up graphics
calculations done by the GPU. Putting a GPU and memory on
the graphics card removes traffic from the system buses,
and allows the hardware to be tuned to graphics calculations.
Remember that graphics processing is efficient if the processing
looks like
   CPU -> video card -> monitor
With data going one way through a system bus the load is at a minimum.
If you have a lower-end graphics card it may actually be grabbing
some of the system memory and making it unavailable to the CPU.
In this case the video card isn't doing all the graphics work,
so the CPU must take up the slack: your system bus carries a greater
load, the CPU isn't available for other work, and you don't have
all the system memory available that you think you do.

Having said that, if you run Linux with a 2.6 kernel and you
don't mind compiling your own kernel, you can use your
video card as swap:

  http://gentoo-wiki.com/TIP_Use_memory_on_video_card_as_swap
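The trick on that page works roughly as follows (a sketch only, not a recipe: the physical address and size of the card's memory aperture vary by card, so the 0xe8000000 address and 64Mi size below are placeholders you would look up yourself, and everything here needs root and a kernel built with phram/mtdblock support):

```shell
# Find the video card's memory aperture (address and size) first:
lspci -v | grep -A5 VGA

# Expose a region of that aperture as a fake memory-technology
# device, /dev/mtdblock0. ADDRESS/SIZE below are placeholders.
modprobe phram phram=vram,0xe8000000,64Mi
modprobe mtdblock

# Format the block device as swap and enable it at low priority,
# so the kernel prefers any disk-backed swap you already have:
mkswap /dev/mtdblock0
swapon -p 0 /dev/mtdblock0
```

Be careful not to map memory the X server is actively using for the framebuffer, or you'll corrupt the display.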

Jon.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
~~   Jon I Johansson, Ph.D.              * Tel.: (780) 492-9304   ~~
~~   jon.johansson at ualberta.ca           * Fax.: (780) 492-1729   ~~
~~                                       * Office: G.S.B. 323C    ~~
~~   Programmer/Analyst          http://sciviz.aict.ualberta.ca   ~~
~~                                                                ~~
~~   Research Computing Support                                   ~~
~~   Room 352, General Services Building                          ~~
~~   Academic Information and Communication Technologies (AICT)   ~~
~~   University of Alberta                                        ~~
~~   Edmonton, Alberta, CANADA, T6G 2H1                           ~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
! "This  communication is intended for the use of the recipient to !
! which  it  is addressed, and may contain confidential, personal, !
! and/or  privileged  information.  Please contact  us immediately !
! if you are not the  intended  recipient  of  this communication. !
! If you are not the  intended  recipient  of  this communication, !
! do  not  copy,  distribute,   or   take   action  on   it.   Any !
! communication  received  in  error,  or subsequent reply, should !
! be deleted or destroyed."                                        !
!------------------------------------------------------------------!


-----Original Message-----
From: Suzanne Little [mailto:Suzanne.Little at ibai-institut.de] 
Sent: Friday, November 23, 2007 12:40 AM
To: 'vtkusers'
Cc: L.J.vanRuijven at amc.uva.nl; jon.johansson at ualberta.ca
Subject: Re: [vtkusers] large data sets and memory

Thanks Jon, XViz, Leo, others. That's quite helpful.

Leo: You mentioned having problems when your dataset was between 100 and 
200MB. Is this due to having multiple copies of the data generated 
during the pipeline? So the aggregate memory in use by VTK reaches 2GB?

Am I right in thinking that using a specific accelerated graphics card 
will only improve the execution speed and responsiveness but doesn't 
change the memory issues? Or will on-card memory be added to Windows' 
2GB process memory?

Thanks,
Suzanne

Jon Johansson wrote:
> I'm not sure this actually belongs on the VTK Wiki, but I've 
> seen the question come up before so I've added an entry at:
> 
> http://www.vtk.org/Wiki/VTK_FAQ#How_can_a_user_process_access_more_than_2_GB_of_ram_in_32-bit_Windows.3F
> 
> I hope this helps,
> Jon.
> 
> 
> From: vtkusers-bounces+jon.johansson=ualberta.ca at vtk.org
> [mailto:vtkusers-bounces+jon.johansson=ualberta.ca at vtk.org] On Behalf Of
> XViz
> Sent: Wednesday, November 21, 2007 8:30 AM
> To: vtkusers
> Subject: Re: [vtkusers] large data sets and memory
> 
> Hello Thomas,
>  
> Your problem has been discussed many times on this mailing list. One process
> in Windows x86 (NT-based, 32-bit) cannot use more than 2 GB of memory, and
> the system itself no more than 4 GB (usually even less - around 3.2 GB).
> One possible solution I know of is using Windows x64 or some Linux. The
> other is reducing the size of your dataset. :(
>  
>  
> XViz, D_E at ukr.net
> 2007-11-21 
> ----- Original message ----- 
> From: thomas jetzfellner 
> To: vtkusers 
> Date: 2007-11-21, 16:17:07
> Subject: [vtkusers] large data sets and memory
> 
> hi,
> 
> this is the first time I write to the list, and I think my question 
> could sound a little bit "stupid". my current problem is that when I load 
> large datasets there is a problem with the memory allocation. my 
> program crashes on the update call of the vtkImageData. I tried it in 
> different ways and none worked on large data sets. small data is 
> processed correctly. attached you find some code snippets; on the update 
> call my program dies.
>  
>   vtkImageData* volume = vtkImageData::New();
>   volume->SetInput( centerImage->GetOutput() ) ;
>   volume->Update();
>  
>   vtkImageDataStreamer *ids = vtkImageDataStreamer::New();
>   ids->SetInputConnection(centerImage->GetOutputPort());
>   ids->SetNumberOfStreamDivisions(200);
>   ids->UpdateInformation();
>   ids->GetExtentTranslator()->SetSplitModeToBlock();
>   ids->Update();
> 
>   vtkMemoryLimitImageDataStreamer* mlds = vtkMemoryLimitImageDataStreamer::New();
>   mlds->SetInputConnection( centerImage->GetOutputPort() ) ;
>   mlds->SetMemoryLimit ( 10000 ) ;
>   mlds->Update();
> 
> maybe it is a problem in my concept. i want to load a dataset with a 
> resolution of 512 x 512 x 1024. are there other ways to handle such a 
> load of data?
> my dev environment is visual studio 2005 on winXP SP2
> 
> thanks for any information and suggestions
