[vtkusers] Memory allocation problems under Python wrapped VTK in Windows

Lassi Paavolainen lopaavol at jyu.fi
Fri Jan 16 04:02:28 EST 2009


Hi,

I have been wondering about a memory allocation problem that seems to occur 
only on some of the machines running our software, so I wrote a simple test 
program for it (code attached at the end of this message).

The test program gave some interesting results that I cannot explain. I used 
the same code in a plain console application without VTK and inside a VTK 
class, but got different values for the maximum amount of memory that can be 
allocated dynamically. For the VTK class, the result was also platform 
dependent.

On to the results. I used the three systems described below:

Linux (64 bit): 2 GB of RAM, VTK 5.2, Python 2.5
Windows Vista (32 bit): 3 GB of RAM, VTK 5.2, Python 2.6
Windows XP Pro (32 bit): 8 GB of RAM (3 GB usable), VTK 5.2, Python 2.6

Exactly the same code was used on both Windows machines (compiled on Vista). 
I ran it both as a command-line executable and, through the Python 
interpreter, as a wrapped VTK class (a sketch of the Python invocation 
follows the results), and got the following results:

Linux:
- bash: about 3300 MB
- Python: about 3300 MB

Vista:
- Command line: about 1720 MB
- Python: about 1180 MB!!!

XP Pro:
- Command line: about 1920 MB
- Python: about 720 MB!!!
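
For reference, the Python side is driven roughly as in the sketch below. The 
module and class names ("bxdprocessing", "vtkMemoryAllocTest") are only 
hypothetical placeholders for our own wrapped library and test class, which 
run the same bisection loop as the C++ code attached at the end:

import bxdprocessing

# Hypothetical wrapped VTK class that runs the same allocation bisection
# as the attached C++ code.
test = bxdprocessing.vtkMemoryAllocTest()
test.Run()
print("Maximum size of memory allocated: %d MB"
      % (test.GetMaximumAllocation() / 1048576))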

So my question is: how is it possible that a VTK class run from Python on 
Windows cannot allocate as much contiguous memory as a simple standalone 
application can? The same thing happens if I try to allocate the memory 
through, for example, vtkUnsignedCharArray. My guess is that the problem is 
not in VTK itself but in the Python-wrapped VTK. Does anyone know where this 
memory limit, which depends on the Python wrapping and the platform, comes 
from?
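
For completeness, the vtkUnsignedCharArray variant looks roughly like the 
sketch below from the Python side. It is only a sketch: it assumes Allocate() 
reports failure by returning 0, and it caps the upper bound just under 2 GB 
because vtkIdType is a 32-bit int in a default VTK build:

import vtk

START_ALLOC = 1048576        # 1 MB: initial request and search tolerance
MAX_ALLOC = 2147483647       # capped near 2 GB (vtkIdType is a 32-bit int by default)
START_STEP = 536870912       # 512 MB: initial step size

# Suppress the VTK error output that appears when an allocation fails.
vtk.vtkObject.GlobalWarningDisplayOff()

alloc = START_ALLOC
maxAlloc = MAX_ALLOC
minAlloc = 0
step = START_STEP
while maxAlloc - minAlloc > START_ALLOC:
    # Keep the request within the vtkIdType range.
    if alloc > MAX_ALLOC:
        alloc = MAX_ALLOC
    arr = vtk.vtkUnsignedCharArray()
    # Allocate() is assumed to return 1 on success and 0 on failure.
    if arr.Allocate(alloc):
        minAlloc = alloc
        alloc += step
    else:
        maxAlloc = alloc
        step //= 2
        alloc = minAlloc + step
    del arr

print("Maximum vtkUnsignedCharArray allocation: %d MB" % (minAlloc / 1048576))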

Here is the memory allocation test code. It is probably not the best possible 
test, but it gives results similar to those of our real application.

#include <iostream>
#include <new>        // std::bad_alloc

// Bounds for the binary search of the largest contiguous allocation (bytes).
#define START_ALLOC 1048576LL      // 1 MB: initial request and search tolerance
#define MAX_ALLOC 4294967296LL     // 4 GB: upper bound of the search
#define START_STEP 536870912LL     // 512 MB: initial step size

int main(int argc, char* argv[])
{
  long long alloc = START_ALLOC;
  long long maxAlloc = MAX_ALLOC;
  long long minAlloc = 0;
  long long step = START_STEP;
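  // Bisect until the gap between the largest successful and the smallest
  // failed allocation is below the 1 MB tolerance.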
  while(maxAlloc - minAlloc > START_ALLOC) 
    {
    std::cout << "Allocating " << alloc << " bytes of memory." << std::endl;
    try 
      {
      char *mem = new char[alloc];
      minAlloc = alloc;
      alloc += step;
      delete[] mem;
      }
    catch (std::bad_alloc &ba)
      {
      std::cout << "Couldn't allocate " << alloc / 1048576 << " MB of memory." << std::endl;
      maxAlloc = alloc;
      alloc = minAlloc;
      step /= 2;
      alloc += step;
      }
    }

  std::cout << "Maximum size of memory allocated: " << minAlloc / 1048576 << " MB" << std::endl;

  return 0;
}

Regards,
Lassi Paavolainen


