Out of memory (?) problem on Win32 (vs. Linux)

I have the following problem:

A program run on a Windows machine (32-bit, 3.1 GB memory, both VC++ 2008 and MinGW compiled code) fails with a bad_alloc exception (thrown after allocating around 1.2 GB, when trying to allocate a vector of 9 million doubles, i.e. around 75 MB) with plenty of RAM still available (at least according to Task Manager).

The same program runs fine on Linux machines (32-bit, 4 GB memory; 32-bit, 2 GB memory) with a peak memory usage of around 1.6 GB. Interestingly, the Win32 code generated by MinGW also fails with a bad_alloc when run on the 4 GB Linux machine under Wine, albeit at a different (later) point than when run under Windows...
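For reference, here is a stripped-down probe (not my actual program, but it shows the same pattern): it keeps allocating blocks of the same size as the failing vector until bad_alloc is thrown.

    #include <iostream>
    #include <vector>
    #include <new>

    int main()
    {
        // Keep allocating blocks of 9 million doubles (~75 MB each, the
        // same size as the vector that fails in the real program) until
        // the allocator gives up.
        std::vector<std::vector<double> > blocks;
        // Reserve up front so growing the outer vector never copies the
        // blocks themselves (C++03 has no move semantics).
        blocks.reserve(64);
        try {
            for (;;)
                blocks.push_back(std::vector<double>(9000000));
        } catch (const std::bad_alloc&) {
            std::cout << "bad_alloc after " << blocks.size() << " blocks (~"
                      << blocks.size() * 9000000.0 * sizeof(double) / (1024.0 * 1024.0)
                      << " MB)" << std::endl;
        }
        return 0;
    }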

What are the possible problems?

  • Heap fragmentation? (How would I know? How can this be solved?)
  • Heap corruption? (I have run the code with pageheap.exe enabled and no errors were reported; I implemented vector access with bounds checking, again no errors; the code is essentially free of pointers, only std::vectors and std::lists are used. Running the program under Valgrind (memcheck) consumes too much memory and ends prematurely, but it does not find any errors.)
  • Out of memory??? (There should be enough memory)

Moreover, what could be the reason that the Windows version fails while the Linux version works, even on machines with less memory? (Also note that the /LARGEADDRESSAWARE linker flag is used with VC++ 2008, if that has any effect.)

Any ideas would be much appreciated; I am at my wits' end with this... :-(


It has nothing to do with how much RAM is in your system. You are running out of virtual address space. A 32-bit Windows process gets a 4 GB virtual address space regardless of how much RAM is installed, split into 2 GB for user mode (3 GB if the process is LARGEADDRESSAWARE) and 2 GB for the kernel. When you allocate memory with new, the allocator has to find a contiguous block of virtual address space large enough to satisfy the request. If your virtual address space is badly fragmented, or you ask for a huge block, this fails and a bad_alloc exception is thrown. Check how much virtual memory your process is actually using.
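For example, a small Win32 diagnostic along these lines (a sketch, most error handling omitted) reports both the overall usage and, more importantly, the largest single free region, which is the hard limit on any one allocation:

    #include <windows.h>
    #include <iostream>

    int main()
    {
        // Overall virtual address space figures for this process.
        MEMORYSTATUSEX ms;
        ms.dwLength = sizeof(ms);
        GlobalMemoryStatusEx(&ms);
        std::cout << "Virtual address space: "
                  << (ms.ullTotalVirtual - ms.ullAvailVirtual) / (1024 * 1024)
                  << " MB used of "
                  << ms.ullTotalVirtual / (1024 * 1024) << " MB" << std::endl;

        // Walk the address space to find the largest free contiguous
        // region; a single allocation can never be bigger than this,
        // no matter how much total space is free.
        SYSTEM_INFO si;
        GetSystemInfo(&si);
        SIZE_T largest = 0;
        char* p = static_cast<char*>(si.lpMinimumApplicationAddress);
        while (p < static_cast<char*>(si.lpMaximumApplicationAddress)) {
            MEMORY_BASIC_INFORMATION mbi;
            if (VirtualQuery(p, &mbi, sizeof(mbi)) != sizeof(mbi))
                break;
            if (mbi.State == MEM_FREE && mbi.RegionSize > largest)
                largest = mbi.RegionSize;
            p = static_cast<char*>(mbi.BaseAddress) + mbi.RegionSize;
        }
        std::cout << "Largest free region: "
                  << largest / (1024 * 1024) << " MB" << std::endl;
        return 0;
    }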


With Windows XP x86 and the default settings, 1.2 GB is about all the address space left for your heap after system libraries, your code, the stack, and other things take their share. Note that /LARGEADDRESSAWARE only has an effect if you also boot with the /3GB flag, which can give your process up to 3 GB. The /3GB flag causes instability on many XP systems, which is why it is not enabled by default.

Server variants of x86 Windows give you more room, both by supporting the 3 GB/1 GB split and by using PAE to make the full 4 GB of physical RAM usable.

Linux x86 uses a 3GB/1GB split by default.

A 64-bit OS would give you more address space, even for a 32-bit process.


Are you compiling in Debug mode? If so, the allocations will carry a large amount of extra debugging data, which might be enough to produce the error you have seen as a genuine out-of-memory. Try a Release build to see if that solves the problem.

I have only experienced this with VC, not MinGW, but then I haven't checked the latter either, so this could still explain the problem.


To elaborate on the virtual memory issue: your application fails when it tries to allocate a single block of roughly 75 MB (the vector of 9 million doubles) and no contiguous span of virtual address space that large is left. You might be able to get a little further by switching to data structures that use less contiguous memory, for example a class that approximates a huge array by using a tree where all the data is kept in 1 MB (or so) leaf nodes. Also, in C++, when doing a huge number of allocations it really helps if all the big allocations are of the same size; this lets memory be reused and keeps fragmentation much lower. A rough sketch of the idea follows below.
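Something like this (a simplified flat version rather than a tree, with a made-up ChunkedArray name and a 1 MB chunk size, purely for illustration):

    #include <cstddef>
    #include <vector>

    // Illustrative only: a big "array" of doubles stored as many equally
    // sized ~1 MB chunks. Every allocation is small and uniform, so no
    // single huge contiguous region is ever required, and freed chunks
    // are easy for the allocator to reuse.
    class ChunkedArray {
        static const std::size_t CHUNK = 131072;   // doubles per chunk (1 MB)
        std::vector<std::vector<double> > chunks_;
        std::size_t size_;
    public:
        explicit ChunkedArray(std::size_t n) : size_(n) {
            chunks_.resize((n + CHUNK - 1) / CHUNK);
            for (std::size_t i = 0; i < chunks_.size(); ++i) {
                bool last = (i + 1 == chunks_.size());
                chunks_[i].resize(last ? n - i * CHUNK : CHUNK);
            }
        }
        double&       operator[](std::size_t i)       { return chunks_[i / CHUNK][i % CHUNK]; }
        const double& operator[](std::size_t i) const { return chunks_[i / CHUNK][i % CHUNK]; }
        std::size_t   size() const                    { return size_; }
    };

If you don't want to roll your own, note that std::deque<double> already stores its elements in many smaller blocks rather than one contiguous buffer, so simply replacing the failing std::vector<double> with a std::deque<double> may be enough to get past the single 75 MB allocation.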

However, the correct thing to do in the long run is simply to switch to a 64-bit system.
