Huge arrays throw OutOfMemoryError despite enough memory being available
Using the -Xmx1G flag to provide a heap of one gigabyte, the following works as expected:
public class Biggy {
public static void main(String[] args) {
int[] array = new int[150 * 1000 * 1000];
}
}
The array should represent around 600 MB.
However, the following throws OutOfMemoryError:
public class Biggy {
public static void main(String[] args) {
int[] array = new int[200 * 1000 * 1000];
}
}
This is despite the fact that the array should represent around 800 MB and therefore easily fit in memory.
Where's the missing memory gone?
In Java the heap is typically divided into multiple regions (and sub-regions). With most collectors you have a young and a tenured region. Large arrays are allocated in the tenured area straight away; however, based on your maximum memory size, some space will be reserved for the young space. If you allocate memory slowly these regions will resize, but a single large block like this can simply fail, as you have seen.
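As a quick check (a minimal sketch, not part of the original answer), the standard Runtime API shows how much of the -Xmx limit the JVM actually exposes at a given moment:

```java
public class HeapInfo {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024 * 1024;
        // maxMemory: the -Xmx bound; totalMemory: heap currently committed;
        // freeMemory: unused portion of the committed heap.
        System.out.println("max:   " + rt.maxMemory() / mb + " MB");
        System.out.println("total: " + rt.totalMemory() / mb + " MB");
        System.out.println("free:  " + rt.freeMemory() / mb + " MB");
    }
}
```

Running this with -Xmx1G typically reports a usable maximum noticeably below 1024 MB, which reflects the space reserved for other heap regions.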
Given that memory is usually relatively cheap (not always the case), I would just increase the maximum to the point where you would want the application to fail if it ever used that much.
BTW: If you have a large structure like this you might consider using direct memory.
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.IntBuffer;

IntBuffer array = ByteBuffer.allocateDirect(200 * 1000 * 1000 * 4)
        .order(ByteOrder.nativeOrder()).asIntBuffer();

int a = array.get(n);
array.put(n, a + 1);
It's a bit tedious to write but has one big advantage: it uses almost no heap (there is less than 1 KB of overhead).
There is enough memory available, but not as a single contiguous block of memory, as needed for an array.
Can you use a different data structure that uses smaller blocks of memory, or several smaller arrays?
For example, the following code does work with -Xmx1G:
public class Biggy {
public static void main(String[] args) {
int[][] array = new int[200][];
for (int i = 0; i < 200; i++) {
array[i] = new int[1000 * 1000];
System.out.println("i=" + i);
}
}
}
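If the chunked layout is inconvenient to index, a thin wrapper can present it as one flat array. The following is a sketch; BigIntArray is a hypothetical helper written for illustration, not a library class:

```java
// Sketch: presents several smaller int[] chunks as one flat, indexable array,
// avoiding the need for a single contiguous block. BigIntArray is hypothetical.
public class BigIntArray {
    private static final int CHUNK = 1000 * 1000; // elements per chunk
    private final int[][] chunks;

    public BigIntArray(long size) {
        int n = (int) ((size + CHUNK - 1) / CHUNK); // number of chunks, rounded up
        chunks = new int[n][];
        for (int i = 0; i < n; i++) {
            chunks[i] = new int[CHUNK];
        }
    }

    public int get(long index) {
        return chunks[(int) (index / CHUNK)][(int) (index % CHUNK)];
    }

    public void set(long index, int value) {
        chunks[(int) (index / CHUNK)][(int) (index % CHUNK)] = value;
    }
}
```

Each get or set costs one extra array dereference compared to a plain int[], which is usually a fair trade for being able to allocate the structure at all.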
Heap memory is divided between three spaces:
- Old Generation
- Survivor Space
- Eden Space
From the start this object will live in the old generation and will remain there for a while.
By default, the virtual machine grows or shrinks the heap at each collection to try to keep the proportion of free space to live objects at each collection within a specific range. This target range is set as a percentage by the parameters -XX:MinHeapFreeRatio= and -XX:MaxHeapFreeRatio=, and the total size is bounded below by -Xms and above by -Xmx.
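For instance (a config fragment with illustrative values, not taken from the source), these parameters can be combined on the command line like this:

```shell
# Illustrative only: 30%/70% free-ratio targets within a 256 MB to 1 GB heap
java -Xms256m -Xmx1g -XX:MinHeapFreeRatio=30 -XX:MaxHeapFreeRatio=70 Biggy
```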
The default ratio in my JVM is 30/70, so the maximum size of an object in the old generation is limited (with -Xmx1G) to about 700 MB (by the way, I get the same exception when running with default JVM parameters).
However, you can size the generations using JVM options. For example, if you run your class with -Xmx1G -XX:NewRatio=10, then new int[200 * 1000 * 1000] will succeed.
From what I can tell, Java wasn't designed to hold large objects in memory. The typical memory usage pattern in an application is a graph of many relatively small objects, and typically you'll get an OutOfMemoryError only if you run out of space in all of the spaces.
Below are a couple of useful (and interesting to read) articles:
Ergonomics in the 5.0 Java[tm] Virtual Machine
Tuning Garbage Collection with the 5.0 Java[tm] Virtual Machine