
OutOfMemory Error while trying to extract a large jar using ZipFileSet

With JDK 1.5, I get an OutOfMemoryError while trying to extract a reasonably large jar. However, this does not happen on JDK 6. Is it because of different default heap-size/permgen settings on JDK 1.5 and JDK 6, or is this a bug in JDK 1.5 that was fixed in JDK 6?

import java.io.*;
import java.util.zip.*;

public class UnZip {
   static final int BUFFER = 2048;

   public static void main(String[] argv) {
      try {
         // open the archive given on the command line
         FileInputStream fis = new FileInputStream(argv[0]);
         ZipInputStream zis = new ZipInputStream(new BufferedInputStream(fis));
         ZipEntry entry;
         while ((entry = zis.getNextEntry()) != null) {
            System.out.println("Extracting: " + entry);
            int count;
            byte[] data = new byte[BUFFER];
            // copy the current entry to disk in BUFFER-sized chunks
            FileOutputStream fos = new FileOutputStream(entry.getName());
            BufferedOutputStream dest = new BufferedOutputStream(fos, BUFFER);
            while ((count = zis.read(data, 0, BUFFER)) != -1) {
               dest.write(data, 0, count);
            }
            dest.flush();
            dest.close();
         }
         zis.close(); // also closes the underlying FileInputStream
      } catch (Exception e) {
         e.printStackTrace();
      }
   }
}


Quoting the Total Heap Size section of Tuning Garbage Collection with the 5.0 Java[tm] Virtual Machine:

Total Heap

(...)

By default, the virtual machine grows or shrinks the heap at each collection to try to keep the proportion of free space to live objects at each collection within a specific range. This target range is set as a percentage by the parameters -XX:MinHeapFreeRatio=<minimum> and -XX:MaxHeapFreeRatio=<maximum>, and the total size is bounded below by -Xms and above by -Xmx. The default parameters for the 32-bit Solaris Operating System (SPARC Platform Edition) are shown in this table:

Parameter                  Default Value
-XX:MinHeapFreeRatio=      40
-XX:MaxHeapFreeRatio=      70
-Xms                       3670k
-Xmx                       64m

Default values of heap size parameters on 64-bit systems have been scaled up by approximately 30%. This increase is meant to compensate for the larger size of objects on a 64-bit system.
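You can see what this actually amounts to on your own installation with a tiny check such as the following; this is just a minimal diagnostic sketch, the HeapCheck class name is arbitrary, and Runtime.maxMemory() reports the limit the JVM will try to use:

public class HeapCheck {
   // Prints the maximum heap the JVM will attempt to use, in megabytes.
   // Run it without any -Xmx flag on each JDK to compare the defaults.
   public static void main(String[] args) {
      long maxBytes = Runtime.getRuntime().maxMemory();
      System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
   }
}

With the 5.0 defaults above you should see a figure close to 64m; on Java 6 it will reflect the machine-dependent formula quoted next.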

And in Default Heap Size of Java SE 6 HotSpot Virtual Machine Garbage Collection Tuning, they write:

Default Heap Size

If not otherwise set on the command line, the initial and maximum heap sizes are calculated based on the amount of memory on the machine. The proportion of memory to use for the heap is controlled by the command line options DefaultInitialRAMFraction and DefaultMaxRAMFraction, as shown in the table below. (In the table, memory represents the amount of memory on the machine.)

                      Formula                                       Default
initial heap size     memory / DefaultInitialRAMFraction            memory / 64
maximum heap size     MIN(memory / DefaultMaxRAMFraction, 1GB)      MIN(memory / 4, 1GB)

Note that the default maximum heap size will not exceed 1GB, regardless of how much memory is installed on the machine.
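For example, on a machine with 2GB of RAM these defaults work out to an initial heap of 2048MB / 64 = 32MB and a maximum heap of MIN(2048MB / 4, 1GB) = 512MB; with 8GB of RAM the maximum is still capped at 1GB.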

So, yes, Java 6 has very different default heap settings: the maximum heap can grow to 1/4 of your RAM (capped at 1GB if you have more than 4GB), i.e. very likely much more than the 64m default of the 5.0 JVM on today's machines.
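If you have to stay on JDK 1.5, the usual workaround is to raise the limit explicitly instead of relying on the defaults, e.g. something like

java -Xmx256m UnZip large.jar

where 256m and the file name are only placeholders; pick a value that suits your machine and the size of the archive.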
