Compressing and decompressing large amounts of data in Java?
I need to compress/decompress different types of files contained in a folder; that folder might be larger than 10-11 GB. I used the following code, but it is taking a long time to compress the data.
BufferedReader in = new BufferedReader(new FileReader("D:/ziptest/expansion1.MPQ"));
BufferedOutputStream out = new BufferedOutputStream(
new GZIPOutputStream(new FileOutputStream("test.gz")));
int c;
while ((c = in.read()) != -1)
out.write(c);
in.close();
out.close();
Please suggest a fast compression and decompression library for Java. I also want to split the large file into parts, e.g. chunks of 100 MB each.
Reader/Writer is only for text; if you try to read binary data with these, it will get corrupted.
Instead, I suggest you use FileInputStream. The fastest way to copy the data is to use your own buffer.
InputStream in = new FileInputStream("D:/ziptest/expansion1.MPQ");
OutputStream out = new GZIPOutputStream(
        new BufferedOutputStream(new FileOutputStream("test.gz")));
byte[] bytes = new byte[32 * 1024];
int len;
while ((len = in.read(bytes)) > 0)
    out.write(bytes, 0, len);
in.close();
out.close();
Since you are reading large chunks of bytes, it is more efficient not to use a BufferedInputStream, as that only adds an extra copy. There is, however, a BufferedOutputStream after the GZIPOutputStream, because you cannot control the size of the data it produces.
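As for splitting the output into roughly 100 MB parts: one simple option is to compress to a single file first and then split the compressed bytes afterwards. A rough sketch of that (the method name splitIntoParts and the partSize parameter are just made up for illustration, not from any library; each part may overshoot the limit by at most one buffer):

// Split a finished file (e.g. test.gz) into numbered parts of about partSize bytes each.
static void splitIntoParts(File source, long partSize) throws IOException {
    byte[] buffer = new byte[32 * 1024];
    InputStream in = new BufferedInputStream(new FileInputStream(source));
    int part = 0;
    int len = in.read(buffer);
    while (len > 0) {
        long written = 0;
        OutputStream out = new BufferedOutputStream(
                new FileOutputStream(source.getPath() + ".part" + part++));
        // Fill this part until it reaches partSize, carrying any unwritten buffer into the next part.
        while (len > 0 && written < partSize) {
            out.write(buffer, 0, len);
            written += len;
            len = in.read(buffer);
        }
        out.close();
    }
    in.close();
}

You would call it as e.g. splitIntoParts(new File("test.gz"), 100L * 1024 * 1024). To restore the data, concatenate the parts back in order (for example with a SequenceInputStream) before handing them to a GZIPInputStream; a single part on its own is not a valid gzip file.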
BTW: If you are only reading this with Java, you can use DeflaterOutputStream; it's slightly faster and smaller, but only supported by Java AFAIK.
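If you want to try that, the copy loop stays exactly the same and only the stream types change. A quick sketch, with placeholder file names (DeflaterOutputStream and InflaterInputStream are both in java.util.zip):

InputStream in = new FileInputStream("D:/ziptest/expansion1.MPQ");
// DeflaterOutputStream writes raw DEFLATE data, without the GZIP header and CRC.
OutputStream out = new DeflaterOutputStream(
        new BufferedOutputStream(new FileOutputStream("test.deflated")));
byte[] bytes = new byte[32 * 1024];
int len;
while ((len = in.read(bytes)) > 0)
    out.write(bytes, 0, len);
in.close();
out.close();

// To read it back, wrap the file in an InflaterInputStream instead of a GZIPInputStream.
InputStream back = new InflaterInputStream(
        new BufferedInputStream(new FileInputStream("test.deflated")));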