
Compressing binary data

In one of the working steps of my algorithm I have a big array of binary data that I want to compress.

Which algorithm (or perhaps a standard class) would you advise using to compress the data as efficiently as possible?

EDIT:

The data is first represented as a byte[n] array of 0s and 1s. I then pack every 8 of those bytes into 1, producing a byte[n/8] array.
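The packing step described above might look like the following sketch. The class name `BitPacker` and the bit ordering (most significant bit first) are assumptions, since the question does not show the actual code.

```csharp
using System;

class BitPacker
{
    // Collapse a byte[] holding only 0s and 1s into one bit per entry.
    // Bits are packed most-significant-bit first within each output byte.
    public static byte[] Pack(byte[] bits)
    {
        byte[] packed = new byte[(bits.Length + 7) / 8];
        for (int i = 0; i < bits.Length; i++)
        {
            if (bits[i] != 0)
                packed[i / 8] |= (byte)(1 << (7 - (i % 8)));
        }
        return packed;
    }

    static void Main()
    {
        byte[] bits = { 1, 0, 0, 0, 0, 0, 0, 1 }; // 0b10000001
        Console.WriteLine(Pack(bits)[0]); // prints 129
    }
}
```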


GZipStream and DeflateStream are the standard classes to use in such situations.
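As a minimal sketch, compressing and decompressing a byte[] entirely in memory with `GZipStream` looks like this (the helper names are illustrative, not part of the framework):

```csharp
using System;
using System.IO;
using System.IO.Compression;

class GzipExample
{
    public static byte[] Compress(byte[] data)
    {
        using (var output = new MemoryStream())
        {
            // GZipStream wraps the output stream; disposing it flushes the footer.
            using (var gzip = new GZipStream(output, CompressionMode.Compress))
            {
                gzip.Write(data, 0, data.Length);
            }
            return output.ToArray();
        }
    }

    public static byte[] Decompress(byte[] compressed)
    {
        using (var input = new MemoryStream(compressed))
        using (var gzip = new GZipStream(input, CompressionMode.Decompress))
        using (var result = new MemoryStream())
        {
            gzip.CopyTo(result);
            return result.ToArray();
        }
    }

    static void Main()
    {
        byte[] data = new byte[1024]; // highly repetitive input compresses well
        byte[] compressed = Compress(data);
        byte[] restored = Decompress(compressed);
        Console.WriteLine(compressed.Length < data.Length);
        Console.WriteLine(restored.Length == data.Length);
    }
}
```

Substituting `DeflateStream` for `GZipStream` gives the same data without the gzip header and checksum, which saves a few bytes per stream.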

Obviously, depending on the binary data you are trying to compress, you will get a better or worse compression ratio. For example, if you try to compress a JPEG image with these algorithms, you cannot expect a very good ratio, because the data is already compressed. If, on the other hand, the binary data represents text, it will compress nicely.


I'll add DotNetZip and SharpZipLib. The .NET "base" classes (GZipStream/DeflateStream) are stream-based, so they compress a single stream of data (a stream is not a file, but the contents of a file can be read as a stream). DotNetZip is more similar to the classic PkZip/WinZip/WinRar tools: it produces archives that can contain multiple files.
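To illustrate the difference, here is a hedged sketch of DotNetZip's archive-oriented API (the `Ionic.Zip` namespace from the DotNetZip package; the file names are placeholders). Unlike GZipStream, which compresses one anonymous stream, this produces a .zip archive with named entries:

```csharp
// Requires the DotNetZip package (Ionic.Zip); not part of the .NET base libraries.
using Ionic.Zip;

class ZipArchiveExample
{
    static void Main()
    {
        using (var zip = new ZipFile())
        {
            zip.AddFile("data.bin");                          // add an existing file
            zip.AddEntry("readme.txt", "packed binary data"); // or in-memory content
            zip.Save("archive.zip");
        }
    }
}
```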

