
state of the art lossy compression program [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.

Questions asking for code must demonstrate a minimal understanding of the problem being solved. Include attempted solutions, why they didn't work, and the expected results. See also: Stack Overflow question checklist

Closed 9 years ago.


Does anyone know of a state-of-the-art LOSSY compression program for data BESIDES music and images? I need an actual executable or compilable source code.

I am trying to compress AMillionRandomDigits.bin.

The idea is to lossily compress AMillionRandomDigits.bin, then store LOSSY_COMPRESSED(amillionrandomdigits.bin) + DIFF(LOSSY_UNCOMPRESSED, amillionrandomdigits.bin). http://www.stanford.edu/~hwang41/
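A minimal sketch of the proposed scheme, assuming the "lossy" stage is simple quantization and the residual is a byte-wise XOR; all function names here are illustrative, not from any real tool:

```python
import zlib

def lossy_compress(data: bytes) -> bytes:
    """'Lossy' stage: zero the low nibble of every byte (quantization),
    then deflate the now more regular stream."""
    quantized = bytes(b & 0xF0 for b in data)
    return zlib.compress(quantized, 9)

def lossy_decompress(blob: bytes) -> bytes:
    return zlib.decompress(blob)

def diff(approx: bytes, original: bytes) -> bytes:
    """Residual needed to restore the original exactly (here: XOR)."""
    return bytes(a ^ b for a, b in zip(approx, original))

def reconstruct(blob: bytes, residual: bytes) -> bytes:
    approx = lossy_decompress(blob)
    return bytes(a ^ r for a, r in zip(approx, residual))

data = bytes(range(256)) * 16          # stand-in for amillionrandomdigits.bin
blob = lossy_compress(data)
residual = diff(lossy_decompress(blob), data)
assert reconstruct(blob, residual) == data   # lossless round trip
```

Note the catch: for truly random input the residual carries all the entropy the lossy stage threw away, so |blob| + |residual| cannot come out smaller than the original. The scheme only wins when the data has structure the lossy stage captures, which is exactly what the answer below argues.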


@user562688: Compressing truly random data can't be done. The proof idea: if you try to compress every 100-bit string down to 90 bits, you need all 2^100 strings to fit into a space of size 2^90, which is too small. There will therefore be many collisions (2^10 per codeword on average), which means you cannot decode back to the original string.
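The pigeonhole argument is easy to check empirically: feeding pseudo-random bytes to a general-purpose lossless compressor produces output no smaller than the input. A sketch using Python's zlib, with the 10^6-byte size mirroring the file in the question:

```python
import random
import zlib

random.seed(0)
data = bytes(random.getrandbits(8) for _ in range(1_000_000))

compressed = zlib.compress(data, 9)
# Incompressible input: deflate falls back to stored blocks, so the
# "compressed" stream is the data plus framing overhead -- slightly larger.
assert len(compressed) >= len(data)
```

The same test on any structured file (text, a sorted list, the quantized stream above) shrinks substantially, which is the whole difference between random and structured data.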

But to answer your original question: although the Johnson-Lindenstrauss algorithm isn't a compression algorithm per se, it has some properties similar to what's done in image compression.

The goal of the Johnson-Lindenstrauss algorithm is to take many vectors (say n vectors) in R^d and map them into a much smaller space, R^O(log n), such that the pairwise distances between the vectors change by at most a small factor.
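A minimal sketch of the JL idea using a random Gaussian projection; the dimensions, the eps = 0.5 distortion bound, and the 8·log(n)/eps² target dimension are illustrative choices, not tuned constants:

```python
import math
import random

random.seed(1)

n, d = 32, 300                  # n vectors living in R^d
eps = 0.5                       # allowed relative distortion
k = int(8 * math.log(n) / eps ** 2)   # target dimension ~ O(log n / eps^2)

vectors = [[random.gauss(0, 1) for _ in range(d)] for _ in range(n)]

# Random Gaussian projection, scaled by 1/sqrt(k) so squared
# distances are preserved in expectation.
proj = [[random.gauss(0, 1) / math.sqrt(k) for _ in range(d)]
        for _ in range(k)]

def project(v):
    return [sum(p * x for p, x in zip(row, v)) for row in proj]

def dist(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

low = [project(v) for v in vectors]    # each now lives in R^k, k << d

# Every pairwise distance survives within a (1 ± eps) factor.
for i in range(n):
    for j in range(i + 1, n):
        ratio = dist(low[i], low[j]) / dist(vectors[i], vectors[j])
        assert 1 - eps <= ratio <= 1 + eps
```

This is the sense in which it resembles lossy image compression: a drastic reduction in stored coordinates that approximately preserves the geometry, at the cost of never being able to recover the original vectors exactly.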

