IIS Compression vs manual GZIP

I'm currently working on a project that passes some geo data through a WCF service. This data can get fairly large (2-4 MB sometimes, and even more in special cases). To help reduce the size of the data over the wire we initially enabled GZIP compression on IIS (this worked wonders). Alas, in testing we found that one of the proxies we use makes this worthless.

So, instead I've decided to compress the data itself before it's sent out of our service. I'm using SharpZipLib in both WCF and our Silverlight client. It works well and shrinks our data from about 2.9 MB to about 400 KB; however, the IIS compression was able to bring things down even further.

Now I'm curious...

  1. Is there any secret sauce behind the IIS GZip compression that makes it compress better?

  2. Is there a better compression algorithm that could be used?


1) Yes, there is a secret sauce, but I don't know what it is.

But this is what I really wanted to say:

2) Chasing higher compression ratios risks slowing your site down. Don't do it. Micro-optimisation is not helpful at this level.


Is there any secret sauce behind the IIS GZip compression that makes it compress better?

In SharpZipLib you can call SetLevel(9) to set maximum compression. However, remember that when IIS compresses, the entire response is compressed; when you compress yourself, only part of the payload is. So IIS will always be able to compress slightly more.
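For what it's worth, a minimal sketch of that with SharpZipLib's GZipOutputStream (the helper name and the byte[] payload are just placeholders, not code from the question):

```csharp
using System.IO;
using ICSharpCode.SharpZipLib.GZip;

static byte[] CompressMax(byte[] payload)
{
    using (var output = new MemoryStream())
    {
        using (var gzip = new GZipOutputStream(output))
        {
            gzip.SetLevel(9);                       // 9 = maximum compression, slowest
            gzip.Write(payload, 0, payload.Length);
        }                                           // disposing the stream writes the gzip footer
        return output.ToArray();                    // ToArray still works after the stream is closed
    }
}
```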

Is there a better compression algorithm that could be used?

Through IIS, not really. There are only so many compression methods usable through HTTP: http://en.wikipedia.org/wiki/HTTP_compression.

With custom compression you can try 7-Zip, LZH, and so on: anything you can find a library for or write yourself. A lot depends on what you are compressing, since different payloads compress differently. I'd try the algorithms built into SharpZipLib first (BZip2), and I'd also try 7-Zip's LZMA (there is a C# SDK for it).
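If you want a quick comparison, something along these lines with SharpZipLib's BZip2OutputStream would do (the helper name is a placeholder; whether it actually beats GZip depends entirely on your data):

```csharp
using System.IO;
using ICSharpCode.SharpZipLib.BZip2;

static byte[] CompressBZip2(byte[] payload)
{
    using (var output = new MemoryStream())
    {
        using (var bzip2 = new BZip2OutputStream(output))
        {
            bzip2.Write(payload, 0, payload.Length);
        }                                           // closing finishes the bzip2 stream
        return output.ToArray();
    }
}
```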


I toyed with the various compression options for a little bit and then it dawned on me. The WCF endpoint is set up to use binaryEncoding, which means IIS takes the binary-encoded data and applies its compression to that.

In my case I was serializing the data using the standard DataContractSerializer and a MemoryStream. This, however, spits out XML.

The best solution we found was to use the BinaryDictionaryWriter with my DataContractSerializer. This gives me binary-encoded data that I can then compress using GZIP. The final result is better compression than we got with IIS (on the order of 700 KB via IIS down to 500 KB using this method).

You can see an example of how to use the BinaryDictionaryWriter in the following post (it's the answer below the accepted answer): How to transfer large amount of data using WCF?
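Roughly, the serialize-then-compress step looks something like the sketch below (GeoData is a placeholder contract and the helper name is illustrative, not the actual code from my project):

```csharp
using System.IO;
using System.Runtime.Serialization;
using System.Xml;
using ICSharpCode.SharpZipLib.GZip;

[DataContract]
public class GeoData
{
    [DataMember] public string Shapes { get; set; }   // placeholder member
}

static byte[] SerializeAndCompress(GeoData data)
{
    byte[] binaryXml;
    using (var buffer = new MemoryStream())
    {
        var serializer = new DataContractSerializer(typeof(GeoData));
        // CreateBinaryWriter produces binary XML instead of text XML
        using (var writer = XmlDictionaryWriter.CreateBinaryWriter(buffer))
        {
            serializer.WriteObject(writer, data);
        }
        binaryXml = buffer.ToArray();                  // works even if the writer closed the stream
    }

    using (var output = new MemoryStream())
    {
        using (var gzip = new GZipOutputStream(output))
        {
            gzip.SetLevel(9);                          // maximum compression
            gzip.Write(binaryXml, 0, binaryXml.Length);
        }
        return output.ToArray();
    }
}
```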

Next I'm going to see what effect removing the binary encoding from the endpoint has, and whether the performance is worth the extra layer of "stuff".
