
Why did .NET's EnableDecompression default value change between 2.0 and 3.0?

We use .NET Web Services pretty heavily (both non-WCF and WCF, though the overwhelming majority is non-WCF, for legacy reasons), and while testing something in Fiddler I noticed that the response bodies were quite large. Then I noticed that the requests didn't include an Accept-Encoding header at all.

After doing some digging, it appears that the default value of HttpWebClientProtocol.EnableDecompression (the base class from which all wsdl.exe-generated web service stubs ultimately derive) changed between .NET 2.0 and 3.0. I'm curious about the reason (which may be WCF-related), and also whether there are any other fairly fundamental changes that happen quietly when you simply link against a different version of the framework libraries.
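
For context, here is a minimal sketch of what I observed, assuming a hypothetical wsdl.exe-generated proxy class called MyLegacyServiceProxy (the real class name comes from your WSDL; the generated stub derives from SoapHttpClientProtocol, which in turn derives from HttpWebClientProtocol):

    using System;

    class Program
    {
        static void Main()
        {
            // The generated proxy inherits EnableDecompression from
            // HttpWebClientProtocol.
            var proxy = new MyLegacyServiceProxy();

            // When this is false (the newer default), the request carries no
            // Accept-Encoding header and the SOAP response comes back
            // uncompressed; under 2.0 it defaulted to true.
            Console.WriteLine(proxy.EnableDecompression);
        }
    }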


Take a look at this Microsoft Connect link. The first comment from Microsoft states the following:

Please also note that as part of the fix we changed the default value of EnableDecompression to be false by default. We were concerned that having it on by default would break existing customers that had implemented decompression on top of ASP.NET Web Services in v1.1.

It looks like the change was a result of a bug that they needed to fix.
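
If you want the old behaviour back, you can opt in explicitly on each proxy instance. A minimal sketch, again using the hypothetical MyLegacyServiceProxy stub: setting EnableDecompression to true makes the proxy's requests advertise gzip/deflate via Accept-Encoding and decompress the response automatically, provided the server actually compresses its SOAP responses.

    using (var proxy = new MyLegacyServiceProxy())
    {
        // Re-enable automatic decompression for this proxy instance.
        proxy.EnableDecompression = true;

        // proxy.SomeWebMethod(...);  // subsequent calls request compressed responses
    }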
