
Web Optimization: Why are Combined Files Faster?

I have read that combining all of your CSS files into one big one, or all of your script files into a single script file, reduces the number of HTTP requests and therefore speeds up downloading.

But I don't understand this. I thought that if you had multiple files (up to a limit, which I believe is 10 on modern browsers), the browser would download them in parallel, thus REDUCING the total time to download (divided by the number of connections allowed).

I am obviously missing a key piece of info here. Can someone turn on the lights?


There's overhead in every request/response. That's essentially what it comes down to.

Here's an example of a request header to Google ...

GET http://www.google.com/ HTTP/1.1
Accept: application/x-ms-application, image/jpeg, application/xaml+xml, image/gif, image/pjpeg, application/x-ms-xbap, application/vnd.ms-excel, application/vnd.ms-powerpoint, application/msword, application/x-shockwave-flash, */*
Accept-Language: en-US
User-Agent: Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; WOW64; Trident/4.0; GTB0.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; OfficeLiveConnector.1.4; OfficeLivePatch.1.3)
Accept-Encoding: gzip, deflate
Connection: Keep-Alive
Host: www.google.com
Cookie: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx...
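To make the overhead concrete, here's a back-of-envelope comparison in Python. All the byte counts are illustrative assumptions, not measurements, but they're in the ballpark of the header dump above:

```python
# Rough per-request overhead vs. a small asset (all numbers assumed)
request_headers = 700    # bytes of request headers, excluding cookies
cookies = 400            # bytes of cookie data sent with every request
response_headers = 300   # bytes of response headers
small_css_file = 800     # bytes of actual CSS content

overhead = request_headers + cookies + response_headers
ratio = overhead / small_css_file
print(f"Overhead per request: {overhead} bytes ({ratio:.0%} of the payload)")
```

With numbers like these, the headers alone outweigh the file being fetched, which is why many tiny requests cost more than one larger one.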

I wrote an article about this last year... http://swortham.blogspot.com/2010/03/latency-requests-css-sprites-and-you.html

You are right that multiple files can be downloaded in parallel (2 or more from a single hostname, depending on the browser). That in turn causes the page to load progressively, which is good. But that doesn't mean your homepage should be composed of 20+ CSS, JS, and image files. Ideally you'd combine quite a bit of that to optimize the site.
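"Combining" here just means concatenating the files so the browser fetches one resource instead of several. A minimal sketch (file names and contents are made up for the example; a real build step would work on your actual stylesheets):

```python
import tempfile
from pathlib import Path

# Hypothetical stylesheets standing in for your real CSS files
styles = {
    "reset.css":  "* { margin: 0; padding: 0; }",
    "layout.css": "#main { width: 960px; }",
}

workdir = Path(tempfile.mkdtemp())
for name, css in styles.items():
    (workdir / name).write_text(css)

# One combined file means one HTTP request instead of len(styles)
combined = "\n".join((workdir / name).read_text() for name in styles)
(workdir / "site.css").write_text(combined)
print(combined)
```

Order matters for CSS (later rules override earlier ones), so the concatenation should preserve the order the files were linked in.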


Several things:

  • TCP setup and teardown - with the widespread use of keep-alive and pipelining, this is no longer too significant, except with some proxies which fall back to this "one HTTP request - one TCP connection" model for compatibility reasons.
  • HTTP headers overhead - this could be significant for small files - hundreds of bytes of headers can be larger than the response body.
  • latency (time from request start to response start) - this is somewhat reduced with keep-alive and pipelining
  • limits on parallel downloads - this is the main one. IE6 used to limit this to 2 connections per hostname, it has been bumped to 6 in IE8 (other browsers have had sane limits for a while). See this older study on further parallelizing this ("use 4 domain names instead of 1").
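The last point can be put into a rough model. All figures below are assumed for illustration (100 ms of latency per request, a 2-connection limit as in IE6, 20 files of 20 KB each over a 1 MB/s link), ignoring pipelining:

```python
import math

latency = 0.100        # seconds of per-request latency (assumed)
bandwidth = 1_000_000  # bytes/second (assumed)
file_size = 20_000     # bytes per file (assumed)
n_files = 20
parallel = 2           # IE6-era per-hostname connection limit

# Separate files: requests proceed in waves of `parallel`,
# and each wave pays the latency cost again.
waves = math.ceil(n_files / parallel)
separate = waves * (latency + file_size / bandwidth)

# One combined file: a single request pays the latency once.
combined = latency + (n_files * file_size) / bandwidth

print(f"separate: {separate:.2f}s, combined: {combined:.2f}s")
```

Even in this crude model the combined file wins, because the repeated latency dominates; with higher parallel limits (or extra hostnames) the gap narrows but doesn't vanish.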
