Don't external JavaScript files lead to more client-side processing?
I was thinking about external JavaScript files, and it occurred to me that if I group the functions from several HTML pages into one JavaScript file, this will lead to extra client-side processing.
Basically, I would like some idea of whether this is correct.
Here is my thinking. Suppose I have one JavaScript file for five pages. If the user visits each page, then for each page he has to load not only the JavaScript for that page, but also the JavaScript for the other four pages. In the end, the user's browser has loaded about five times as much JavaScript as it would have normally.
I think most people group their JavaScript by common functionality. So one JavaScript file can serve several pages, even though not all of its code is used on every page. All the JavaScript a given page doesn't use is still loaded (and possibly run) without need.
I have a sub-question. I know the browser doesn't have to re-download the JavaScript file for each page, but is the file run each time? Is it reloaded? By reloaded, I mean: what kind of overhead is there each time the browser has to get a file out of the cache?
Thanks, Grae
If I have a file of 200 lines and separate it into 5 files of 40 lines each, the total number of lines remains 200. But remember that if I pulled files 1-4 on the previous page, I now only need to pull file 5, since files 1-4 are in my cache. Additionally, most modern browsers are going to parallelize those requests, so instead of a single large download for one file, I get several smaller downloads in parallel.
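As a toy sketch of that cache behavior (the filenames are made up, and a real browser cache is far more involved), a Set can model why only the not-yet-seen file costs a request on the next page:

```javascript
// Toy model of a browser cache: files 1-4 were fetched on the
// previous page, so only the file not already cached triggers
// another request on this page.
const cached = new Set(['file1.js', 'file2.js', 'file3.js', 'file4.js']);
const pageNeeds = ['file1.js', 'file2.js', 'file3.js', 'file4.js', 'file5.js'];

const toDownload = pageNeeds.filter(f => !cached.has(f));
console.log(toDownload); // only file5.js still has to be fetched
```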
The cache overhead would be pretty browser-specific in how each one handles it, and the exact implementation is above my head.
"the user goes to each page, for each page he has to load not only the JavaScript for that page, but the JavaScript for the other four pages"
If caching is set up correctly, the contrary will be true: the file will be loaded only once, at the start of the user's visit to the site. The overall amount of data to load will be reduced in most cases.
The JavaScript code for all the pages will be loaded into the browser's memory somehow, maybe even pre-parsed (I don't know the exact specifics of this), but that part of script processing is totally negligible.
It could still be wise to split your JS library into chunks if the per-page parts are totally separate and really huge; it will depend on your script's structure. But mostly, having one external file, and therefore one slightly bigger request the first time and none afterwards, is the preferable way.
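If you do split, one way to avoid loading unused chunks is to pull in a page-specific file only when that page needs it. A minimal sketch, assuming a hypothetical per-page chunk named reports.js (this runs in a browser, where document is available):

```javascript
// Minimal on-demand script loader sketch for the browser.
// 'reports.js' is a hypothetical page-specific chunk; the loader
// appends a <script> tag and resolves once it has loaded.
function loadScript(src) {
  return new Promise(function (resolve, reject) {
    var s = document.createElement('script');
    s.src = src;
    s.onload = resolve;
    s.onerror = reject;
    document.head.appendChild(s);
  });
}

// Only the reports page would then do something like:
// loadScript('reports.js').then(function () { /* init reports UI */ });
```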
For your sub-question, take a look at Firebug's "Net" tab. It will show you which resources it loads and from where, and how long it takes to process them.
It's better to pack the JavaScript for all pages into one file. The file will be cached and not downloaded again by the browser on subsequent requests. The reason is that making a web request is far more expensive for both your server and the client than it is for the client to parse the JavaScript file.
Browsers are so fast these days that you don't have to worry about the client loading some extra JavaScript that might not be used on a specific page.
To make your site fast, you should focus on keeping the number of requests to an absolute minimum.