
Site Performance and Download

I want to find an automated way to download an entire website page (not the entire site, just a single page) and all elements on the page, then sum the size of those files.

When I say files, I mean the total size of the HTML, CSS, images, local and remote JS files, and any CSS background images. Basically the entire page weight for a given page.

I thought about using cURL but wasn't sure how to make it fetch remote and local JS files, as well as images referenced in the CSS files.
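As a point of comparison, cURL on its own can report how many bytes a single fetch transferred via its `--write-out` variable `size_download`, but that covers only the one URL you give it, not the assets the page references. A minimal sketch (the URL is a placeholder, not one from the question):

```shell
# Fetch one URL, discard the body, and print how many bytes were
# downloaded. This measures only the HTML document itself -- none of
# the CSS, images, or scripts it pulls in -- which is why a
# page-requisites-aware tool like wget is the better fit here.
# The URL below is a placeholder.
curl -s -o /dev/null -w '%{size_download}\n' http://example.com/page.html
```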


Try wget:

  • make it download all the files the page requires (CSS, images, scripts) with -p or --page-requisites
  • limit retrieval to assets no further than 2 hops away (this should get local images and code) with -l 2 for --level=2; note that --level only takes effect during recursive retrieval, i.e. together with -r
  • and rewrite the downloaded files to link to your local copies instead of their original paths with -k for --convert-links:
    wget -p -l 2 -k http://full_url/to/page.html

By default -p will not fetch page requisites hosted on other domains; to also pull in remote JS and images from other hosts, add -H (--span-hosts).
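Once wget has saved everything, the page weight from the question is just the sum of the sizes of the files it wrote. A minimal sketch, assuming the download was directed into a directory with wget's -P option; `page_download` is a hypothetical directory name, and the pipeline assumes GNU find:

```shell
# Sum the sizes (in bytes) of every file wget saved under page_download,
# e.g. after a run like:
#   wget -p -l 2 -k -P page_download http://full_url/to/page.html
# find prints each file's size on its own line; awk adds them up.
find page_download -type f -printf '%s\n' | awk '{s += $1} END {print s + 0}'
```

The `+ 0` in the awk END block just forces a numeric `0` to be printed even if the directory is empty.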