
Saving website content as HTML for offline use with working links

I'm having a site done in WordPress with some 10 pages. Every time I update the content, I need to save the whole site for offline use with the links still working. Right now I save each page by hand and then fix up the links, but I want to do this programmatically so I don't have to do it manually. Are there any tools or classes that already do this?

UPDATE: I'm downloading the site to create a kind of offline documentation that will be distributed on CDs, so the links should be relative.


Use wget, check the -k option

  -k,  --convert-links      make links in downloaded HTML point to local files.

Reference: http://www.linuxask.com/questions/mirror-a-web-site-using-wget
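For example, a mirroring command along the following lines should produce a browsable local copy with links rewritten for offline viewing (http://example.com/ and the ./offline-copy directory are placeholders for your own site URL and output folder):

  wget --mirror --convert-links --adjust-extension --page-requisites --no-parent -P ./offline-copy http://example.com/

Here --page-requisites also downloads the images and CSS the pages need, and --adjust-extension saves files with an .html extension so they open correctly straight from a CD.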


If you set the links to relative paths (e.g. href="/foo/bar.html" rather than a full absolute URL), this can be accomplished.

Otherwise, I suggest setting up Apache or IIS on your local machine as a test environment, and uploading to the web server when your code is ready for production.
