
Best Site Spider?

I am moving a bunch of sites to a new server, and to make sure I don't miss anything, I want to be able to give a program a list of sites and have it download every page/image on them. Is there any software that can do this? I may also use it to download a copy of some WordPress sites, so I can just upload static files (some of my WP sites never get updated, so it's hardly worth setting up new DBs etc.)


You'll probably get lots of opinions. Here is one: http://www.httrack.com/
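For what it's worth, a minimal sketch of an HTTrack run from the command line (example.com and the ./mirror output directory are placeholders):

    httrack "http://example.com/" -O ./mirror

The -O option sets the output path. There's also a GUI (WinHTTrack) on Windows if you'd rather not use the command line.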


wget is your tool.

On Unix/Linux systems, it may already be installed. For Windows, download it from http://gnuwin32.sourceforge.net/packages/wget.htm.

It is a command-line tool with a bunch of options for controlling how it crawls the target website. Run "wget --help" to list all available options.
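As a rough sketch of a mirroring run (example.com and sites.txt are placeholders, and the option set is one reasonable choice, not the only one):

    # Mirror one site, grabbing page requisites and rewriting links
    # so the saved copy works as static files:
    wget --mirror --convert-links --page-requisites --no-parent http://example.com/

    # Or feed it your list of sites, one URL per line:
    wget --mirror --convert-links --page-requisites --input-file=sites.txt

--mirror turns on recursive download with timestamping, --page-requisites pulls in the images/CSS/JS each page needs, --convert-links rewrites links for local viewing, and --no-parent keeps it from climbing above the starting directory. That combination fits your static-WordPress use case.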
