Best Site Spider?
I am moving a bunch of sites to a new server, and to ensure I don't miss anything, I want to be able to give a program a list of sites and have it download every page and image on them. Is there any software that can do this? I may also use it to download static copies of some WordPress sites, so I can just upload static files (some of my WP sites never get updated, so it's hardly worth setting up new databases etc.).
You'll probably get lots of opinions. Here is one: http://www.httrack.com/
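If you go the HTTrack route, it also has a command-line mode. A minimal sketch of an invocation (example.com and the ./mirror output directory are placeholders, not anything from your setup):

    httrack "http://example.com/" -O "./mirror" "+*.example.com/*"

The -O option sets the output directory, and the trailing "+" pattern is a filter that keeps the crawl on the target domain instead of following links off-site.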
wget is your tool.
On Unix/Linux systems it may already be installed. For Windows systems, download it from http://gnuwin32.sourceforge.net/packages/wget.htm.
It is a command-line tool with a bunch of options for controlling the way it crawls the target website. Use "wget --help" to list all available options.
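For the use case described above, a starting point might look like this (a sketch only; sites.txt is a hypothetical file containing one starting URL per line):

    wget --mirror --convert-links --page-requisites --no-parent --input-file=sites.txt

--mirror turns on recursive retrieval with timestamping, --convert-links rewrites links so the downloaded copy browses correctly offline (handy for the static WordPress copies), --page-requisites grabs the images, CSS, and JS needed to render each page, and --no-parent keeps the crawl from wandering above the starting directory. These are all standard wget options, but check "wget --help" for the exact behavior of your version.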