Multiple Website Backup
Does anyone know of a script or program that can be used for backing up multiple websites?
Ideally, I would like to have it set up on the server where the backups will be stored.
I would like to be able to add each website's login info, and have it connect and create a zip file (or similar) that is then sent back to the remote server to be saved as a backup, etc...
It would also need to be able to run as a cron job so it backs up at least every day.
I can find PC-to-server backup tools that are similar, but no server-to-server remote backup scripts, etc...
It would be heavily used, and it needs a GUI so less technical users can use it too.
Does anyone know of anything similar to what we need?
- HTTrack website mirroring utility.
- Wget and scripts
- RSync and FTP login (or SFTP for security)
- Git can be used for backup and has security features and networking ability.
- 7-Zip can be called from the command line to create a zip file (see the sketch after this list).
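Any of these can be scripted. As an illustration of the "create a zip file" step, here is a minimal Python sketch using only the standard library; `SITE_ROOT` and `BACKUP_DIR` are placeholder paths, not part of the question:

```python
# Minimal sketch: create a dated ZIP of a site's document root,
# roughly what calling 7-Zip from the command line would do.
# SITE_ROOT and BACKUP_DIR are assumed paths -- adjust for your layout.
import shutil
from datetime import date
from pathlib import Path

SITE_ROOT = Path("/var/www/example.com")   # hypothetical site directory
BACKUP_DIR = Path("/var/backups/sites")    # hypothetical output directory

def make_zip(site_root: Path, backup_dir: Path) -> Path:
    """Archive the whole site root into backup_dir/<name>-<date>.zip."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    stem = backup_dir / f"{site_root.name}-{date.today().isoformat()}"
    # shutil.make_archive appends the .zip extension itself
    return Path(shutil.make_archive(str(stem), "zip", root_dir=site_root))

if __name__ == "__main__":
    print(make_zip(SITE_ROOT, BACKUP_DIR))
```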
In any case, you will need to implement either secure FTP (SSH-secured) or a password-secured upload form. If you feel clever, you might use WebDAV.
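For the SSH-secured FTP (SFTP) option, a minimal sketch using the third-party paramiko library could look like the following; the host name, user, key path and file paths are placeholders:

```python
# Minimal sketch: push a generated ZIP to the backup server over SFTP.
# Requires the third-party paramiko library (pip install paramiko).
import paramiko

def upload_backup(local_path: str, remote_path: str) -> None:
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # or load known_hosts
    client.connect("backup.example.com",            # hypothetical backup server
                   username="backup",
                   key_filename="/home/backup/.ssh/id_rsa")
    try:
        sftp = client.open_sftp()
        sftp.put(local_path, remote_path)           # copy the ZIP across
        sftp.close()
    finally:
        client.close()

upload_backup("/var/backups/sites/site.zip", "/srv/backups/site.zip")
```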
Here's what I would do:
- Put a backup generator script on each website (outputting a ZIP)
- Protect its access with a .htpasswd file
- On the backup server, have a cron script download all the backups and store them (a sketch follows below)
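A minimal sketch of that cron script, assuming each site serves its generated ZIP at a fixed, .htpasswd-protected URL; all site names, URLs, credentials and the storage path below are placeholders:

```python
# Minimal sketch of the backup-server side: fetch each site's ZIP over
# HTTP Basic auth (the .htpasswd protection above) and store it by date.
import urllib.request
from datetime import date
from pathlib import Path

SITES = {
    # name: (backup URL, user, password) -- all hypothetical
    "example.com": ("https://example.com/backup.zip", "backup", "secret"),
    "example.org": ("https://example.org/backup.zip", "backup", "secret"),
}
STORE = Path("/srv/backups")   # assumed storage root on the backup server

def fetch(name: str, url: str, user: str, password: str) -> None:
    mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
    mgr.add_password(None, url, user, password)
    opener = urllib.request.build_opener(urllib.request.HTTPBasicAuthHandler(mgr))
    dest = STORE / name / f"{date.today().isoformat()}.zip"
    dest.parent.mkdir(parents=True, exist_ok=True)
    with opener.open(url) as resp, open(dest, "wb") as out:
        out.write(resp.read())

if __name__ == "__main__":
    for name, (url, user, password) in SITES.items():
        fetch(name, url, user, password)
```

You would then schedule it daily from the backup server's crontab, e.g. with an entry like `0 3 * * * python3 /path/to/pull_backups.py` (path assumed).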