
gzip: stdout: File too large when running customized backup script

I've created a plain and simple backup script that only backs up certain files and folders.

tar -zcf $DIRECTORY/var.www.tar.gz /var/www
tar -zcf $DIRECTORY/development.tar.gz /development
tar -zcf $DIRECTORY/home.tar.gz /home

Now this script runs for about 30 minutes and then gives me the following error:

gzip: stdout: File too large

Are there any other solutions I can use to back up my files with shell scripting, or a way to solve this error? I'm grateful for any help.


File too large is an error message from your libc: the output has exceeded the file size limit of your filesystem.

So this is not a gzip issue.

Options: use another filesystem, or use split:

tar czf - www | split -b 1073741824 - www-backup.tar.

creates the backup in 1 GiB pieces (www-backup.tar.aa, www-backup.tar.ab, and so on).

Restore it from multiple parts:

cat www-backup.tar.* | gunzip -c | tar xvf -
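
Adapted to the paths from the question, a minimal sketch (assuming GNU split; 1073741824 bytes is 1 GiB, so each piece stays well under common filesystem limits):

tar -zcf - /var/www | split -b 1073741824 - $DIRECTORY/var.www.tar.gz.
tar -zcf - /development | split -b 1073741824 - $DIRECTORY/development.tar.gz.
tar -zcf - /home | split -b 1073741824 - $DIRECTORY/home.tar.gz.

Each command writes pieces such as var.www.tar.gz.aa, var.www.tar.gz.ab, and so on; restore with cat $DIRECTORY/var.www.tar.gz.* | tar -zxf -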


Can the file system you are backing up to support large files?

Specifically, FAT32 has a limit of ~4 GB on a single file, and other filesystems have similar limits.

If your backup runs for 30 minutes, the file could easily reach that sort of size.
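
A quick way to check (assuming GNU df; $DIRECTORY is the backup target from the question):

df -T $DIRECTORY

The second column shows the filesystem type; vfat there would mean FAT32 and its ~4 GB per-file ceiling.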


Use a different compression utility, such as compress or bzip2.
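
For example, with GNU tar the -j flag runs the archive through bzip2 (and -Z through compress); note this only helps if the better compression ratio keeps the archive under the filesystem's size limit:

tar -jcf $DIRECTORY/var.www.tar.bz2 /var/www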
