
Alternative to s3cmd for syncing between local folder and Amazon s3 with encryption capability [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.

We don’t allow questions seeking recommendations for books, tools, software libraries, and more. You can edit the question so it can be answered with facts and citations.

Closed 4 years ago.


Currently I am running something like this on Ubuntu to sync my photos to Amazon S3 as an archive:

cd /media/PHOTO/Pictures
s3cmd sync --dry-run --delete-removed ./ s3://bob.photos/Pictures/ > ~/Desktop/output.txt

Since my photos are not very sensitive data, I have left the encryption issue aside. But since Gmail has had issues with wiping out all of a user's email, I am thinking of using S3 as an alternative backup for Gmail as well.

I am using getmail to retrieve mail from Gmail, and I planned to upload that to S3 too, but the encryption issue arises again; this time I need to encrypt everything.

I have read the s3cmd manual, and it says that with encryption I would need to re-upload every file each time, which I think would be a waste of money.

Can anyone suggest an alternative to s3cmd with encryption capability, preferably on the fly (i.e. it encrypts on upload and decrypts on download by itself)? Command-line operation is fine with me. If it works out, I will use it for my photos as well.

Thanks in advance!


Try Rclone. It allows you to do rsync-like operations. It can do multipart uploads and can upload in parallel. It also has FUSE capabilities. I use it for my website.

Create a bucket:

 rclone mkdir CONFIGURATIONFORCLOUD:my-bucket-in-the-cloud 

Dry run:

rclone copy --dry-run --verbose --transfers=2 --checkers=1 /LOCALDIRECTORYFORFILETOBEUPLOADED CONFIGURATIONFORCLOUD:my-bucket-in-the-cloud --log-file /var/log/rclonelog/rclone.log

Start a sync between the local folder and the remote bucket:

rclone sync /LOCALDIRECTORYFORFILETOBEUPLOADED CONFIGURATIONFORCLOUD:my-bucket-in-the-cloud  --retries 4 --verbose --transfers=4 --log-file /var/log/rclonelog/rclone.log

As you can see, you can log to a file, choose the number of concurrent transfers, and do retries.

Rclone creates a config file per Linux user, so if you have several users with different bucket-level access, this can be used to match their permissions.
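
Rclone can also cover the encryption requirement in the question: it has a crypt backend that wraps another remote and encrypts file contents (and optionally file names) on upload, decrypting them on download. A minimal sketch, assuming an S3 remote already configured as CONFIGURATIONFORCLOUD and a new crypt remote that you name "secret" during rclone config (both names are placeholders):

# Run the interactive config, add a remote of type "crypt",
# and point it at CONFIGURATIONFORCLOUD:my-bucket-in-the-cloud/mail
rclone config

# Upload: rclone encrypts on the fly before anything reaches S3
rclone sync /LOCALDIRECTORYFORFILETOBEUPLOADED secret: --verbose

# Download: rclone decrypts on the fly
rclone copy secret: /LOCALDIRECTORYFORFILETOBEUPLOADED --verbose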


Just off-hand I would say the solution is to download your emails, run gpg on them, save the output, and then upload to S3 via s3cmd sync. As long as you only download new email and gpg those messages, you won't be uploading files repeatedly. Basically something like this:

getmail
gpg -e 
s3cmd sync

I'm obviously just prototyping here (I've never tried to do what you want), but the idea should work.
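
A rough sketch of that pipeline as a shell script; the maildir path, GPG recipient, and bucket prefix are assumptions for illustration, not from the original post:

#!/bin/sh
# Fetch new mail into the local maildir
getmail

# Encrypt any message that does not already have an encrypted copy
for f in ~/Mail/gmail/new/*; do
    [ -f "$f" ] || continue                                   # skip if the glob matched nothing
    [ -e "$f.gpg" ] || gpg -e -r you@example.com -o "$f.gpg" "$f"
done

# Upload only the encrypted copies; files already in the bucket are skipped
s3cmd sync --exclude '*' --include '*.gpg' ~/Mail/gmail/new/ s3://bob.photos/mail/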


Try s4cmd. It is the best and fastest alternative to s3cmd. I can upload about 30 GB overnight.

http://stackoverflow.com/questions/39519397/issues-with-s4cmd
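
s4cmd's interface is close to s3cmd's, so a sync looks much the same; here is a quick sketch using the paths from the question (the thread count is an assumption, and note that s4cmd does not add client-side encryption on its own):

# Parallel sync of the local photo folder to the bucket
s4cmd sync --num-threads=16 /media/PHOTO/Pictures s3://bob.photos/Pictures/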

