
How to scp to Amazon S3?

I need to send backup files of ~2TB to S3. I guess the most hassle-free option would be the Linux scp command (I've had difficulty with s3cmd and don't want overkill like a Java/RoR app to do this).

However, I am not sure whether this is even possible: how would I use S3's access and secret keys with scp, and what would my destination IP/URL/path be?

I appreciate your hints.


As of 2015, SCP/SSH is not supported (and probably never will be for the reasons mentioned in the other answers).

Official AWS tools for copying files to/from S3

  1. command line tool (pip3 install awscli) - note that credentials need to be specified; I prefer to do so via environment variables rather than a file: AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY (see the sketch after this list).

    aws s3 cp /tmp/foo/ s3://bucket/ --recursive --exclude "*" --include "*.jpg"
    
    • http://docs.aws.amazon.com/cli/latest/reference/s3/index.html

    and an rsync-like command:

    aws s3 sync . s3://mybucket
    
    • http://docs.aws.amazon.com/cli/latest/reference/s3/sync.html
  2. Web interface:

    • https://console.aws.amazon.com/s3/home?region=us-east-1
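
A minimal sketch of the environment-variable credential setup mentioned in step 1 (the key values below are placeholders, not real credentials):

    export AWS_ACCESS_KEY_ID=AKIAEXAMPLEKEY        # placeholder
    export AWS_SECRET_ACCESS_KEY=examplesecretkey  # placeholder
    aws s3 cp /tmp/foo/ s3://bucket/ --recursive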

Non-AWS methods

Any other solutions depend on third-party executables (e.g. botosync, jungledisk...) which can be great as long as they are supported. But third-party tools come and go over the years, and your scripts will have a shorter shelf life.

  • https://github.com/ncw/rclone
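
For example, a hedged sketch with rclone, assuming a remote named s3remote has already been configured via rclone config (the bucket and paths are placeholders):

    # one-way sync of a local directory into an S3 bucket
    rclone sync /local/path s3remote:your-bucket/remote/path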

EDIT: Actually, AWS CLI is based on botocore:

https://github.com/boto/botocore

So botosync deserves a bit more respect as an elder statesman than I perhaps gave it.


Here's just the thing for this: boto-rsync. From any Linux box, install boto-rsync and then use it to transfer /local/path/ to your_bucket/remote/path/:

boto-rsync -a your_access_key -s your_secret_key /local/path/ s3://your_bucket/remote/path/

The paths can also be files.

For an S3-compatible provider other than AWS, use --endpoint:

boto-rsync -a your_access_key -s your_secret_key --endpoint some.provider.com /local/path/ s3://your_bucket/remote/path/


You can't SCP.

The quickest way, if you don't mind spending money, is probably just to send it to them on a disk and they'll put it up there for you. See their Import/Export service.


Here you go. This streams the file from the remote host straight into S3 without touching the local disk (it relies on bash process substitution):

scp USER@REMOTE_IP:/FILE_PATH >(aws s3 cp - s3://BUCKET/SAVE_FILE_AS_THIS_NAME)
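
A roughly equivalent form using a plain pipe, which avoids the bash-specific process substitution (USER, REMOTE_IP, the paths, and the bucket name are placeholders):

    # cat the remote file over ssh and stream stdin into S3
    ssh USER@REMOTE_IP "cat /FILE_PATH" | aws s3 cp - s3://BUCKET/SAVE_FILE_AS_THIS_NAME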


Why don't you scp it to an EBS volume and then use s3cmd from there? As long as your EBS volume and S3 bucket are in the same region, you'll only pay the inbound data charge once (from your network to the EBS volume).

I've found that once you're inside AWS's network, s3cmd is much more reliable and the data transfer rate is far higher than uploading directly to S3.
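
A rough sketch of that two-hop approach (the key file, instance address, and paths are all hypothetical):

    # hop 1: scp the backup onto an EC2 instance with an attached EBS volume
    scp -i mykey.pem /backups/dump.tar ec2-user@my-instance.example.com:/mnt/ebs/
    # hop 2: push from the instance to S3 with s3cmd
    ssh -i mykey.pem ec2-user@my-instance.example.com \
        "s3cmd put /mnt/ebs/dump.tar s3://your-bucket/backups/"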


There is an amazing tool called Dragon Disk. It even works as a sync tool, not just as a plain copy client.

http://www.s3-client.com/

The guide to setting up Amazon S3 is provided there, and after setting it up you can either copy files from your local machine to S3 or set up an automatic sync. The user interface is very similar to WinSCP or FileZilla.


For our AWS backups we use a combination of duplicity and trickle: duplicity for rsync-style incremental backups plus encryption, and trickle to limit the upload speed.
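
For instance, a hedged one-liner along those lines (the bucket name and rate cap are placeholders; duplicity's boto backend reads AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY from the environment):

    # cap the upload at ~1000 KB/s and push an encrypted incremental backup to S3
    # (duplicity will prompt for, or read, a GPG passphrase for encryption)
    trickle -u 1000 duplicity /data/backups s3+http://your-bucket/backups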
