
Migrate large data set to S3

I am trying to rename and migrate 2 TB (tens of millions) of images from a local server onto Amazon S3. I wrote a PHP script that sends them using the AWS library (running on the same local machine), but I don't have the six months it wants to finish. I'm looking for ideas on doing this faster.

I have two ideas, but I'm not sure either would be any better:

1. Use a mounted S3 solution (s3fs) that parallelizes the upload (does it?).
2. Pull the images onto an EC2 instance and send them to S3 from there. I could pull them with SSH/FTP or HTTP, but I'd probably still need to parallelize the uploads manually.

Any help would be appreciated.


Another possibility is to ship Amazon a hard drive with your data; the AWS Import/Export service loads it into S3 for you, which sidesteps your network bottleneck entirely.


Split your list of 10M+ images into subsets and upload each subset to S3 in parallel. The bottleneck is almost certainly per-request latency, not bandwidth, so many concurrent uploads will finish far faster than one sequential stream.
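
If it helps, here's a minimal sketch of that approach in Python using the boto3 SDK (the bucket name, source directory, worker count, and rename rule below are all placeholder assumptions you'd replace with your own):

```python
import concurrent.futures
import os

import boto3  # assumes boto3 is installed and AWS credentials are configured

BUCKET = "my-image-bucket"   # hypothetical destination bucket
SOURCE_DIR = "/data/images"  # hypothetical local image directory
WORKERS = 32                 # tune to saturate your uplink

# A single shared low-level client; boto3 clients are thread-safe.
s3 = boto3.client("s3")

def new_key(path):
    """Hypothetical rename rule: derive the S3 key from the local filename."""
    return "images/" + os.path.basename(path)

def upload(path):
    # Uploads one file, applying the rename as the destination key.
    s3.upload_file(path, BUCKET, new_key(path))
    return path

def iter_files(root):
    # Walk the source tree and yield every file path.
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            yield os.path.join(dirpath, name)

def main():
    # Thread pool keeps WORKERS uploads in flight at once.
    with concurrent.futures.ThreadPoolExecutor(max_workers=WORKERS) as pool:
        for done in pool.map(upload, iter_files(SOURCE_DIR)):
            print("uploaded", done)

if __name__ == "__main__":
    main()
```

For tens of millions of files you'd want to take this further: pre-split the file list into the subsets described above and run one process (or one machine) per subset, so a crash only forces a re-run of that slice.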
