Upload folder with subfolders using S3 and the AWS console
When I try to upload a folder with subfolders to S3 through the AWS console, only the files are uploaded, not the subfolders.
You also can't select a folder. It always requires opening the folder first before you can select anything.
Is this even possible?
I suggest you use the AWS CLI, as it is very easy from the command line with awscli:
aws s3 cp SOURCE_DIR s3://DEST_BUCKET/ --recursive
or you can use sync:
aws s3 sync SOURCE_DIR s3://DEST_BUCKET/
Remember that you have to install the AWS CLI and configure it with your Access Key ID and Secret Access Key:
pip install --upgrade --user awscli
aws configure
You don't need the Enhanced Uploader (which I believe no longer exists) or any third-party software (which always carries the risk that someone steals your private data or access keys from the S3 bucket, or even from all your AWS resources).
Since the new AWS S3 web upload manager supports drag-and-drop for files and folders, just log in to https://console.aws.amazon.com/s3/home and start the upload process as usual, then drag the folder from your desktop directly onto the S3 page.
The Amazon S3 console now supports uploading entire folder hierarchies. Enable the Enhanced Uploader in the Upload dialog and then add one or more folders to the upload queue.
http://console.aws.amazon.com/s3
Normally I use the Enhanced Uploader available via the AWS management console. However, since that requires Java it can cause problems. I found s3cmd to be a great command-line replacement. Here's how I used it:
s3cmd --configure # enter access keys, enable HTTPS, etc.
s3cmd sync <path-to-folder> s3://<path-to-s3-bucket>/
Execute something similar to the following command:
aws s3 cp local_folder_name s3://s3_bucket_name/local_folder_name/ --recursive
I had trouble finding the Enhanced Uploader tool for uploading a folder and its subfolders to S3. But rather than finding a tool, I was able to upload the folders along with the subfolders inside them by simply dragging and dropping them into the S3 bucket.
Note: This drag and drop feature doesn't work in Safari. I've tested it in Chrome and it works just fine.
After you drag and drop the files and folders, this screen finally opens up to upload the content.
Solution 1:
var AWS = require('aws-sdk');
var path = require('path');
var fs = require('fs');

const uploadDir = function (s3Path, bucketName) {
    let s3 = new AWS.S3({
        accessKeyId: process.env.S3_ACCESS_KEY,
        secretAccessKey: process.env.S3_SECRET_KEY
    });

    // Recursively walk the directory tree, invoking callback for each file
    function walkSync(currentDirPath, callback) {
        fs.readdirSync(currentDirPath).forEach(function (name) {
            var filePath = path.join(currentDirPath, name);
            var stat = fs.statSync(filePath);
            if (stat.isFile()) {
                callback(filePath, stat);
            } else if (stat.isDirectory()) {
                walkSync(filePath, callback);
            }
        });
    }

    walkSync(s3Path, function (filePath, stat) {
        // Key is the path relative to s3Path; use forward slashes so
        // subfolders become S3 key prefixes on any OS
        let bucketPath = filePath.substring(s3Path.length + 1).split(path.sep).join('/');
        let params = { Bucket: bucketName, Key: bucketPath, Body: fs.readFileSync(filePath) };
        s3.putObject(params, function (err, data) {
            if (err) {
                console.log(err);
            } else {
                console.log('Successfully uploaded ' + bucketPath + ' to ' + bucketName);
            }
        });
    });
};

uploadDir("path to your folder", "your bucket name");
uploadDir("path to your folder", "your bucket name");
Solution 2:
aws s3 cp SOURCE_DIR s3://DEST_BUCKET/ --recursive
It's worth mentioning that if you are simply using S3 for backups, you should just zip the folder and then upload that. This will save you upload time and costs.
If you are not sure how to do efficient zipping from the terminal, have a look here for OS X:

zip -r archive_name.zip folder_to_compress

For Windows, a client such as 7-Zip would be sufficient.
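For reference, the zip-then-upload approach can be sketched in Python. This is only an illustration: the bucket name is a placeholder, boto3 is assumed to be installed with credentials configured, and it is imported lazily so the zip step runs on its own.

```python
import os
import shutil


def zip_folder(folder_path, archive_base):
    """Create archive_base.zip containing the whole folder tree,
    subfolders included, and return the path to the archive."""
    return shutil.make_archive(archive_base, "zip", folder_path)


def upload_archive(zip_path, bucket_name):
    """Upload the single zip file instead of thousands of objects.
    Assumes boto3 is installed and AWS credentials are configured."""
    import boto3  # lazy import: only the upload step needs AWS set up
    s3 = boto3.client("s3")
    s3.upload_file(zip_path, bucket_name, os.path.basename(zip_path))
```

Uploading one archive avoids per-object request charges, but you lose the ability to fetch individual files without downloading the whole archive.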
I do not see Python answers here. You can script folder upload using Python/boto3. Here's how to recursively get all file names from a directory tree:

import os

def recursive_glob(treeroot, extension):
    results = [os.path.join(dirpath, f)
               for dirpath, dirnames, files in os.walk(treeroot)
               for f in files if f.endswith(extension)]
    return results
Here's how to upload a file to S3 using Python/boto:

from boto.s3.key import Key

k = Key(bucket)
k.key = s3_key_name
k.set_contents_from_file(file_handle, cb=progress, num_cb=20, reduced_redundancy=use_rr)
I used these ideas to write Directory-Uploader-For-S3
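Since the answer above mentions boto3 but shows the older boto API, here is a rough boto3 sketch of the same idea. The bucket name and prefix are placeholders, and boto3 is imported lazily inside the upload function so the directory walk runs without AWS credentials.

```python
import os


def iter_files(root):
    """Yield (absolute_path, s3_key) pairs for every file under root.
    Keys are relative to root and use forward slashes, so subfolders
    are preserved as S3 key prefixes on any OS."""
    root = os.path.abspath(root)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            yield full, os.path.relpath(full, root).replace(os.sep, "/")


def upload_dir(root, bucket_name, prefix=""):
    """Upload every file under root to s3://bucket_name/prefix...
    Assumes boto3 is installed and credentials are configured."""
    import boto3  # lazy import: only the upload itself needs AWS set up
    s3 = boto3.client("s3")
    for full, key in iter_files(root):
        s3.upload_file(full, bucket_name, prefix + key)
```

For large trees, the CLI's `aws s3 sync` is still simpler; a script like this is mainly useful when the upload is part of a bigger program.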
I ended up here when trying to figure this out. With the version that's up there right now you can drag and drop a folder into it and it works, even though it doesn't allow you to select a folder when you open the upload dialog.
You can drag and drop those folders. Drag-and-drop functionality is supported only for the Chrome and Firefox browsers. Please refer to this link: https://docs.aws.amazon.com/AmazonS3/latest/user-guide/upload-objects.html
You can use the Transfer Manager to upload multiple files, directories, etc. More info:
https://docs.aws.amazon.com/sdk-for-java/v1/developer-guide/examples-s3-transfermanager.html
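For Python users, boto3 exposes a comparable managed transfer layer to the Java TransferManager linked above. This is a hedged sketch, not the Java API: the function and parameter names here are my own, and it assumes boto3 is installed with credentials configured.

```python
def tuned_upload(local_path, bucket_name, key, threshold_mb=8, workers=10):
    """Upload one file using boto3's managed transfer layer, which
    handles multipart uploads and concurrency much like the Java
    SDK's TransferManager. Requires boto3 and AWS credentials."""
    import boto3  # lazy import: nothing runs until you call this
    from boto3.s3.transfer import TransferConfig
    config = TransferConfig(
        multipart_threshold=threshold_mb * 1024 * 1024,  # multipart above this size
        max_concurrency=workers,                         # parallel part uploads
    )
    boto3.client("s3").upload_file(local_path, bucket_name, key, Config=config)
```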
You can upload files by dragging and dropping or by pointing and clicking. To upload folders, you must drag and drop them. Drag-and-drop functionality is supported only for the Chrome and Firefox browsers.
Custom endpoint
If you have a custom endpoint implemented by your IT department, try this:
aws s3 cp <local-dir> s3://bucket-name/<destination-folder>/ --recursive --endpoint-url https://<s3-custom-endpoint.lan>
Drag and drop is only usable for a relatively small set of files. If you need to upload thousands of them in one go, then the CLI is the way to go. I managed to upload 2,000,00+ files using 1 command...