
How can I get the total size of an Amazon S3 bucket using Node.js?

I'm working on a few Node.js projects where an Amazon S3 bucket will be set up for each user who creates an account. I want to limit the total size of what a user can use.

I've been looking at the Knox client (https://github.com/learnboost/knox), which should make working with Amazon S3 easy when developing Node.js applications.

But after much research, I can't seem to find an efficient way of getting back a bucket's total file size, which I could then use to enforce per-account limits and so on.

I'm hoping this is the right approach in the first place, but maybe not? Conceptually, I want to store each user's uploaded media files on Amazon S3 and limit how much the user can upload and use in total.

Many thanks, James


There is no exposed API to get the size of a bucket. The only way to do it is to get all the keys, iterate through them, and sum up the size of all objects in the bucket.
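
For reference, here is a minimal sketch of that key-listing loop using the current AWS SDK for JavaScript (v3) rather than Knox; the bucket name at the bottom is just a placeholder:

    // Minimal sketch: page through every key in the bucket and sum the sizes.
    const { S3Client, ListObjectsV2Command } = require("@aws-sdk/client-s3");

    async function bucketSize(bucket) {
      const client = new S3Client({});
      let totalBytes = 0;
      let totalObjects = 0;
      let continuationToken;

      do {
        // Each page returns at most 1000 keys; keep going until
        // there is no NextContinuationToken.
        const response = await client.send(
          new ListObjectsV2Command({
            Bucket: bucket,
            ContinuationToken: continuationToken,
          })
        );
        for (const object of response.Contents ?? []) {
          totalBytes += object.Size;
          totalObjects += 1;
        }
        continuationToken = response.NextContinuationToken;
      } while (continuationToken);

      return { totalBytes, totalObjects };
    }

    bucketSize("my-bucket-name").then(({ totalBytes, totalObjects }) =>
      console.log(`${totalObjects} objects, ${totalBytes} bytes`)
    );

Note that this still has to page through every key, so the run time grows with the number of objects in the bucket.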


As Elf stated, there is no direct way to get the total size of a bucket from any of the S3 operations. The best you can do is loop through all of the items in the bucket and sum their respective sizes.

Here's a complete example of a program which lists all of the objects in a bucket and prints out a summary of file count and byte count at the end:

  • https://github.com/appsattic/node-awssum-scripts/blob/master/bin/amazon-s3-list.js

Feel free to copy it and change it to your requirements.


I know you asked about Node.js, but for those who just want ANY way to do this, check out the s3cmd tool: http://s3tools.org/s3cmd

Once you have that set up, you can run s3cmd du s3://bucket-name

Try it on a small bucket first to be sure it is working. This command still loops through everything, so big bucket = big time.


I got to this page because I was looking for the same solution. The AWS CLI has a way of doing this: aws s3api list-objects --bucket $bucket --output json --query "[sum(Contents[].Size)]"

I wrote a super simple wrapper that converts the bytes to KB, MB, or GB. I am not the most elegant coder on the planet, but it works by running: s3du my-bucket-name g (g for GB, k for KB, m for MB)

The larger the bucket, the longer this will take, but it works: https://github.com/defenestratexp/s3du.git

Obviously, you have to have the AWS CLI properly installed for this method to work. Cheers :D
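
If you would rather stay in Node.js than use a shell script, here is a rough sketch of the same wrapper idea (this is not the s3du script linked above; it just shells out to the same aws s3api command and converts the result, and the bucket name is a placeholder):

    // Rough sketch: run the AWS CLI query shown above, then convert
    // the byte total to the requested unit (k, m, or g).
    const { execFileSync } = require("child_process");

    function s3du(bucket, unit = "g") {
      const out = execFileSync("aws", [
        "s3api", "list-objects",
        "--bucket", bucket,
        "--output", "json",
        "--query", "[sum(Contents[].Size)]",
      ]);
      // The query returns a one-element JSON array, e.g. [12345].
      // Assumes a non-empty bucket, as in the original command.
      const bytes = JSON.parse(out)[0];
      const divisor = { k: 1024, m: 1024 ** 2, g: 1024 ** 3 }[unit];
      return (bytes / divisor).toFixed(2) + " " + unit.toUpperCase() + "B";
    }

    console.log(s3du("my-bucket-name", "g"));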
