Large file upload to Amazon S3 failing after the 30-second limit set by Heroku
I store my uploaded files on Amazon S3 with the following command:
AWS::S3::S3Object.store(
  params[:uploadfile].original_filename,
  open(params[:uploadfile]),
  'mybucket',
  :access       => :private,
  :content_type => params[:uploadfile].content_type
)
I can upload files up to 30 MB without a problem. I have read in other posts that this could be because the file is being loaded into memory (I'm confused about this). The largest file I am going to upload is 40 MB; how can I achieve this without the upload failing?
My Chrome browser returns the following error:
Error 101 (net::ERR_CONNECTION_RESET): The connection was reset.
When I upload from my development machine (localhost), I can upload large files of 80-100 MB, but it does not work from Heroku. I don't understand why, because I am uploading the files directly to S3.
Strangely, my uploads fail after 30 seconds, which is the request timeout limit that Heroku sets, yet I do not receive any timeout or failed-upload error in the Heroku logs.
Thank you for your help
After many months on this issue, I found a gem that works well by uploading directly to Amazon S3, without any complex Flash or JavaScript stuff. It also integrates with CarrierWave. The gem is called carrierwave_direct.
It works without a problem. However, if you are using Rails 3.0.x, check out this page for a solution. If you are using Rails 3.1.x, you are all set to go.
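For reference, a minimal carrierwave_direct setup looks something like the sketch below; the bucket name, environment variable names, and uploader class are my assumptions, not values from the original post:

# Gemfile
gem 'carrierwave_direct'

# config/initializers/carrierwave.rb
# Credentials are read from the environment here; adjust to your setup.
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => ENV['AWS_ACCESS_KEY_ID'],
    :aws_secret_access_key => ENV['AWS_SECRET_ACCESS_KEY']
  }
  config.fog_directory = 'mybucket' # hypothetical bucket name
end

# app/uploaders/file_uploader.rb
class FileUploader < CarrierWave::Uploader::Base
  include CarrierWaveDirect::Uploader
end

In the view, the gem's direct_upload_form_for helper builds a form that POSTs the file straight to S3, so the file bytes never pass through the Heroku dyno and the 30-second router timeout no longer applies.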
It appears that you're not actually uploading directly to S3, but rather uploading to Heroku, which is then uploading to S3.
You should use something like https://github.com/GreenAsJade/s3-swf-upload-plugin to help you implement true direct-to-S3 uploading (http://docs.amazonwebservices.com/AmazonS3/latest/dev/index.html?UsingHTTPPOST.html).
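The core of browser-based POST uploads is a policy document signed with your AWS secret key on the server; the browser then submits the file together with that policy straight to S3. A rough Ruby sketch of the signing step, assuming a bucket named 'mybucket' and a credential in an environment variable (both assumptions on my part):

require 'base64'
require 'openssl'
require 'json'
require 'time'

bucket     = 'mybucket'                   # hypothetical bucket
secret_key = ENV['AWS_SECRET_ACCESS_KEY'] # assumed env var

# Policy: what the browser is allowed to POST, and for how long.
policy_json = {
  'expiration' => (Time.now.utc + 3600).iso8601,
  'conditions' => [
    { 'bucket' => bucket },
    ['starts-with', '$key', 'uploads/'],
    { 'acl' => 'private' },
    ['content-length-range', 0, 50 * 1024 * 1024] # allow up to 50 MB
  ]
}.to_json

policy    = Base64.strict_encode64(policy_json)
signature = Base64.strict_encode64(
  OpenSSL::HMAC.digest(OpenSSL::Digest.new('sha1'), secret_key, policy)
)

# Embed policy and signature as hidden fields in a form whose action is
# https://mybucket.s3.amazonaws.com/ -- the upload then goes from the
# browser to S3 directly, bypassing the Heroku dyno entirely.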