
Ruby: Streaming large AWS S3 object freezes

I am using the ruby aws/s3 library to retrieve files from Amazon S3. I stream an object and write it to a file as per the documentation (with debug output every 100 chunks to confirm progress).

This works for small files, but randomly freezes while downloading large (150 MB) files on an Ubuntu VPS. Fetching the same files from my Mac on a much slower connection works just fine.

When it hangs, no error is thrown, and the last line of debug output is 'Finished chunk'. I've seen it write anywhere between 100 and 10,000 chunks before freezing.

Anyone come across this or have ideas on what the cause might be?

Thanks

The code that hangs:

  i = 1
  open(local_file, 'w') do |f|
    AWS::S3::S3Object.value(key, @s3_bucket) do |chunk|
      puts "Writing chunk #{i}"
      f.write chunk.read_body    # append the current chunk to the local file
      puts "Finished chunk #{i}"
      i += 1
    end
  end
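
For reference, the same gem also documents a dedicated streaming call, S3Object.stream, which yields raw String chunks directly (no read_body). I don't know whether it avoids the hang, but it removes one layer from the code path. A minimal sketch, reusing key, @s3_bucket, and local_file from above:

  open(local_file, 'wb') do |f|                            # 'wb' to be safe with binary content
    AWS::S3::S3Object.stream(key, @s3_bucket) do |chunk|
      f.write(chunk)                                       # chunk is a plain String here
    end
  end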


I have similar code pulling S3 objects and writing them to local files, and I have found that something in Ruby is leaking memory: watching top in another window, the resident size just goes up and up. At some point the process freezes, seems to hang for a minute or more, and is then killed by the Linux OOM killer. Check the dmesg output to see if your process is being killed by the OOM killer. You may see a line there like:

  Out of memory: Killed process 12345 (ruby).

I have not been able to determine why this leaks memory. My code isn't exactly like yours, but it is very similar.
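
If you want to watch the growth from inside the process rather than from top, you can log the resident set size as you go. A rough diagnostic sketch (assumes a Unix-like system with ps available, and the same variables as the question):

  # Resident set size of the current process, in KB (works on Linux and macOS).
  def rss_kb
    `ps -o rss= -p #{Process.pid}`.to_i
  end

  i = 1
  open(local_file, 'wb') do |f|
    AWS::S3::S3Object.value(key, @s3_bucket) do |chunk|
      f.write chunk.read_body
      puts "chunk #{i}: rss=#{rss_kb} KB" if i % 100 == 0   # sample every 100 chunks
      i += 1
    end
  end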


Try using the right_aws gem instead. It retries failed requests automatically.

  s3 = RightAws::S3Interface.new(@access_key_id, @secret_access_key)
  open(local_file, 'wb') do |f|
    s3.get(@my_bucket, file_path) do |chunk|   # chunks arrive as plain Strings
      f.write(chunk)
    end
  end
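
If switching gems isn't an option, you can approximate the retry behavior with a plain-Ruby wrapper around the whole download. Since the hang doesn't raise anything on its own, a hard per-attempt timeout turns it into a retryable error. A minimal sketch (the limits are hypothetical, and each retry restarts the file from scratch rather than resuming):

  require 'timeout'

  MAX_ATTEMPTS  = 3     # hypothetical retry limit
  ATTEMPT_LIMIT = 600   # hypothetical: seconds per attempt before we assume it has stalled

  attempts = 0
  begin
    Timeout.timeout(ATTEMPT_LIMIT) do
      open(local_file, 'wb') do |f|
        AWS::S3::S3Object.stream(key, @s3_bucket) { |chunk| f.write(chunk) }
      end
    end
  rescue Timeout::Error, StandardError => e
    attempts += 1
    puts "Attempt #{attempts} failed: #{e.class}"
    retry if attempts < MAX_ATTEMPTS
    raise
  end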