
Upload large files to BLOB

I'm saving big files (~200 MB) directly into the db, and I'm having an issue with that.

The problem is a huge spike in memory use (about 3 GB of RAM and 3 GB of swap) at the stage where the file is saved to the db:

@job.pdf = params[:job][:pdf].read

After this completes, some of that RAM and swap stays in use.

Is there some way to optimize that?

P.S. The project is on Rails 3.0.3, uses MySQL, and runs on Mongrel.


In MySQL, to be able to save or read BLOB fields larger than 1 MB, you have to increase the server-side parameter max_allowed_packet beyond its default. In practice, you can't go much further than 16-32 MB for this parameter. The price for this increase is that every new db client can consume at least that much memory, and in general server performance will suffer greatly.
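For example, a minimal sketch of raising the limit (32 MB is an illustrative value, not a recommendation; SET GLOBAL requires the SUPER privilege and only affects connections opened after it runs):

-- Raise the per-packet limit for new connections (value in bytes; 32 MB here).
SET GLOBAL max_allowed_packet = 33554432;

-- To make it survive a server restart, set it in my.cnf under [mysqld] instead:
-- max_allowed_packet = 32M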

In other words, MySQL does not really support handling BLOB fields larger than 1 MB (if you can't or don't want to fiddle with the server configuration) to around 16 MB (even if you do).

This can be a philosophical question - is it a good idea or not to keep big blobs in the database? I think for many tasks (but not all) it is a great idea, and because MySQL is so bad at this (and for a host of other reasons), I simply avoid using it as my SQL server solution.

Instead, I use PostgreSQL, which supports BLOBs (actually, BYTEA) up to its advertised limit of 1 GB without any tweaks on the client or server. In addition to that, it will transparently compress them with an LZ-family algorithm (pglz) - slightly worse than gzip, but still much better than no compression at all.
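A sketch of what that looks like on the PostgreSQL side (the jobs table and column names here are made up to match the question, not taken from it):

-- A bytea column needs no special setup: large values are TOASTed
-- automatically, i.e. compressed with pglz and stored out of line.
CREATE TABLE jobs (
    id  serial PRIMARY KEY,
    pdf bytea
);

In a Rails migration this is just a binary column (t.binary :pdf), which the PostgreSQL adapter maps to bytea.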
