
Large Batches of Images to Resize and Store

I'm using PHP and GraphicsMagick, and I have a bunch of users uploading batches of images.

These batches run from one image to hundreds, maybe even a thousand.

I need to store these original uploads on Amazon S3 and I also need to resize each image to three different sizes and also store those copies on Amazon S3.

This needs to be as close to real time as possible.

How would you architect this for best performance?


Pretty simple to achieve. When a user uploads an image, add a job to a message queue (LPUSH). By pushing the work onto a queue you don't make the user wait while it runs; the processing happens offline instead. I would go with Redis, because it is very powerful, fast, and easy to use. It's worth looking into anyway, since you can also use it for your caching needs and more.

Next, spawn a couple of worker processes that do nothing but consume messages from the queue (BLPOP), one by one. Each worker pops a message, does its job (resizing the image), and when done pops the next message. It is that simple and very fast, especially if you use a PHP extension written in C (a very fast language), such as phpredis.
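A minimal sketch of that producer/worker pattern, assuming the phpredis and Gmagick extensions are installed and a Redis server is running locally. The queue name `image_jobs`, the three target widths, and the `uploadToS3()` helper are illustrative assumptions, not part of any particular library:

```php
<?php
// --- Producer side (runs inside the upload request) ---
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

function enqueueImageJob(Redis $redis, string $localPath, string $s3Key): void
{
    // LPUSH returns immediately, so the uploading user never waits on resizing.
    $redis->lPush('image_jobs', json_encode([
        'path'  => $localPath,
        's3key' => $s3Key,
    ]));
}

// --- Worker side (a separate long-running CLI process; start several) ---
$sizes = [1024, 480, 160]; // the three target widths -- an assumption

while (true) {
    // BLPOP blocks until a job is available; returns [queueName, payload].
    [, $payload] = $redis->blPop(['image_jobs'], 0);
    $job = json_decode($payload, true);

    // uploadToS3() is a hypothetical helper wrapping your S3 client of choice.
    uploadToS3($job['path'], $job['s3key']); // store the original upload

    foreach ($sizes as $width) {
        $img = new Gmagick($job['path']);
        // Fit within $width x $width while preserving the aspect ratio.
        $img->resizeimage($width, $width, Gmagick::FILTER_LANCZOS, 1, true);
        $resized = $job['path'] . ".{$width}.jpg";
        $img->writeimage($resized);
        uploadToS3($resized, $job['s3key'] . "-{$width}");
        $img->destroy();
    }
}
```

Because BLPOP is atomic, you can run as many workers in parallel as your CPUs allow without two workers ever grabbing the same job.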

P.S.: a good introductory tutorial explaining Redis: http://simonwillison.net/static/2010/redis-tutorial/ (must read ;))
