
Amazon SQS to funnel database writes

Assume I am building Netflix and I want to log each view by the user ID and the movie ID.

The format would be: viewID, userID, timestamp.

However, in order to scale this, assume we are getting 1,000 views a second. Would it make sense to queue these views to SQS and then have our queue readers dequeue them one by one and write them to the MySQL database? This way the database is not overloaded with write requests.

Does this look like something that would work?
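The pattern being proposed can be sketched in-process. This is a minimal illustration only, using Python's `queue.Queue` as a stand-in for SQS and a plain list (with a lock) as a stand-in for the MySQL table; all names here (`log_view`, `queue_reader`, `fake_db`) are made up for the sketch, not part of any real API:

```python
import queue
import threading

view_queue = queue.Queue()   # stand-in for the SQS queue
fake_db = []                 # stand-in for the MySQL views table
db_lock = threading.Lock()

def log_view(view_id, user_id, timestamp):
    """Producer: called on every view; only enqueues, never touches the DB."""
    view_queue.put((view_id, user_id, timestamp))

def queue_reader():
    """Consumer: dequeues one record at a time and 'writes' it to the DB."""
    while True:
        record = view_queue.get()
        if record is None:          # sentinel to stop the worker
            break
        with db_lock:
            fake_db.append(record)  # real code would INSERT into MySQL here
        view_queue.task_done()

worker = threading.Thread(target=queue_reader)
worker.start()

# Simulate a burst of views: the producer returns immediately, and the
# single reader drains the queue at whatever pace the database can take.
for i in range(5):
    log_view(view_id=i, user_id=100 + i, timestamp=1_700_000_000 + i)

view_queue.put(None)  # tell the worker to stop
worker.join()
print(len(fake_db))   # → 5
```

The point of the pattern is visible even in this toy version: the code path that records a view never blocks on the database, and the write rate to the database is controlled by how many readers you run.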


Faisal,

This is a reasonable architecture; however, you should know that writing to SQS is going to be many times slower than writing to something like RabbitMQ (or any local message queue).

By default, SQS FIFO queues support up to 3,000 messages per second with batching, or up to 300 messages per second (300 send, receive, or delete operations per second) without batching. To request a limit increase, you need to file a support request.
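The batching that the 3,000 msg/s figure depends on comes from `SendMessageBatch`, which accepts at most 10 entries per call. A hedged sketch of chunking view records into batches of that size (the `Id`/`MessageBody` entry shape matches the SQS batch API; the `to_sqs_batches` helper and the tuple format are assumptions for illustration):

```python
import json

def to_sqs_batches(views, batch_size=10):
    """Group view records into SendMessageBatch-sized entry lists.

    SQS accepts at most 10 entries per SendMessageBatch call, so a stream
    of views has to be chunked before sending. `views` is assumed to be an
    iterable of (view_id, user_id, timestamp) tuples.
    """
    batch = []
    for view_id, user_id, timestamp in views:
        batch.append({
            "Id": str(view_id),  # must be unique within one batch
            "MessageBody": json.dumps(
                {"viewID": view_id, "userID": user_id, "timestamp": timestamp}
            ),
        })
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch  # trailing partial batch

# With boto3, each batch would then be sent with something like:
#   sqs.send_message_batch(QueueUrl=queue_url, Entries=batch)

batches = list(to_sqs_batches((i, 100 + i, 1_700_000_000 + i) for i in range(25)))
print([len(b) for b in batches])  # → [10, 10, 5]
```

At 1,000 views a second, batching like this turns 1,000 `SendMessage` calls into roughly 100 `SendMessageBatch` calls, which is what keeps you inside the default quotas.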

That being said, starting with SQS wouldn't be a bad idea since it is easy to use and debug.

Additionally, you may want to investigate MongoDB for logging. Check out the following references:

MongoDB is Fantastic for Logging

http://blog.mongodb.org/post/172254834/mongodb-is-fantastic-for-logging

Capped Collections

http://blog.mongodb.org/post/116405435/capped-collections

Using MongoDB for Real-time Analytics

http://blog.mongodb.org/post/171353301/using-mongodb-for-real-time-analytics
