Pushing live feed data to 10,000 users?
I'm after some guidance on a new project I'm working on that requires low latency and high concurrency. The project involves receiving live data from a third-party feed and, after some basic processing and storage, sending those values to all users currently active on the website.
The data arrives via HTTP push, and my current plan is to use Node.js to receive it, run it through an algorithm, and update the related records in a database of some kind. Finally, the updates are sent to all connected users of the website over a websocket.
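For concreteness, here's a minimal sketch of that pipeline. It assumes the `ws` npm package for the websocket layer (I haven't committed to a library yet, so treat that as illustrative), and the processing and storage steps are placeholders:

```javascript
// Minimal sketch: receive HTTP-pushed feed data, run it through a
// (placeholder) processing step, then fan it out to every connected
// websocket client. Assumes the 'ws' npm package.
const http = require('http');
const WebSocket = require('ws');

// Placeholder for the "basic processing" step described above.
function transform(data) {
  return data;
}

function broadcast(wss, message) {
  wss.clients.forEach((client) => {
    if (client.readyState === WebSocket.OPEN) {
      client.send(message);
    }
  });
}

const server = http.createServer((req, res) => {
  if (req.method === 'POST' && req.url === '/feed') {
    let body = '';
    req.on('data', (chunk) => { body += chunk; });
    req.on('end', () => {
      const update = transform(JSON.parse(body));
      // TODO: persist `update` to the database before broadcasting.
      broadcast(wss, JSON.stringify(update));
      res.writeHead(204);
      res.end();
    });
  } else {
    res.writeHead(404);
    res.end();
  }
});

// Handle websocket upgrades on the same HTTP server.
const wss = new WebSocket.Server({ server });

server.listen(8080);
```

My understanding is that a single Node.js process can typically hold 10,000 mostly idle websockets, and the per-broadcast loop is cheap next to the actual socket writes, so the fan-out frequency (every ~3 seconds here) matters more than the raw connection count.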
I'm aiming for this to scale to over 10,000 concurrently connected users, all on websockets and receiving updates roughly once every 3 seconds. Since each user can also interact with the web app while this is happening, there will be many requests going back and forth on top of the push traffic.
Apart from that high-level idea, and the decision to use Ruby on Rails as the website framework and Node.js to handle the 'liveness' of it all, I'm a little stuck. I don't know what kind of database to use (I imagine it will be a non-relational database for quick storage), and I don't know the specifics of how to architect such a setup or how to implement the logic.
So my question is: given my goal, how do I go about structuring such an application, and what do I need to know to make it scalable and real-time to the level I'm after?
Thanks greatly for any help.
I would recommend a few libraries to look at.
- now: remote RPC made trivial.
- cradle: CouchDB database abstraction for persistent storage.
- node_redis: Redis database abstraction for cross-machine communication.
- cluster: extend your program across multiple processes (see the sketch after this list for combining it with node_redis).
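A rough sketch of how the last two fit together, assuming the classic node_redis callback API (the exact calls vary by version) and a hypothetical per-worker `broadcast()` for the websocket fan-out: each worker forked by cluster subscribes to a Redis channel, so an update published by any process, on any machine, reaches every worker's clients.

```javascript
// Sketch: combine cluster and node_redis pub/sub so updates reach
// clients connected to any worker process. The channel name 'feed'
// and broadcast() are illustrative placeholders.
const cluster = require('cluster');
const os = require('os');
const redis = require('redis');

function broadcast(message) {
  /* push `message` to this worker's connected websocket clients */
}

if (cluster.isMaster) {
  // One worker per CPU core; each worker owns a slice of the sockets.
  for (let i = 0; i < os.cpus().length; i++) {
    cluster.fork();
  }
} else {
  const subscriber = redis.createClient();
  subscriber.subscribe('feed');
  subscriber.on('message', (channel, message) => {
    // Fires in every worker, whichever process (or machine) published.
    broadcast(message);
  });
}
```

The process that ingests the feed then publishes each update once (`publisher.publish('feed', json)`), and Redis delivers it to every subscribed worker.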