
Node.js WebSocket server: Is my idea for data management stable/scalable?

I'm developing an HTML5 browser multiplayer RPG with Node.js running in the backend and a WebSocket plug-in for client data transfer. The problem I'm facing is accessing and updating user data: as you can imagine, this will happen many times a second even with only a few users connected.

I've done some searching and found only two plug-ins for Node.js that enable MySQL capabilities, but they are both in early development, and I've concluded that querying the database for every little action the user makes is not efficient.

My idea is to have Node.js access the database through PHP when a user connects and retrieve all the information related to that user. The collected information will then be stored in a JavaScript object in Node.js. This will happen for all users playing, and updates will be applied to the object. When a user logs off, the data stored in the object will be written back to the database and deleted from the object.
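A minimal sketch of that load-on-connect / write-back-on-logoff pattern (`loadUserFromDb` and `saveUserToDb` are hypothetical stand-ins for whatever PHP bridge or MySQL driver ends up doing the actual queries):

```js
// In-memory store: one entry per connected user, keyed by user id.
const onlineUsers = new Map();

// Called when a user connects: pull their data once, keep it in memory.
async function onConnect(userId) {
  const data = await loadUserFromDb(userId); // hypothetical PHP bridge / driver call
  onlineUsers.set(userId, data);
}

// All in-game reads and writes hit the object, never the database.
function addGold(userId, amount) {
  const user = onlineUsers.get(userId);
  if (user) user.gold += amount;
}

// Called when a user disconnects: write the final state back, then drop it.
async function onDisconnect(userId) {
  const user = onlineUsers.get(userId);
  if (!user) return;
  await saveUserToDb(userId, user); // hypothetical persistence call
  onlineUsers.delete(userId);
}
```

One caveat with this scheme: if the process crashes, anything still sitting in the object is lost, so a periodic flush every N seconds is a common complement to the write-back on logoff.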

A few things to note: I will separate different types of data into different objects, so that commonly accessed data isn't mixed in with data that would slow down lookups. And if this project gained a lot of users, I would cap how many users can log onto a single server at a time, for obvious reasons.
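Building on the `onlineUsers` map from the sketch above, such a cap is only a few lines (`MAX_PLAYERS` is an arbitrary illustrative number, and `socket` is whatever connection object the WebSocket plug-in hands you):

```js
const MAX_PLAYERS = 500; // illustrative limit; tune to the server's capacity

// Reject new connections once the server is full.
function tryConnect(userId, socket) {
  if (onlineUsers.size >= MAX_PLAYERS) {
    socket.send(JSON.stringify({ error: 'server_full' }));
    socket.close();
    return false;
  }
  return true;
}
```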

I would like to know if this is a good idea. Would having large objects considerably slow down the Node.js server? If you have any ideas for other possible solutions to my situation, I welcome them.

Thanks


As far as your strategy goes, keeping the data in intermediate objects fed through PHP adds a very high level of complexity to your application.

Just the communication between Node.js and PHP seems complex, and there is no guarantee this will be any faster than writing straight to MySQL. Putting any unneeded barrier between you and your data is going to make things more difficult to manage.

It seems like you need a faster data solution. You could consider an asynchronous data store like MongoDB or Redis that reads and writes quickly (Redis writes in memory, which should be incredibly fast).

Both are commonly used with Node.js precisely because they can handle a real-time data load.

Actually, Redis is what you're really asking for: it stores everything in memory and persists it to disk periodically. You can't get any faster than that, but you will need enough RAM. If RAM looks like an issue, go with MongoDB, which is still really fast.
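If you try Redis from Node, a minimal sketch with the `redis` npm client might look like this (the hash-per-user layout is just one common choice, not the only one):

```js
const { createClient } = require('redis');

const client = createClient(); // defaults to localhost:6379

async function demo() {
  await client.connect();

  // Each user's state lives in a Redis hash: kept in memory by Redis,
  // persisted to disk via RDB snapshots and/or the AOF log.
  await client.hSet('user:42', { gold: '100', x: '10', y: '20' });

  // Single-field updates avoid rewriting the whole object.
  await client.hIncrBy('user:42', 'gold', 5);

  // Reads are sub-millisecond round trips against a local instance.
  const user = await client.hGetAll('user:42');
  console.log(user); // { gold: '105', x: '10', y: '20' }

  await client.quit();
}

demo().catch(console.error);
```

The "in memory, persisted periodically" behaviour described above is Redis's own persistence (RDB/AOF), configured on the server rather than in client code.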

The disadvantage is that you will need to relearn your ideas about data persistence, and that is hard. I'm in the process of doing that myself!


I have an application doing almost what you describe. I chose to do it that way because the MySQL drivers for Node were unstable/undocumented at the time of development.

I have 200 connected users, each requesting data 3-5 times per second. The Node process fetches entire tables through PHP pages (every 200-800 ms) that return JSON from Apache, roughly 1,000 rows each, and puts the contents into arrays. I loop through the arrays and find the relevant data on request. It works, and it's fast, putting no significant load on CPU or memory.

All data insertion/updating, which is limited, goes through PHP/MySQL.
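A minimal sketch of that polling pattern, assuming a hypothetical `players.php` endpoint that returns a JSON array (global `fetch` requires Node 18+; older versions would use `http.get` instead):

```js
let players = []; // in-memory copy of the table, refreshed on an interval

// Re-fetch the whole table from the PHP page every 500 ms (inside the
// 200-800 ms range above); clients are always served from the cached copy.
async function refresh() {
  try {
    const res = await fetch('http://localhost/players.php'); // hypothetical endpoint
    players = await res.json();
  } catch (err) {
    console.error('refresh failed, keeping stale data:', err.message);
  }
}

setInterval(refresh, 500);

// Lookups are a plain in-memory scan, as described above.
function findPlayer(name) {
  return players.find((p) => p.name === name);
}
```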

Advantages:

1. It's a simple solution, with known stable services.
2. Only one client connects to Apache/PHP/MySQL every 200-800 ms.
3. All Node clients get the benefit of non-blocking I/O.
4. It runs on two small "PC-style" servers and handles about 8,000 req/second (Apache Bench).

Disadvantages: many, but it gets the job done.

I found that my Node script could stop 1-2 times a week, maybe due to some connection problems (unsolved), but combined with Upstart and Monit it restarts and alerts with no problems.
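For reference, the Upstart half of that setup is a small job file along these lines (paths and the job name are illustrative; Monit watches the process and sends alerts separately):

```
# /etc/init/game-server.conf -- illustrative Upstart job
description "node.js game server"

start on filesystem and net-device-up IFACE=lo
stop on shutdown

# Restart the process whenever it dies, but give up after
# 10 crashes within 60 seconds.
respawn
respawn limit 10 60

exec /usr/bin/node /srv/game/server.js
```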
