Fast datastore for logging
I'm currently writing a questionnaire which gathers and stores a lot of data from different users, and I'm looking for a way to store the results efficiently. I have the following requirements:
- really fast write
- persistent
- usable from node.js
- small overhead
- no reads required before a write (for performance's sake, the write path should be write-only)
Each user can POST several results, which I'll need to query by user id at a later point in time. Those requests will be handled by different node.js processes running in parallel. In the end, the data might look like:
user1:
result1
result2
result3
user2:
result1
user3:
result1
result2
And basically I'd need to be able to:
- Get the list of users
- Query the results from a given user
I first thought I'd use a file per user, but I'm afraid this won't scale, as there could potentially be more users than the maximum allowed number of files.
Any suggestions?
Edit: each result would be small, typically less than 50kb, if that helps.
Edit 2: each result fits on a single ASCII line (no \n in the data); otherwise the data should just be considered a string with no particular structure.
A key-value store like Redis would help you. It can be used from node.js, and you can query results by user if you use the user id as the key.
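A minimal sketch of what that key schema could look like, assuming the `redis` npm client; the key names below are illustrative, not a fixed convention:

```javascript
// Sketch of a write-only result store on Redis.
// Assumed key schema:
//   users            -> SET of all user ids (for "list the users")
//   results:<userId> -> LIST of result strings, appended with RPUSH

function resultsKey(userId) {
  return 'results:' + userId;
}

// With a client (e.g. const client = require('redis').createClient()):
//   client.sadd('users', userId);                  // register the user
//   client.rpush(resultsKey(userId), result);      // append-only write
//   client.smembers('users', cb);                  // get the list of users
//   client.lrange(resultsKey(userId), 0, -1, cb);  // all results for a user

console.log(resultsKey('user1')); // -> results:user1
```

Both SADD and RPUSH are O(1) writes and never read existing data, which fits the write-only requirement.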
But more importantly.. Do you need performance or do you need to scale? :)
--Sai
I am surprised that no one has suggested an append-only DBMS like CouchDB.
As writes only go to the end of the database file, it's not only very fast but also very robust. Querying it is not a problem either, as you have views (which you write in JS). Talking to CouchDB is also quite simple from any language, since you do it with REST/HTTP.
In a simple write benchmark I made, I managed to utilize 100% of 10 cores by inserting from several servers, which I think is quite powerful.
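A sketch of what such a view's map function could look like, assuming each result is stored as a document shaped like `{ userId: 'user1', result: '...' }` (the field names are illustrative, not from the question):

```javascript
// Sketch of a CouchDB view map function grouping results by user id.
const mapFn = function (doc) {
  if (doc.userId && doc.result) {
    // CouchDB provides emit() at index time; key = user id, value = result.
    emit(doc.userId, doc.result);
  }
};

// Outside CouchDB we can exercise the map function with a stub emit:
const rows = [];
global.emit = (key, value) => rows.push([key, value]);
mapFn({ userId: 'user1', result: 'result1' });
mapFn({ note: 'not a result doc' }); // ignored by the guard
console.log(rows); // -> [ [ 'user1', 'result1' ] ]
```

Querying the view with `group=true` (or a key range) then gives you the per-user results, and the distinct keys give the list of users.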
I wouldn't use Tokyo Cabinet, as its development has officially stopped in favor of Kyoto Cabinet.
That's my 2 cents.
Not sure why this has to be NoSQL... Consider using SQLite instead of files. It's very fast, very durable, easy to query (SQL). It's a good fit for node, as node is single-threaded and SQLite is an in-process database.
Here's an API to access SQLite from node: http://code.google.com/p/node-sqlite/
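A sketch of the schema and statements this could use; the table layout and the use of the `sqlite3` npm module as the binding are assumptions, not from the answer:

```javascript
// Sketch of an SQLite layout for the questionnaire results.
// One table, indexed by user id, so both required queries stay cheap.
const createSql =
  'CREATE TABLE IF NOT EXISTS results (' +
  '  user_id TEXT NOT NULL,' +
  '  result  TEXT NOT NULL' +
  '); ' +
  'CREATE INDEX IF NOT EXISTS idx_results_user ON results (user_id);';

// Parameterized statements keep writes fast and injection-safe:
const insertSql = 'INSERT INTO results (user_id, result) VALUES (?, ?)';
const byUserSql = 'SELECT result FROM results WHERE user_id = ?';
const usersSql  = 'SELECT DISTINCT user_id FROM results';

// With the sqlite3 module (one possible binding):
//   const db = new (require('sqlite3').Database)('results.db');
//   db.exec(createSql);
//   db.run(insertSql, [userId, result]);     // write-only insert
//   db.all(byUserSql, [userId], cb);         // results for one user
//   db.all(usersSql, cb);                    // list of users
console.log(insertSql);
```

Wrapping batches of inserts in a transaction is the usual way to get SQLite's write throughput up, since each standalone insert pays an fsync.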
Have a look at Elasticsearch. HTTP/JSON API, Lucene-backed, completely distributed. I've stored hundreds of TiB of data in it. It's even the default persistence for logstash, a common tool used for exactly what you are/were looking to do.
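Since Elasticsearch is driven entirely over HTTP/JSON, indexing a result is just a POST of a JSON document. A minimal sketch of building such a request (the index name and document shape are illustrative assumptions):

```javascript
// Sketch: building an Elasticsearch index request for one result.
function indexRequest(userId, result) {
  return {
    method: 'POST',
    path: '/results/_doc',  // POST to an assumed index named "results"
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ userId: userId, result: result })
  };
}

const req = indexRequest('user1', 'result1');
console.log(req.path); // -> /results/_doc
// With node's http module:
//   http.request({ host: 'localhost', port: 9200, ...req }, cb)
// Querying by user is then a term query on the userId field.
```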