Node.js, (Hi)Redis and the multi command
I'm playing around with Node.js and Redis and installed the hiredis library via this command:
npm install hiredis redis
I looked at the multi examples here:
https://github.com/mranney/node_redis/blob/master/examples/multi2.js
At line 17 it says
// you can re-run the same transaction if you like
which implies that the internal multi.queue object is never cleared once the commands have finished executing.
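For context, here's a minimal sketch of that re-use, assuming a local Redis server; the key name 'counter' is just for illustration:

var redis = require('redis');
var client = redis.createClient();

var multi = client.multi();
multi.incr('counter');
multi.get('counter');

multi.exec(function (err, replies) {
    console.log(replies); // e.g. [ 1, '1' ]
    // Calling exec again on the same object re-sends the same two
    // queued commands, since the queue is not cleared.
    multi.exec(function (err, replies) {
        console.log(replies); // e.g. [ 2, '2' ]
        client.quit();
    });
});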
My question is: how would you handle this situation in an HTTP environment? For example, tracking the last connected user (this doesn't really need multi, as it only executes one command, but it's easy to follow):
var http = require('http');
var redis = require('redis');

var client = redis.createClient();
var multi = client.multi();

http.createServer(function (request, response) {
    multi.set('lastconnected', request.ip); // won't work, just an example
    multi.exec(function (err, replies) {
        console.log(replies);
    });
});
In this case, multi.exec would send a transaction containing 1 command for the first connected user, and one containing 100 commands for the 100th user (because the internal multi.queue object is never cleared).
Option 1: Should I create the multi object inside the http.createServer callback function, which would effectively kill it at the end of the function's execution? How expensive, in terms of CPU cycles, would creating and destroying this object be?
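For reference, a minimal sketch of option 1, creating the multi inside the request handler; request.connection.remoteAddress stands in for the non-existent request.ip, and the port number is arbitrary:

var http = require('http');
var redis = require('redis');

var client = redis.createClient();

http.createServer(function (request, response) {
    // A fresh multi per request, so its queue only ever holds this request's commands.
    var multi = client.multi();
    multi.set('lastconnected', request.connection.remoteAddress);
    multi.exec(function (err, replies) {
        console.log(replies);
        response.end();
    });
}).listen(8080);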
Option 2: The other option would be to create a new version of multi.exec(), something like multi.execAndClear(), which would clear the queue as soon as Redis has executed that batch of commands.
Which option would you take? I suppose option 1 is better - we're killing one object instead of cherry-picking parts of it - I just want to be sure, as I'm brand new to both Node and JavaScript.
The multi objects in node_redis are very inexpensive to create. As a side-effect, I thought it would be fun to let you re-use them, but this is obviously only useful under some circumstances. Go ahead and create a new multi object every time you need a new transaction.
One thing to keep in mind is that you should only use multi if you actually need all of the operations to execute atomically in the Redis server. If you just want to batch up a series of commands efficiently to save network bandwidth and reduce the number of callbacks you have to manage, just send the individual commands, one after the other. node_redis will automatically "pipeline" these requests to the server in order, and the individual command callbacks, if any, will be invoked in order.
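To illustrate the pipelined alternative, here is a sketch that sends the commands individually (same key as above, value chosen arbitrarily); node_redis writes them to the socket in order without wrapping them in MULTI/EXEC:

var redis = require('redis');
var client = redis.createClient();

// No multi object: each command is queued and pipelined over the single
// connection in order, with its own optional callback.
client.set('lastconnected', '127.0.0.1');
client.get('lastconnected', function (err, reply) {
    console.log(reply); // '127.0.0.1'
    client.quit();
});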