high performance (yet dumb) web server
I'm trying to write a very simple web server that does the following:
- Receive request.
- Respond with a tiny file; close the connection.
- Handle the request data.
In other words, the response doesn't depend on the request information, but the request information is still important. The data will be persisted and then used for analytics.
I've tried to do this with some event-driven networking frameworks, but they all seem to hold the connection open until the handler returns. This makes sense, because a server generally has no work left to do after responding, but in my case there's no need for that.
Ideally, the server should keep responding to requests while the request data is pushed onto a queue that is drained as the data is persisted.
We expect to handle thousands of requests per second. Is event-driven programming really the way to go, or should I stick with (traditional) threads? Which language or framework is more appropriate for this kind of work?
Thanks.
Have you considered using Node.js? It lets you write HTTP-oriented server programs quickly and easily in JavaScript. It seems well suited to your needs: its behavior is customizable, and it is said to scale nicely.
You might want to consider reading some tutorials.
I realized that instead of using the callback (or green thread, if you will) to do any sort of real work, I'd be better off just delegating the request data to an independent application. Some research into this pointed me to work queues like beanstalkd and RabbitMQ.
beanstalkd seems lighter and faster than the alternatives, so I'll probably stick with it.
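For reference, beanstalkd speaks a plain text protocol, so delegating a record is just one `put` command over TCP. A sketch of the framing (the `framePut` helper is ours, not part of any client library; a real client such as the ones on beanstalkd's wiki handles the socket and replies for you):

```javascript
// Frame a beanstalkd `put` command:
//   put <pri> <delay> <ttr> <bytes>\r\n<data>\r\n
// pri:   job priority (0 is most urgent)
// delay: seconds before the job becomes ready
// ttr:   seconds a worker has to finish the job after reserving it
function framePut(pri, delay, ttr, payload) {
  const body = Buffer.from(payload, 'utf8');
  const head = `put ${pri} ${delay} ${ttr} ${body.length}\r\n`;
  return Buffer.concat([Buffer.from(head, 'utf8'), body, Buffer.from('\r\n')]);
}

// Example: frame a JSON record for the queue.
const frame = framePut(0, 0, 60, JSON.stringify({ url: '/hit', at: 0 }));
// A real client writes `frame` to a TCP socket connected to beanstalkd
// (port 11300 by default) and reads the `INSERTED <id>` reply.
```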