Considerations for ASP.NET application with long running synchronous requests

Under Windows Server 2008 64-bit, IIS 7.0, and .NET 4.0, I have an ASP.NET application (using the ASP.NET thread pool with synchronous request processing) whose requests are long running (> 30 minutes). The web application has no pages; its main purpose is reading huge files (> 1 GB) in chunks (~5 MB) and transferring them to the clients. Code:

int bytesRead;
// sourceStream is the file stream being transferred; write only the bytes actually read.
while ((bytesRead = sourceStream.Read(buffer, 0, buffer.Length)) > 0)
{
    Response.OutputStream.Write(buffer, 0, bytesRead);
    Response.Flush();
}

A single-producer / single-consumer pattern is implemented, so each request uses two threads. I don't use the Task Parallel Library here, but please let me know if it has an advantage over traditional thread creation in this scenario. An HTTP handler (.ashx) is used instead of an (.aspx) page. Under stress testing, CPU utilization is not a problem, but with a single worker process new connections start timing out after 210 concurrent clients. This is solved by web gardening, since I don't use session state. I'm not sure if there's any big issue I've missed, so please let me know what other considerations you think should be taken into account.
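
For reference, here is a minimal sketch of what I mean by the producer/consumer wiring (the handler name, queue capacity, buffer size, and file path are illustrative, not my actual code):

// Sketch only: one reader thread (producer) fills a bounded queue; the ASP.NET
// request thread (consumer) drains it into the response. Error handling omitted.
using System;
using System.Collections.Concurrent;
using System.IO;
using System.Threading;
using System.Web;

public class DownloadHandler : IHttpHandler
{
    public bool IsReusable { get { return false; } }

    public void ProcessRequest(HttpContext context)
    {
        // Bounded queue: at most 4 chunks (~20 MB) buffered per request.
        var chunks = new BlockingCollection<byte[]>(4);

        var producer = new Thread(() =>
        {
            using (var fs = File.OpenRead(@"D:\files\huge.bin")) // example path
            {
                var buffer = new byte[5 * 1024 * 1024];
                int read;
                while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
                {
                    var chunk = new byte[read];
                    Buffer.BlockCopy(buffer, 0, chunk, 0, read);
                    chunks.Add(chunk); // blocks when the queue is full
                }
            }
            chunks.CompleteAdding();
        }) { IsBackground = true };
        producer.Start();

        // Consumer: write each chunk to the client as it becomes available.
        foreach (var chunk in chunks.GetConsumingEnumerable())
        {
            context.Response.OutputStream.Write(chunk, 0, chunk.Length);
            context.Response.Flush();
        }
    }
}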

For example, maybe IIS closes the long-running TCP connections due to a connection timeout, since normal ASP.NET pages are processed in well under 5 minutes; if so, I should increase that value.
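
For example (the value below is a placeholder, not a recommendation), the ASP.NET execution timeout can be raised in web.config; note that it is specified in seconds and is only enforced when compilation debug="false". The IIS-level connection timeout is a separate per-site setting (Advanced Settings > Limits in IIS Manager).

<!-- web.config sketch: allow synchronous requests to run for up to 2 hours. -->
<system.web>
  <compilation debug="false" />
  <httpRuntime executionTimeout="7200" />
</system.web>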

I appreciate your ideas.


Personally, I would be looking at a different mechanism for this type of processing. HTTP requests/web applications are NOT designed for this type of thing, and stability is going to be VERY hard to achieve; you have a number of risks that could cause you major issues with this type of model.

I would move that processing off to a backend process, so that you are OUTSIDE of the ASP.NET runtime; that way you have more control over startup/shutdown, etc.
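
One possible shape for that hand-off, sketched here with MSMQ purely as an example (the queue path, handler name, and query-string parameter are invented): the handler only records the request, and a separate Windows Service, not shown, dequeues it and does the long-running file work outside ASP.NET.

// Sketch only; requires a reference to System.Messaging.
using System.Messaging;
using System.Web;

public class EnqueueDownloadHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // Push a small job message; the heavy lifting happens in a backend service
        // with its own lifetime, threads, and error handling.
        string file = context.Request.QueryString["file"];
        using (var queue = new MessageQueue(@".\private$\downloadJobs"))
        {
            queue.Send(file, "download request");
        }
        context.Response.Write("Request accepted; the file will be prepared in the background.");
    }
}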


First, Never. NEVER. NEVER! do any processing that takes more than a few seconds in a thread pool thread. There are a limited number of them, and they're used by the system for many things. This is asking for trouble.

Second, while the handler is a good idea, you're a little vague on what you mean by "generate on the fly." Do you mean you are encrypting a file on the fly and this encryption can take 30 minutes? Or do you mean you're pulling data from a database and assembling a file? Or that the file takes 30 minutes to download?

Edit:

As I said, don't use a thread-pool thread for anything long running. Create your own thread, or if you're using .NET 4, use a Task and specify it as long running.
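
For example (TransferFileToClient is a placeholder for the actual work), either of these keeps the long-running work off the thread pool:

using System.Threading;
using System.Threading.Tasks;

// Option 1: a dedicated, manually created thread.
var worker = new Thread(TransferFileToClient) { IsBackground = true };
worker.Start();

// Option 2 (.NET 4): a Task explicitly marked as long running, which hints the
// default scheduler to create a dedicated thread instead of using the pool.
Task.Factory.StartNew(TransferFileToClient, TaskCreationOptions.LongRunning);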


Long running processes should not be implemented this way. Pass this off to a service that you set up.

If you do want to have a page hang open for a client, consider interfacing from AJAX to something that does not block on I/O threads, like node.js.

Pushing notifications to many clients is not something ASP.NET handles well, due to its thread usage; hence my node.js suggestion. If your load is low, you have other options.


Use web gardening for more stability in your application.
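
For example, the web-garden size is the application pool's "Maximum Worker Processes" setting; it can also be set from the command line (the pool name below is a placeholder):

%windir%\system32\inetsrv\appcmd set apppool "MyAppPool" /processModel.maxProcesses:4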

Turn off output caching, since you don't have .aspx pages.
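
If you want to make that explicit (a sketch, assuming nothing else in the application relies on the output cache), output caching can be disabled application-wide in web.config:

<!-- web.config sketch: disable ASP.NET output caching for the whole application. -->
<system.web>
  <caching>
    <outputCache enableOutputCache="false" />
  </caching>
</system.web>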

It's hard to advise more without performance analysis. Use the Visual Studio built-in profiler to find the bottlenecks.


The Web 1.0 way of dealing with long-running processes is to spawn them off on the server and return immediately. Have the spawned-off service update a database with its progress, and let pages on the site query that progress.
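
A minimal sketch of the progress-recording half (the connection string, table, and column names are invented for illustration):

using System.Data.SqlClient;

// Called by the spawned-off worker as it completes each step; pages query the
// same JobProgress table to show status.
static void ReportProgress(string jobId, int percentComplete)
{
    using (var conn = new SqlConnection(@"Data Source=.;Initial Catalog=Jobs;Integrated Security=True"))
    using (var cmd = new SqlCommand(
        "UPDATE JobProgress SET PercentComplete = @pct WHERE JobId = @id", conn))
    {
        cmd.Parameters.AddWithValue("@pct", percentComplete);
        cmd.Parameters.AddWithValue("@id", jobId);
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}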

The most common example of this technique is package-delivery tracking. The site can't hold an HTTP connection open until your package shows up, so it just gives you a way to query for progress. The background process deals with orchestrating all of the steps it takes to get the item, wrap it up, get it onto a UPS truck, etc. All along the way, each step is recorded in the database. Conceptually, this is the same.

Edit based on question edit: Just return a result page immediately, and generate the binary on the server in a spawned thread or process. Use Ajax to check whether the file is ready, and when it is, provide a link to it.
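
A minimal sketch of the polling half (the handler name, output folder, and query-string parameter are all invented for illustration):

// The Ajax call hits this handler periodically; once the spawned job has finished
// writing the file, the handler returns a link the page can show.
using System.IO;
using System.Web;

public class FileStatusHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        string jobId = context.Request.QueryString["jobId"];
        string path = Path.Combine(@"D:\generated", jobId + ".bin"); // assumed output location

        context.Response.ContentType = "application/json";
        if (File.Exists(path))
            context.Response.Write("{\"ready\":true,\"url\":\"/downloads/" + jobId + ".bin\"}");
        else
            context.Response.Write("{\"ready\":false}");
    }
}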
