
Server-side Perl CGI script interrupted when client-side browser closes

I've been trying to solve a small problem for quite a while now, but I haven't been able to.

I wrote an HTML page that calls a Perl CGI script when a form is submitted. The CGI script executes a number of tasks server-side, and I made it print the progress of these tasks into an iframe on the HTML page. The problem is that if the client closes the browser or just navigates away from the page, the CGI script is interrupted server-side.

HTML code:

<form action="/path/to/script.cgi" method="post" enctype="multipart/form-data" target="processing">
 <p>File to analyse: <input type="file" name="filename" size="50" /></p>
 <p><input type="submit" name="Submit" value="Analyse" /></p>
</form>

<p style="margin-left: 20px"><font face="Arial" size="2" color="#000000">Analysing the file may take a while</font></p>
<iframe name="processing" width="70%" height="300"></iframe>

CGI script:

use strict;
use warnings;
use CGI;
$| = 1;  # unbuffered output so progress prints reach the iframe immediately

my $query = CGI->new;
print $query->header(-type => 'text/plain');
function1();                          # preparation (create folders etc.)
function2_takesLongTime($parameter);  # long-running analysis (~15 minutes)
function3();                          # clean-up

So let's say that function1 just prepares some things for the file analysis (creating folders etc.). function2 is the big one (it may run for 15 minutes). It is a Perl routine with a lot of print statements which, because of the text/plain header of the CGI, are redirected into the iframe of the HTML page (output buffering is disabled via $|). httpd is configured with a timeout much longer than 15 minutes, so the problem doesn't come from there. function3 does the clean-up.

If the client stays on the HTML page, the CGI script runs perfectly. If the client is stopped (e.g. the user closes the window), function1 and function2 are executed server-side, but the script seems to be interrupted after that, because no clean-up is done.

I tried launching function2 as an independent program via a system command, and also moving it into a Perl library and calling that library's main function; it still ends up the same way. I thought that whether or not the client stays on the page, the server-side script would still run all the way through. Is the script interrupted because the text/plain output of the CGI can no longer be returned to the client?

If anyone can help me out with this, I'd be very grateful.


You need to architect your software differently. Use some sort of asynchronous processing: a job queue would do fine. Use the CGI program only to put a job in the queue, and make the transition by reusing the library you have already mentioned.

You don't seem to be familiar with the concept, so I recommend a very simple implementation: Qudo. If you find it too restrictive, switch to Gearman, which I think is the most popular job queue in the Perl world (see the interface modules Gearman::XS and Gearman).
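
For illustration, here's a minimal sketch of that split using the Gearman interface module mentioned above. The gearmand address, the function name analyse_file and the worker wiring are assumptions for the example, not part of your current setup; function1/function2/function3 are the routines from your question.

# submit.cgi -- returns immediately; the real work happens in the worker
use strict;
use warnings;
use CGI;
use Gearman::Client;

my $q = CGI->new;
print $q->header(-type => 'text/plain');

my $client = Gearman::Client->new;
$client->job_servers('127.0.0.1:4730');    # assumed gearmand location
$client->dispatch_background('analyse_file', scalar $q->param('filename'));
print "Job queued; you can safely close this page.\n";

# worker.pl -- long-running process, independent of any browser connection
use strict;
use warnings;
use Gearman::Worker;

my $worker = Gearman::Worker->new;
$worker->job_servers('127.0.0.1:4730');
$worker->register_function(analyse_file => sub {
    my $job = shift;
    function1();
    function2_takesLongTime($job->arg);
    function3();                           # clean-up always runs now
    return 1;
});
$worker->work while 1;                     # process jobs forever

Because the CGI program only enqueues the job, the browser connection can die at any point without touching the worker.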

TIMTOWTDI: If you can give up on the execution environment provided by CGI.pm altogether, see e.g. Catalyst's RunAfterRequest.
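
For completeness, a rough sketch of what that looks like with Catalyst::Plugin::RunAfterRequest; the controller, action name and parameter handling are assumptions for illustration only:

package MyApp::Controller::Analyse;
use Moose;
BEGIN { extends 'Catalyst::Controller' }

# assumes the plugin is loaded in MyApp: use Catalyst qw/RunAfterRequest/;
sub analyse : Local {
    my ($self, $c) = @_;
    my $filename = $c->req->param('filename');
    $c->res->body("Analysis started.\n");   # the response is sent first...
    $c->run_after_request(sub {
        # ...and this runs after the client has been answered, so a
        # closed browser no longer interrupts the work
        function1();
        function2_takesLongTime($filename);
        function3();
    });
}

1;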


The technique described in "Watching long processes through CGI" provides a straightforward way to solve this issue.
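
In that spirit, here is a minimal fork-and-detach sketch: the CGI script forks, the child detaches from the web server and does the long work while logging its progress to a file, and the parent returns at once. The progress-file path is an assumption, and function1/function2/function3 are the routines from the question.

use strict;
use warnings;
use CGI;
use POSIX qw(setsid);

my $q = CGI->new;
print $q->header(-type => 'text/plain');
my $filename = $q->param('filename');

defined(my $pid = fork) or die "Cannot fork: $!";
if ($pid == 0) {
    # Child: detach from the web server so a closed browser cannot kill us
    setsid();
    open STDIN,  '<',  '/dev/null';
    open STDOUT, '>',  '/tmp/analysis-progress.log';   # assumed progress file
    open STDERR, '>&', \*STDOUT;
    function1();
    function2_takesLongTime($filename);
    function3();                                       # clean-up now always runs
    exit 0;
}
print "Analysis started; watch /tmp/analysis-progress.log for progress.\n";

The child no longer writes to the HTTP connection, so it survives the client going away; a second, trivial CGI script (or a refreshing page) can display the log file to show progress.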
