
How do I increase the time before a read timeout error occurs?

I've written a PHP script that takes a long time to execute (image processing for thousands of pictures). It's a matter of hours, maybe 5.

After 15 minutes of processing, I get the error:


ERROR The requested URL could not be retrieved

The following error was encountered while trying to retrieve the URL: The URL which I clicked

Read Timeout

The system returned: [No Error]

A Timeout occurred while waiting to read data from the network. The network or server may be down or congested. Please retry your request.

Your cache administrator is webmaster.


What I need is to enable that script to run for much longer.

Now, here is all the technical info: I'm writing in PHP using the Zend Framework, and the browser is Firefox. The long-running script is triggered by clicking a link. Since the script hasn't finished, I still see the page the link was on, and the browser shows "waiting for ...". After 15 minutes the error occurs.

I tried making changes in Firefox through about:config, but without any success. I don't know; maybe the changes are needed somewhere else.

So, any ideas?

Thanks in advance.


set_time_limit(0) will only affect the server-side running of the script. The error you're receiving is purely browser-side. You have to send SOMETHING to keep the browser from deciding the connection is dead; even a single character of output (followed by a flush() to make sure it actually gets sent out over the wire) will do. Maybe once per processed image, or on a fixed time interval (if the last character was sent more than 5 minutes ago, output another one).
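
For example, a minimal sketch of that keep-alive output ($images and processImage() are hypothetical placeholders for the real work):

    <?php
    set_time_limit(0);  // lift PHP's own execution time limit

    foreach ($images as $image) {
        processImage($image);  // hypothetical placeholder for the real work

        // Send one byte and force it out over the wire so the browser
        // (or any proxy in between) sees the connection as alive.
        echo ' ';
        if (ob_get_level() > 0) {
            ob_flush();  // drain PHP's output buffer first, if one is active
        }
        flush();
    }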

If you don't want any intermediate output, you could call ignore_user_abort(TRUE), which will allow the script to keep running even if the connection is shut down from the client side.
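
A sketch of that variant, combined with set_time_limit(0) so neither the disconnect nor PHP's timer stops the work:

    <?php
    ignore_user_abort(true);  // keep running even if the client disconnects
    set_time_limit(0);        // and don't let PHP's timer kill the script

    // ...long-running processing here; once the client is gone, any
    // output is simply discarded, but the script keeps executing.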


If the process runs for hours, you should probably look into batch processing: store a request for image processing (in a file, database, or whatever works for you) instead of starting the processing directly. A scheduled (cron) process on the server then picks up the request and does the actual work (this can be a PHP script that calls set_time_limit(0)). When processing is finished, notify the user (by mail or any other way that works for you).
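
A rough sketch of that split, assuming a "jobs" database table with id, path, and status columns (the table, credentials, and processImage() helper are all hypothetical):

    <?php
    // In the web request: just record the job and return immediately.
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
    $pdo->prepare("INSERT INTO jobs (path, status) VALUES (?, 'pending')")
        ->execute([$imagePath]);
    echo 'Your images are queued; you will get a mail when they are done.';

    <?php
    // worker.php, started by cron: picks up pending jobs and processes them.
    set_time_limit(0);

    $pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
    $jobs = $pdo->query("SELECT id, path FROM jobs WHERE status = 'pending'");

    foreach ($jobs as $job) {
        processImage($job['path']);  // hypothetical placeholder
        $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")
            ->execute([$job['id']]);
        // mail() the user here once their last job is marked done.
    }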


Use set_time_limit(). Documentation is here:

http://nl.php.net/manual/en/function.set-time-limit.php
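
For this use case, a zero argument removes the limit entirely; note it only controls PHP's own timer, not the browser or proxy timeout seen above:

    <?php
    set_time_limit(0);  // 0 = no PHP execution time limit for this script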


If you can split your work into batches, then after processing X images, display a page with some JavaScript (or a META redirect) on it that opens the link http://server/controller/action/nextbatch/next_batch_id.

Rinse and repeat.
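
A sketch of such a self-chaining action; processNextBatch() and the route are hypothetical stand-ins for the controller action above:

    <?php
    // Process one batch, then let the page reload itself for the next one.
    $nextBatchId = processNextBatch($batchId);  // hypothetical: does X images

    if ($nextBatchId !== null) {
        // META redirect variant; a window.location script works the same way.
        echo '<meta http-equiv="refresh" content="1;url=/controller/action/nextbatch/'
            . $nextBatchId . '">';
        echo 'Batch done, continuing automatically...';
    } else {
        echo 'All images processed.';
    }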


Batching the entire process also has the added benefit that if something goes wrong, you don't have to start the entire thing over.

If you're running on a server of your own and can get out of safe_mode, then you could also fork background processes to do the actual heavy lifting, independent of what the browser sees. If you're in a multicore or multiprocessor environment, you can even schedule more than one such process at any time.
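
One way to launch such a detached worker from a request (the worker path and log file are assumptions):

    <?php
    // The redirection plus trailing '&' detach the worker from the request,
    // so the page returns immediately while the heavy lifting continues.
    exec('php /path/to/worker.php > /tmp/worker.log 2>&1 &');
    echo 'Processing started in the background.';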

We've done something like that for large computation scripts; synchronization of the processes happened over a shared database, but luckily the processes were so independent that the only thing we needed to track was their completion or termination.

