PHP set timeout for script with system call, set_time_limit not working
I have a command-line PHP script that runs a wget request for each member of an array with foreach. The wget request can sometimes take a long time, so I want to be able to kill the script if it runs past, say, 15 seconds. I have PHP safe mode disabled and tried set_time_limit(15) early in the script, but it continues indefinitely. Update: Thanks to Dor for pointing out that this is because set_time_limit() does not count time spent in system() calls.
So I was trying to find other ways to kill the script after 15 seconds of execution. However, I'm not sure if it's possible to check the time a script has been running while it's in the middle of a wget request at the same time (a do while loop did not work). Maybe fork a process with a timer and set it to kill the parent after a set amount of time?
Thanks for any tips!
Update: Below is my relevant code. $url is passed from the command-line and is an array of multiple URLs (sorry for not posting this initially):
foreach ($url as $key => $value) {
    $wget = "wget -r -H -nd -l 999 $value";
    system($wget);
}
Try using wget's command-line argument --timeout in addition to set_time_limit().
Keep in mind that set_time_limit(15) restarts the timeout counter from zero, so don't call it inside a loop (for your purpose).
From man wget:
--timeout=seconds
Set the network timeout to seconds seconds. This is equivalent to specifying --dns-timeout, --connect-timeout, and --read-timeout, all at the same time.
When interacting with the network, Wget can check for timeout and abort the operation if it takes too long. This prevents anomalies like hanging reads and infinite connects. The only timeout enabled by default is a 900-second read timeout. Setting a timeout to 0 disables it altogether. Unless you know what you are doing, it is best not to change the default time-out settings.
All timeout-related options accept decimal values, as well as subsecond values. For example, 0.1 seconds is a legal (though unwise) choice of timeout. Subsecond timeouts are useful for checking server response times or for testing network latency.
EDIT: OK. I see what you're doing now. What you should probably do is use proc_open instead of system, and use the time() function to check the elapsed time, calling proc_terminate if wget takes too long.
You can use a combination of "--timeout" and time(). Start off by figuring out how much time you have total, and lower the --timeout as your script runs.
For example:
$endtime = time() + 15;
foreach ($url as $key => $value) {
    $timeleft = $endtime - time();
    if ($timeleft > 0) {
        $wget = "wget -t 1 --timeout $timeleft $otherwgetflags $value";
        print "running $wget\n";
        system($wget);
    } else {
        print("timed out!");
        exit(0);
    }
}
Note: if you don't use -t, wget will try 20 times, each waiting --timeout seconds.
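To make that concrete, here is a quick sketch of the worst-case wall time when -t is left at wget's default of 20 tries (the numbers follow the note above):

```php
<?php
// Worst case with the default retry count: each of the 20 tries can wait
// the full --timeout, so a 15-second timeout can stretch to 5 minutes.
$timeout = 15;  // --timeout value in seconds
$tries   = 20;  // wget's default number of tries when -t is not given
echo "worst case: " . ($timeout * $tries) . "s\n"; // worst case: 300s
```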
Here's some example code using proc_open/proc_terminate (@Josh's suggestion):
$descriptorspec = array(
    0 => array("pipe", "r"),
    1 => array("pipe", "w"),
    2 => array("pipe", "w")
);
$pipes = array();
$endtime = time() + 15;
foreach ($url as $key => $value) {
    $wget = "wget $otherwgetflags $value";
    print "running $wget\n";
    $process = proc_open($wget, $descriptorspec, $pipes);
    if (is_resource($process)) {
        do {
            $timeleft = $endtime - time();
            $read = array($pipes[1]);
            $write = NULL;
            $exceptions = NULL;
            stream_select($read, $write, $exceptions, max(0, $timeleft));
            if (!empty($read)) {
                $stdout = fread($pipes[1], 8192);
                print("wget said--$stdout--\n");
            }
        } while (!feof($pipes[1]) && $timeleft > 0);
        if ($timeleft <= 0) {
            print("timed out\n");
            proc_terminate($process);
            exit(0);
        }
        proc_close($process);
    } else {
        print("proc_open failed\n");
    }
}
You can redirect the output of the wget command to a file and background it with &, so that the system() call returns immediately.
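A minimal sketch of that approach, using sleep as a stand-in for a slow wget run (the redirect-and-background pattern is the point here):

```php
<?php
// Backgrounding the command (&) with its output redirected makes system()
// return immediately instead of blocking until the download finishes.
// With the question's command this would look like:
//   system("wget -r -H -nd -l 999 $value > wget.log 2>&1 &");
$start = time();
system("sleep 5 > /dev/null 2>&1 &"); // returns at once; sleep keeps running
$elapsed = time() - $start;
echo "system() returned after {$elapsed}s\n";
```

Note that with this approach the script no longer sees wget's exit status, so it can't tell whether the download succeeded.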
Also see:
Note: The set_time_limit() function and the configuration directive max_execution_time only affect the execution time of the script itself. Any time spent on activity that happens outside the execution of the script such as system calls using system(), stream operations, database queries, etc. is not included when determining the maximum time that the script has been running. This is not true on Windows where the measured time is real.
Source: set_time_limit() @ php.net
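A small demonstration of the quoted behavior on Linux, with sleep standing in for the external wget call:

```php
<?php
// set_time_limit() only counts time PHP itself spends executing; the two
// seconds inside system() are invisible to the 1-second limit, so the
// script is NOT aborted.
set_time_limit(1);
$start = time();
system("sleep 2");
echo "script survived " . (time() - $start) . "s of system() time\n";
```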
I think by far the best way is to handle the killing in the command itself, at least on Linux (Ubuntu 17.10). No other means worked for me; proc_terminate from another answer worked, but no output would be given.
In the example below, I append a watchdog loop that checks whether your process is still running. If it runs longer than $sec seconds, it kills the process immediately and prints 'Timeout!'.
This way I am sure the command won't exceed the timeout, and I receive all the stdout/stderr I need.
$wget = "wget -r -H -nd -l 999 $value"; // your command
$sec = 5; // timeout in seconds
$cmd = $wget . " & ii=0; while [ \"2\" -eq \"`ps -p $! | wc -l`\" ]; do ii=$((ii+1)); if [ \$ii -gt ".($sec)."0 ]; then echo 'Timeout!'; kill $!; break; fi; sleep 0.1; done";
system($cmd); // system call
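If coreutils' timeout(1) is available (it is on Ubuntu), the same "kill it from the shell" idea needs no watchdog loop. This sketch assumes a Linux box and uses sleep in place of wget:

```php
<?php
// timeout(1) kills the child after the given number of seconds and exits
// with status 124 when it had to do so. With the question's command:
//   system("timeout 15 wget -r -H -nd -l 999 $value", $rc);
$start = time();
system("timeout 1 sleep 3", $rc); // sleep is killed after ~1 second
printf("elapsed=%ds exit=%d\n", time() - $start, $rc); // exit 124 = timed out
```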