Fork safely from PHP
From the get-go, let me say that I'm trying to avoid using pcntl_fork().
For this example, let's imagine that I'm trying to fork many instances of the 'dig' command-line application. In reality, the same script will be used for different command-line apps.
Right now I'm using PHP's exec() and appending & to the command so that bash runs it in the background.
E.g.:
exec("dig google.com &");
exec("dig yahoo.com &");
and so on...
This successfully creates multiple instances of dig running in parallel.
The problem I'm having is that the number of processes is rising steadily until the system crashes. Essentially it's a fork bomb.
I tried to combat this by checking the number of running processes using ps ax | wc -l
E.g. (running in a loop, roughly as sketched below):
If 80 processes are running, I'll launch another 20.
If 70 processes are running, I'll launch another 30.
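In rough PHP terms, the loop looks something like this (a simplified sketch; the domain list and the cap of 100 are placeholder values, and I've redirected output in the sketch so exec() returns straight away):
$domains = ['google.com', 'yahoo.com', 'example.org'];

while ($domains) {
    // Note: this counts every process on the box, not just dig.
    $total = (int) trim(exec('ps ax | wc -l'));
    $slots = 100 - $total;

    for ($i = 0; $i < $slots && $domains; $i++) {
        $domain = array_shift($domains);
        // Output is redirected so exec() returns instead of waiting for dig.
        exec('dig ' . escapeshellarg($domain) . ' > /dev/null 2>&1 &');
    }
    sleep(1);
}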
The problem is that even with this check in place, the number of processes continues to rise until the system crashes or it hits the operating system's max user processes limit.
Can anyone give me some hints on how I can fork effectively (en masse) without exhausting all the system resources? I can't see why the current method isn't working, to be honest.
Since you have a management process, I suggest you watch over the created subprocesses. Save the PID of every subprocess you start:
$running[] = exec("process www.google.com & echo $!");
where $! returns the PID of the backgrounded process, adding it to the list in PHP. Then in your management loop, just recheck whether the processes are still active:
do {
    foreach ($running as $i => $pid) {
        if (!posix_getpgid($pid)) {
            // the process has exited
            unset($running[$i]);
            // restart a replacement here
        }
    }
    sleep(1);   // avoid a busy loop
} while (count($running));
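Put together, a manager loop that keeps a fixed number of jobs in flight could look roughly like this (only a sketch; the job list, the cap of 20 concurrent processes, and the output redirection are assumptions for illustration):
$jobs    = ['google.com', 'yahoo.com', 'example.org'];
$cap     = 20;
$running = [];

while ($jobs || $running) {
    // Drop PIDs whose process has exited.
    foreach ($running as $i => $pid) {
        if (!posix_getpgid($pid)) {
            unset($running[$i]);
        }
    }

    // Top up to the cap; `echo $!` hands back the PID of the backgrounded dig.
    while ($jobs && count($running) < $cap) {
        $domain    = array_shift($jobs);
        $cmd       = 'dig ' . escapeshellarg($domain) . ' > /dev/null 2>&1 & echo $!';
        $running[] = (int) exec($cmd);
    }

    sleep(1);
}
The cap is what stops the fork bomb: no matter how fast jobs finish, you never have more than $cap dig processes alive at once.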
I don't think it's very elegant or reliable, and pcntl_fork is often the better approach, but you didn't elaborate on your actual scripts, so maybe this works in your case.
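For comparison, a bounded pool with pcntl_fork could look roughly like this (requires the pcntl extension and the CLI SAPI; the cap and domain list are again just examples):
$domains = ['google.com', 'yahoo.com', 'example.org'];
$cap     = 20;
$active  = 0;

foreach ($domains as $domain) {
    // If the pool is full, block until one child exits.
    while ($active >= $cap) {
        pcntl_wait($status);
        $active--;
    }

    $pid = pcntl_fork();
    if ($pid === -1) {
        die("fork failed\n");
    } elseif ($pid === 0) {
        // Child: run the command, then exit so it gets reaped.
        exec('dig ' . escapeshellarg($domain) . ' > /dev/null 2>&1');
        exit(0);
    }
    $active++;   // parent: one more child in flight
}

// Reap whatever is still running.
while ($active-- > 0) {
    pcntl_wait($status);
}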
You may also want to use uptime to check the system load.
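For example, sys_getloadavg() gives you the same numbers as uptime without shelling out (the 8.0 threshold here is an arbitrary example):
// Hold off while the 1-minute load average is high.
$load = sys_getloadavg();   // [1min, 5min, 15min]
while ($load[0] > 8.0) {
    sleep(5);               // back off, then re-check
    $load = sys_getloadavg();
}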