Why does an infinitely recursive function in PHP cause a segfault?

A hypothetical question for you all to chew on...

I recently answered another question on SO where a PHP script was segfaulting, and it reminded me of something I have always wondered, so let's see if anyone can shed any light on it.

Consider the following:

<?php

  function segfault ($i = 1) {
    echo "$i\n";
    segfault($i + 1);
  }

  segfault();

?>

Obviously, this (useless) function loops infinitely, and will eventually run out of memory because each call to the function executes before the previous one has finished. Sort of like a fork bomb without the forking.

But... eventually, on POSIX platforms, the script will die with SIGSEGV (it also dies on Windows, but more gracefully - as far as my extremely limited low-level debugging skills can tell). The number of loops varies depending on the system configuration (memory allocated to PHP, 32-bit/64-bit, etc.) and the OS, but my real question is - why does it die with a segfault?

  • Is this simply how PHP handles "out-of-memory" errors? Surely there must be a more graceful way of handling this?
  • Is this a bug in the Zend engine?
  • Is there any way this can be controlled or handled more gracefully from within a PHP script?
  • Is there any setting that controls the maximum number of recursive calls that can be made in a function?


If you use XDebug, there is a maximum function nesting depth which is controlled by an ini setting:

$foo = function() use (&$foo) { 
    $foo();
};
$foo();

Produces the following error:

Fatal error: Maximum function nesting level of '100' reached, aborting!

This, IMHO, is a far better alternative to a segfault, since it only kills the current script, not the whole process.
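
The nesting limit comes from Xdebug's xdebug.max_nesting_depth ini setting (the '100' in the message above is that setting's value). If legitimate code genuinely needs deeper recursion, here is a minimal sketch of raising the limit, assuming the Xdebug extension is loaded and your Xdebug version allows changing this setting at runtime:

<?php
// Only has an effect when Xdebug is loaded; without it the setting is unknown
// and recursion depth is limited only by available memory/stack.
ini_set('xdebug.max_nesting_depth', 500);
?>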

There is a thread from the internals list a few years ago (2006). The poster's comments are:

So far nobody had proposed a solution for endless loop problem that would satisfy these conditions:

  1. No false positives (i.e. good code always works)
  2. No slowdown for execution
  3. Works with any stack size

Thus, this problem remains unsolved.

Now, #1 is quite literally impossible to solve due to the halting problem. #2 is trivial if you keep a counter of stack depth (since you're just checking the incremented stack level on stack push).

Finally, #3 is a much harder problem to solve. Considering that some operating systems will allocate stack space in a non-contiguous manner, it's not going to be possible to implement with 100% accuracy, since it's impossible to portably get the stack size or usage (for a specific platform it may be possible or even easy, but not in general).

Instead, PHP should take the hint from XDebug and other languages (Python, etc) and make a configurable nesting level (Python's is set to 1000 by default)....

Either that, or trap memory allocation errors on the stack to check for the segfault before it happens and convert that into a RecursionLimitException so that you may be able to recover....
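
To make the counter idea concrete, here is a minimal userland sketch. The RecursionLimitException class and the $maxDepth parameter are illustrative names of my own (the proposal above imagines the engine doing this check on every call, which a script cannot do generically):

<?php
// Hypothetical userland approximation of an engine-level recursion limit.
class RecursionLimitException extends RuntimeException {}

function segfault($i = 1, $maxDepth = 1000) {
    if ($i > $maxDepth) {
        // One integer comparison per call, so no measurable slowdown (condition #2).
        throw new RecursionLimitException("Nesting deeper than $maxDepth levels");
    }
    echo "$i\n";
    segfault($i + 1, $maxDepth);
}

try {
    segfault();
} catch (RecursionLimitException $e) {
    // Only this call chain is unwound; the process survives instead of segfaulting.
    echo "Caught: " . $e->getMessage() . "\n";
}
?>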


I could be totally wrong about this since my testing was fairly brief. It seems that PHP will only segfault if it runs out of memory (and presumably tries to access an invalid address). If the memory limit is set, and low enough, you will get an out-of-memory error beforehand. Otherwise, the code segfaults and is handled by the OS.

Can't say whether this is a bug or not, but the script should probably not be allowed to get out of control like this.

See the script below. Behavior is practically identical regardless of options. Without a memory limit, it also slows my computer down severely before it's killed.

<?php
$opts = getopt('ilrv');
$type = null;
//iterative
if (isset($opts['i'])) {
   $type = 'i';
}
//recursive
else if (isset($opts['r'])) {
   $type = 'r';
}
//both flags given: ambiguous, so fall through to the usage message
if (isset($opts['i']) && isset($opts['r'])) {
   $type = null;
}

if (isset($opts['l'])) {
   ini_set('memory_limit', '64M');
}

define('VERBOSE', isset($opts['v']));

function print_memory_usage() {
   if (VERBOSE) {
      echo memory_get_usage() . "\n";
   }
}

switch ($type) {
   case 'r':
      function segf() {
         print_memory_usage();
         segf();
      }
      segf();
   break;
   case 'i':
      $a = array();
      for ($x = 0; $x >= 0; $x++) {
         print_memory_usage();
         $a[] = $x;
      }
   break;
   default:
      die("Usage: " . __FILE__ . " <-i-or--r> [-l]\n");
   break;
}
?>
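
For reference, assuming the file is saved as segfault_test.php, running php segfault_test.php -r -l -v exercises the recursive case with the 64M memory limit and prints memory usage on each call; swapping -r for -i exercises the iterative case instead.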


I know nothing about the PHP implementation, but it's not uncommon for a language runtime to leave pages unallocated at the "top" of the stack so that a segfault occurs if the stack overflows. Usually this is handled inside the runtime, and either the stack is extended or a more graceful error is reported, but there could be implementations (and situations in others) where the segfault is simply allowed to propagate (or escape).
