PHP memory usage using CLI versus cURL
So I tried executing a script two different ways:
1)

foreach ($result_array as $arg) {
    exec("/usr/bin/php pathToScript firstArg $arg", $array);
    echo "peak usage: " . memory_get_peak_usage() . "\n\r";
}
results:

peak usage: 5457324
peak usage: 7791212
PHP Fatal error: Allowed memory size of 33554432

2)
foreach ($result_array as $arg) {
    curl_file_get_contents("website?query=$arg"); // just a cURL helper function
    echo "peak usage: " . memory_get_peak_usage() . "\n\r";
}
results:

peak usage: 5241708
peak usage: 5241708
peak usage: 5241708
peak usage: 5241708
peak usage: 5241708
peak usage: 5241708
... you get the idea.

I must be mistaken about either how exec() uses memory, or how it operates. It was my impression that when a program is run via exec(), the calling script's memory requirements wouldn't be affected. However, this seems not to be the case.
Can anyone shed some light on what is going on here?
The cURL version isn't saving the response (the output of curl_file_get_contents), but the exec version is: each call appends the command's output to exec's second parameter, $array:
http://us2.php.net/manual/en/function.exec.php
If the output argument is present, then the specified array will be filled with every line of output from the command. Trailing whitespace, such as \n, is not included in this array. Note that if the array already contains some elements, exec() will append to the end of the array. If you do not want the function to append elements, call unset() on the array before passing it to exec().
What happens is every response gets appended to the same array, ballooning the memory usage of the program.
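A minimal sketch of this append behavior, using a harmless echo command in place of the original script path (the loop bodies here are illustrative, not the poster's code). Resetting the output array between calls is the documented fix:

```php
<?php
// exec() appends output lines to $out on every call, so the array grows
// across iterations -- this is what balloons the calling script's memory.
$out = [];
for ($i = 0; $i < 3; $i++) {
    exec("echo line$i", $out);
}
echo count($out) . "\n"; // 3 accumulated lines

// Calling unset() (or reassigning an empty array) before each exec()
// keeps only the current call's output, so memory stays flat.
for ($i = 0; $i < 3; $i++) {
    unset($out);
    exec("echo line$i", $out);
}
echo count($out) . "\n"; // 1 line: only the last call's output
```

The same applies to the original loop: an `unset($array);` at the top of each iteration would stop the growth.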
The curl request is probably doing a full-blown HTTP request, so the script being requested is being run as a child of some completely independent webserver process. The memory usage of that child PHP process will be counted against the HTTP process handling the curl request, not your script.
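If the child's output isn't needed in the parent at all, the simplest option is not to capture it: omit exec()'s second argument and redirect the command's output in the shell. A minimal sketch (the command and loop values are placeholders, not the original script):

```php
<?php
// Without an output array, exec() captures nothing into the parent,
// and the shell redirection discards the child's stdout/stderr.
foreach (range(1, 3) as $arg) {
    exec("echo work-$arg > /dev/null 2>&1");
    echo "peak usage: " . memory_get_peak_usage() . "\n";
}
```

With this, peak usage stays constant across iterations, matching the cURL behavior.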