PHP exec() not working - exiting early? no error?
I am using PHP's exec() to run a command which looks like this:
exec("pdftk xx.pdf fill_form xx.fdf output xx.pdf flatten");
The strangest thing is that when I log in via SSH and run the command manually, it works fine: it outputs a 224k PDF. But when I use exec(), only the first 36k of the file comes out. (I checked: the first 36k of the good file is identical to the bad file.)
Now here's the strange thing: this was working fine with exec() until I added some more variables to the FDF file, making it longer. I thought it was a problem with the FDF because of the new data, but then why would the same process run fine from SSH?
Update: I also tried running php -f test.php (which just had the one exec line in it). That output the entire file properly. But if I go to http://mydomain.com/test.php I still only get part of the file.
The script is not timing out, because I make it echo something after the exec() call and that works fine.
It can't be a permissions issue (I log in to SSH as root) because the process is still able to write the file.
Also, when I try getting a return or exit value from exec() or passthru(), I get nothing; the return value is always 0.
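For reference, a minimal sketch of how I'm capturing output and the exit code (the 2>&1 redirect is my addition so stderr gets folded into stdout):

<?php
// capture both the command's output lines and its exit code;
// 2>&1 folds stderr into stdout so any pdftk error messages show up too
$output = array();
$exitCode = null;
exec('pdftk xx.pdf fill_form xx.fdf output xx.pdf flatten 2>&1', $output, $exitCode);
var_dump($exitCode); // always 0 for me
var_dump($output);   // always empty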
Update: in the Apache error logs, I am getting
[Fri Sep 17 20:00:57 2010] [error] Unhandled Java Exception:
[Fri Sep 17 20:00:57 2010] [error] java.lang.OutOfMemoryError
[Fri Sep 17 20:00:57 2010] [error] <>
I changed memory_limit in php.ini from 32M to 64M and I still get it. Considering these are all tiny files, I don't think that's it. But would PHP be able to limit the memory of a child process like that? Is there another setting for that somewhere?
help!
It turns out this was a memory issue after all. Apache had RLimitMEM set in the main conf file, which I have just disabled for now. Now it works like a charm. It was set to about 89MB, though, and since these files are under a meg I can't see how this app would be using that much memory.
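For anyone else hitting this, the directive looked roughly like the sketch below (the exact byte value here is illustrative). RLimitMEM caps the memory of processes forked from Apache's children while servicing requests, which covers anything PHP spawns via exec():

# in the main Apache conf; commenting it out removed the cap for me
# (the value is the soft limit in bytes; an optional second value sets the hard limit)
#RLimitMEM 93323264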
I assume you're running off a shared web host, in which case you should watch out: lots of hosts use a custom php.ini that restricts what exec() does (or prevents it from being used at all), or there might be some system in place that keeps processes spawned from exec() from running for more than a couple of seconds. That is why it might work fine from the shell but not within PHP's context.
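A quick way to check (just a sketch; check.php is a made-up name) is to dump the relevant ini values from a web-served script and compare them with what the CLI reports:

<?php
// compare the output of "php -f check.php" with http://mydomain.com/check.php;
// any difference points at a separate php.ini being used for the web server
echo 'disable_functions: ', ini_get('disable_functions'), "\n";
echo 'max_execution_time: ', ini_get('max_execution_time'), "\n";
echo 'memory_limit: ', ini_get('memory_limit'), "\n";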
Update: in the Apache error logs, I am getting
[Fri Sep 17 20:00:57 2010] [error] Unhandled Java Exception:
[Fri Sep 17 20:00:57 2010] [error] java.lang.OutOfMemoryError
[Fri Sep 17 20:00:57 2010] [error] <>
I don't understand why Apache is giving Java errors. Could you elaborate on that for us? I find that very strange.
I changed memory_limit in php.ini from 32M to 64M and I still get it. Considering these are all tiny files, I don't think that's it. But would PHP be able to limit the memory of a child process like that? Is there another setting for that somewhere?
I also have a feeling this could have something to do with the memory your application is using: it is calling pdftk, which could push you over the memory limit. How much memory does the pdftk call use at peak? Maybe you should raise the limit even more?
Did you do something like this? http://www.wallpaperama.com/forums/how-to-change-memory-limit-php-apache-server-t53.html
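If editing php.ini isn't possible, a per-script override sometimes works (a sketch, assuming the host allows ini_set and hasn't locked the setting):

<?php
// raise the limit for this request only; this has no effect if the host locks
// the setting down, or if the real cap lives outside PHP (e.g. Apache's RLimitMEM)
ini_set('memory_limit', '128M');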
Update: I also tried running php -f test.php (which just had the one exec line in it). That output the entire file properly. But if I go to http://mydomain.com/test.php I still only get part of the file.
I have a solution for you which also will not kill your webserver (VPS) under high load. From the site (web-server side) you push the job onto a Redis list using the Predis PHP client library, because it supports all the necessary Redis commands (BLPOP/LPUSH). From a PHP daemon (php -f) which is always running, you pop jobs off the list with a blocking pop and execute the pdftk commands. A rough sketch follows.
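Something along these lines (a sketch only; the queue name pdftk_jobs, the file names, and the Composer autoload path are assumptions, not part of the original setup):

<?php
// web side (producer) -- enqueue the job instead of running pdftk directly
require 'vendor/autoload.php'; // assumes Predis is installed via Composer

$redis = new Predis\Client();
$redis->lpush('pdftk_jobs', json_encode(array(
    'pdf' => 'xx.pdf',
    'fdf' => 'xx.fdf',
    'out' => 'out.pdf',
)));

<?php
// daemon side (worker.php, run with "php -f worker.php") -- pops and runs jobs
require 'vendor/autoload.php';

$redis = new Predis\Client();
while (true) {
    // BLPOP blocks until a job arrives; 0 means wait forever.
    // it returns array(queueName, payload)
    list(, $payload) = $redis->blpop(array('pdftk_jobs'), 0);
    $job = json_decode($payload, true);
    exec(sprintf(
        'pdftk %s fill_form %s output %s flatten',
        escapeshellarg($job['pdf']),
        escapeshellarg($job['fdf']),
        escapeshellarg($job['out'])
    ));
}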