
Running a job in the background from Perl WITHOUT waiting for it to return

The Disclaimer

First of all, I know this question (or close variations of it) has been asked a thousand times. I really spent a few hours looking in the obvious and the not-so-obvious places, but there may be something small I'm missing.

The Context

Let me define the problem more clearly: I'm writing a newsletter app in which I want the actual sending process to be async. As in, the user clicks "send", the request returns immediately, and they can then check the progress on a specific page (via AJAX, for example). It's written in your traditional LAMP stack.

On the particular host I'm using, PHP's exec() and system() are disabled for security reasons, but Perl's equivalents (exec, system and backticks) aren't. So my workaround was to create a "trigger" script in Perl that calls the actual sender via the PHP CLI and then redirects to the progress page.

Where I'm Stuck

The very line that calls the sender is, as of now:

system("php -q sender.php &");

Problem being, it's not returning immediately, but waiting for the script to finish. I want it to run in the background and have the system call itself return right away. I also tried running a similar script from my Linux terminal, and there too the prompt doesn't come back until after the script has finished, even though the output suggests the script really is running in the background.

What I already tried

  • Perl's exec() function - same result as system().
  • Changing the command to "php -q sender.php | at now", hoping that the at daemon would return immediately and that the PHP process wouldn't stay attached to the Perl one.
  • Executing the command 'indirectly': "/bin/sh -c 'php -q sender.php &'" - still waits until sender.php is finished sending.
  • fork()'ing and executing the system call in the child (hopefully a detached process) - same result as above.

My test environment

Just to be sure I'm not missing anything obvious, I created a sleeper.php script that just sleeps for five seconds before exiting, and a test.cgi script like this, verbatim:

#!/usr/local/bin/perl
system("php sleeper.php &");
print "Content-type: text/html\n\ndone";

What should I try now?


Essentially you need to 'daemonize' a process -- fork off a child, and then entirely disconnect it from the parent so that the parent can safely terminate without affecting the child.

You can do this easily with the CPAN module Proc::Daemon:

use Proc::Daemon;
# do everything you need to do before forking off the daemon...

# make this process into a daemon; detaches it from the terminal
# and closes all open fds
Proc::Daemon::Init();

# only the daemonized process runs past this point
exec("php", "-q", "sender.php");


Use fork() and then call system in the child process.

my $pid = fork();
if (defined $pid && $pid == 0) {
    # child
    system($command);    # or exec($command)
    exit 0;
}
# parent
# ... continue ...


An open STDOUT or STDERR can also keep the call from returning. To detach both, I use (in most shell environments I work with - bash, csh, etc.):

system("php sender.php > /dev/null 2>&1 &");


Another option would be to set up a Gearman server and a worker process (or processes) that do the emailing. That way you control how much emailing is going on simultaneously, and no forking is necessary. The client (your program) can add a task to the Gearman server (in the background, without waiting for a result, if desired), and jobs are queued until the server passes each one to an available worker. There are Perl and PHP APIs to Gearman, so it's very convenient.
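As a sketch of the client side, assuming gearmand is running locally and a worker has registered a (hypothetical) function named send_newsletter, the stock gearman command-line client can submit a background job that returns immediately:

```shell
# submit a background job; -b returns as soon as the server queues it,
# without waiting for a worker to finish (the function name and payload
# here are hypothetical)
echo "campaign_id=42" | gearman -b -f send_newsletter
```

This needs a running gearmand and worker, so it's only a fragment of the setup, but it shows the shape of the fire-and-forget call.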


Managed to solve the problem. Apparently what was keeping the call from returning was that invoking the sender that way didn't disconnect its stdout. So the solution was simply changing the system call to:

system("php sender.php > /dev/null &");

Thanks everybody for the help. In fact, it was while reading the whole story about "daemonizing" a process that I got the idea to disconnect stdout.
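The effect is easy to reproduce in a plain shell, no web server needed. A command substitution, like a web server reading a CGI script's output, waits for every process holding the write end of the pipe - so a backgrounded child that inherits stdout keeps it blocked, while redirecting to /dev/null lets it return at once. A minimal sketch, using sleep 2 to stand in for sender.php:

```shell
# Without redirection: the backgrounded sleep inherits the pipe used by the
# command substitution, so $(...) blocks until sleep exits (~2s)
start=$(date +%s)
out=$(sh -c 'sleep 2 &'; echo returned)
echo "inherited stdout:  finished after $(( $(date +%s) - start ))s"

# With stdout (and stderr) redirected, nothing but the short-lived sh holds
# the pipe, so the substitution completes immediately (~0s)
start=$(date +%s)
out=$(sh -c 'sleep 2 >/dev/null 2>&1 &'; echo returned)
echo "redirected stdout: finished after $(( $(date +%s) - start ))s"
```

The shell itself exits right away in both cases; what differs is who still holds the pipe open afterwards.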

