
How to flush data in PHP and disconnect the user but keep the script alive

This is a tricky question. While developing a PHP+AJAX application I ran into some long queries; nothing is wrong with them, but they could be done in the background.

I know there's a way to just send a reply to the user and throw the real processing to another process via exec(), but it doesn't feel right to me: it could open up exploits, and it isn't practical to keep compatible with virtual servers or across platforms.

PHP offers the ob_* functions; they help with flushing the output buffer, but the user stays connected for as long as the script is running.

I'm wondering if there's an alternative to exec() that keeps a script running after sending data to the user and closing the connection/thread with Apache, or a less "dirty" way of handing the processing off to another script.


What I use on my websites is Gearman.

Gearman lets you run task workers in the background via the command line. These workers listen for requests for certain tasks, and then process them when they receive a request.

On the web application side, I just do $GEARMAN->doBackground("task_name","task_data"); and then the task is sent off to the worker and execution returns immediately to the script.

It is much safer than calling exec() because the Gearman task runs as a PHP function.
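
Here is a minimal sketch of what that looks like with the pecl gearman extension; the task name "long_query", the payload, and the host/port are placeholders for this example:

    // client side (the web request) – queue the job and return immediately
    $client = new GearmanClient();
    $client->addServer('127.0.0.1', 4730);            // default gearmand host/port
    $client->doBackground('long_query', json_encode(['user_id' => 42]));
    echo json_encode(['status' => 'queued']);

    // worker side – a separate CLI script that keeps running
    $worker = new GearmanWorker();
    $worker->addServer('127.0.0.1', 4730);
    $worker->addFunction('long_query', function (GearmanJob $job) {
        $params = json_decode($job->workload(), true);
        // ... run the slow queries here ...
    });
    while ($worker->work());                          // block and wait for jobs

The worker is started once from the command line (e.g. php worker.php) and handles jobs as they arrive.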


After doing some research, I've found three answers to this question:

  1. Do an AJAX request with a low timeout value; once the timeout is reached, the JS proceeds to the next step while PHP, with ignore_user_abort enabled, keeps processing in the background (a rough sketch of the server side follows this list).

  2. Split the slow processing out of the script that answers the user directly, sending an immediate full or partial reply (where the case allows it). Store the slow process's variables in $_SESSION or a database, then hand them to a worker either via a cURL call with a low timeout value or via a cron task that calls the workers.

  3. Use Gearman.
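
For reference, the server-side half of option 1 usually looks roughly like this; the header/flush trick depends on the SAPI, fastcgi_finish_request() only exists under PHP-FPM, and run_slow_queries() is a hypothetical stand-in for the long work:

    ignore_user_abort(true);      // keep running even if the client disconnects
    set_time_limit(0);            // no time limit for the slow part

    ob_start();
    echo json_encode(['status' => 'processing']);     // immediate reply for the JS side
    header('Connection: close');
    header('Content-Length: ' . ob_get_length());
    ob_end_flush();
    flush();                      // the browser gets its reply and can move on

    if (function_exists('fastcgi_finish_request')) {
        fastcgi_finish_request(); // under PHP-FPM this really ends the request
    }

    run_slow_queries();           // the long processing continues here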


PHP has a function called ignore_user_abort for this.

You could also store a queue of tasks to perform in the database, and have a cron task periodically check for new tasks and process them as they arrive.
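
A rough sketch of that approach, assuming a task_queue table with id, payload, and status columns (all names here are made up) and a script run from cron every minute:

    // cron entry, e.g.: * * * * * php /path/to/process_queue.php
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

    $rows = $pdo->query("SELECT id, payload FROM task_queue WHERE status = 'pending'");
    foreach ($rows as $row) {
        $pdo->prepare("UPDATE task_queue SET status = 'running' WHERE id = ?")
            ->execute([$row['id']]);

        process_task($row['payload']);   // hypothetical: the actual slow work

        $pdo->prepare("UPDATE task_queue SET status = 'done' WHERE id = ?")
            ->execute([$row['id']]);
    }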


I have a PHP class that is a POSIX daemon. I start it and essentially the script has

    while (true) {
        // run task: poll the queue table and process any pending rows
        sleep(120);   // pause between runs (I use about 120 seconds)
    }

You can put database calls and everything else in the loop, and it responds to POSIX signals too. I usually set it to sleep for 120 seconds and then run again. I use this along with a small table that serves as my queue. So instead of using a deadly exec() I can use a safer prepared statement to insert an id into my queue table, which the background script will catch within the next 30 seconds.
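
The enqueue side of that can be as small as a single prepared insert, assuming an existing PDO connection in $pdo and a queue table with these (made-up) columns:

    // in the web request: queue the work instead of exec()'ing anything
    $stmt = $pdo->prepare("INSERT INTO queue (item_id, created_at) VALUES (?, NOW())");
    $stmt->execute([$itemId]);   // the daemon picks this row up on its next pass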

Caveat: I do kill the daemon and restart it every 3 hours, as it sometimes gets stuck and doesn't process any more, but a simple kill and restart every 3 hours is a good tradeoff for keeping live queues running.

Edit: I don't want to post the whole script here, but I can if someone wants it.
