Ending a cURL Request
I'm trying to make a sort of PHP bot. The idea is to have two PHP files, named a.php and b.php. a.php does something, sleeps 30 seconds, then calls b.php; b.php ends the HTTP request, does some processing, and then calls a.php, which ends the HTTP request, and so on.
The only problem now is how to end the HTTP request, which is made using cURL. I've tried the code below:
<?php
ob_end_clean();
header("Connection: close");
ignore_user_abort(true); // keep running even after the client disconnects
ob_start();
echo 'Text the user will see';
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // Will not work
flush();        // Unless both are called!
// At this point, the browser has closed connection to the web server
// Do processing here
echo 'Text user will never see';
The slight problem is that it doesn't work, and I actually see "Text user will never see". I've tried cron jobs and such, but my host doesn't allow them. I can't set the script timeout limit either. So my only option is to create repeating PHP scripts. So how would I end the HTTP request?
Based on the new understanding of your problem: you are creating a system that checks a remote URL every 30 seconds to monitor a fragment of content. For this I recommend cron, which can be either server based (http://en.wikipedia.org/wiki/Cron) or web based if your host does not permit it (http://www.webbasedcron.com/ is one example).
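Note that cron itself only has one-minute granularity, so a 30-second interval is usually faked with two entries, the second offset by a sleep. A minimal crontab sketch, assuming a hypothetical /path/to/check.php and the PHP CLI at /usr/bin/php:
* * * * * /usr/bin/php /path/to/check.php
* * * * * sleep 30; /usr/bin/php /path/to/check.php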
PHP scripts in this case run in the context of a web server request, so you can't stop talking to the web connection and then continue doing stuff, which is what I think you're attempting to do with the connection close.
The reason you're seeing the output at the end is that PHP performs an implicit flush when the script finishes (see ob_implicit_flush in the manual), and the connection to the browser is only closed when the PHP script itself ends.
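A minimal sketch of that behaviour: anything echoed after an explicit flush() still reaches the client, just later, when PHP's shutdown flush runs.
<?php
// Demonstration of the shutdown flush described above: output produced
// after flush() is still delivered, because the connection only closes
// when the PHP script itself finishes.
echo "Sent right away\n";
flush();
sleep(2); // pretend to do post-processing
echo "Still delivered, just two seconds later\n";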
Ways around this:
You might be able to use set_time_limit to extend the execution limit. DO NOT USE ZERO. It's tempting to say "take all the time you need" on a post-process script, but that way lies madness and bitter sysadmins; remember too that you're still running on cURL's timeout stopwatch (though you can extend that via an option). set_time_limit(5) will give you five more seconds, so calling it periodically (as sketched below) will allow you to do your post-processing while - if you're careful - still protecting yourself from infinite loops. Infinite loops with no execution limits in the context of Apache requests are also likely to make you unpopular with your sysadmin.
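A minimal sketch of that pattern, where the work queue is a hypothetical stand-in for your own bounded chunks of post-processing:
<?php
// Sketch of the periodic set_time_limit() pattern described above.
// Each iteration resets the clock to five seconds, so one runaway chunk
// is still killed, but the batch as a whole can run much longer.
$chunks = range(1, 10); // hypothetical stand-in for the real work queue
foreach ($chunks as $chunk) {
    set_time_limit(5); // five more seconds, for this chunk only
    // ... one bounded piece of processing goes here ...
}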
It might be possible to build a shell script in your application, save it to disk, execute that in the background and have it delete itself after. That way it will run outside the web-request context, and if the script still exists when you next do the request, you can know that the other processing is still happening. Be really careful about things that might take longer than your gap between executions, as that way leads to sorrow and more bitter sysadmins. This course of action would get you thrown off my hosting environment if you did it without talking to me about it first, though, as it's a terrible hack with a myriad of possible security issues.
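If you do go that route despite the warnings, a hedged sketch might look like the following; every path and the worker command here are assumptions for illustration:
<?php
// Sketch of the self-deleting background script idea described above.
// All paths and the worker command are assumptions; adapt to your host.
$script = '/tmp/bot_worker.sh'; // hypothetical scratch location

if (file_exists($script)) {
    exit; // previous run is presumably still going; skip this cycle
}

file_put_contents($script, implode("\n", [
    '#!/bin/sh',
    'php /path/to/b.php', // hypothetical worker command
    'rm -- "$0"',         // the script removes itself when finished
    '',
]));
chmod($script, 0755);

// Launch detached, with output discarded, so the web request
// does not block waiting for the background job to finish.
exec('nohup ' . escapeshellarg($script) . ' > /dev/null 2>&1 &');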
But you appear to be attempting to run a regular batch process on a system where they don't want you to do that - or they'd have given you access to cron - so your best and most reliable method is to find a host that actually supports the thing you're trying to do.