Starting an http request, but dropping out if no response after a certain time
I'm trying to write a python script that does the following from within a minutely cronjob:
- tries to execute a url
- if there is no response after 10 seconds, abandon the request and immediately issue a command via os.system to restart the webserver.
The problem is that when my server crashes, it doesn't return a response at all. If I were to just time the response, the script could hang for 10 minutes or more. I want it to issue the restart immediately once it detects a slow response. I know such a script could probably be written in less than 5 lines of code, but I have no idea how to go about it.
From Python 2.6 on you can use the following, provide the timeout in seconds in the call to urlopen:
urllib2.urlopen(url[, data][, timeout])
This sets the timeout only for this request, not globally like with socket.settimeout. But for older versions of Python that is probably your only option.
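A minimal sketch of this approach (the URL and restart command below are placeholders, and it uses Python 3's urllib.request, which replaced urllib2):

```python
import os
import urllib.request  # urllib2 in Python 2


def check_server(url, timeout=10,
                 restart_cmd="sudo /usr/sbin/apache2ctl graceful"):
    """Return True if `url` answers within `timeout` seconds;
    otherwise run `restart_cmd` and return False.

    The default restart command is only an example -- substitute
    whatever restarts your webserver.
    """
    try:
        # timeout= applies only to this request, not globally
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except OSError:
        # Covers socket.timeout, connection refused, DNS failure,
        # and URLError/HTTPError (which subclass OSError).
        os.system(restart_cmd)
        return False
```

Called from a minutely cron job, this either returns quickly on success or triggers the restart as soon as the timeout fires.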
Use settimeout()
I believe this will throw an exception on timeout, so just do:
try:
    # all the magic you want
    ...
except socket.timeout:
    os.system(...)
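Fleshed out, the settimeout approach might look like the sketch below. Unlike the timeout= argument to urlopen, socket.setdefaulttimeout applies to every socket created afterwards; the URL and restart command are again placeholders:

```python
import os
import socket
import urllib.request  # urllib2 in Python 2

# Global default: every new socket times out after 10 seconds.
socket.setdefaulttimeout(10)


def ping_or_restart(url, restart_cmd="sudo /usr/sbin/apache2ctl graceful"):
    """Fetch `url`; if it fails or times out, run `restart_cmd`.

    Returns True on success, False if the restart was issued.
    """
    try:
        urllib.request.urlopen(url)
        return True
    except OSError:  # socket.timeout, URLError, connection refused, ...
        os.system(restart_cmd)
        return False
```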
Here's a non-Python answer, suitable for a cronjob that runs every minute:
curl > /dev/null --silent --max-time 10 http://myserver.com || sudo /usr/sbin/apache2ctl graceful
If the curl fetch takes more than ten seconds, it exits with an error. The shell's "||" operator then fires, running the command to restart Apache gracefully. That is, it restarts, but leaves current connections running. You'll need to run visudo to allow that command to run passwordless from cron.
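Wired into a crontab, the entry might look like this (the URL and apache2ctl path are the same example values as above; adjust for your system):

```
* * * * * curl > /dev/null --silent --max-time 10 http://myserver.com || sudo /usr/sbin/apache2ctl graceful
```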