Using wget to do monitoring probes

Before I bang my head against all the issues myself, I thought I'd run it by you guys and see if you could point me somewhere or pass along some tips.

I'm writing a really basic monitoring script to make sure some of my web applications are alive and answering. I'll fire it off out of cron and send alert emails if there's a problem.
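
For reference, this is roughly the crontab entry I have in mind (the schedule and script path are just placeholders):

# Probe every five minutes.
*/5 * * * * /usr/local/bin/check-sites.sh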

So what I'm looking for are suggestions on what to watch out for. Grepping the output of wget will probably get me by, but I was wondering if there was a more programmatic way to get robust status information out of wget and my resulting web page.

This is a general kind of question; I'm just looking for tips from anybody who happens to have done this kind of thing before.


Check the exit code:

wget --timeout=10 --tries=1 -q -O /dev/null http://example.com/mypage
if [ $? -ne 0 ] ; then
    # there's a problem: mail logs, send an SMS, etc.
    echo "probe failed" | mail -s "monitor alert" admin@example.com
fi
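
If you only need to know that the page is answering, wget's --spider option does a similar check without downloading the body:

# --spider checks the URL but saves nothing; recent versions of wget
# also document distinct exit codes (e.g. 4 for a network failure,
# 8 for a server error response) if you want finer-grained alerts.
wget --spider --timeout=10 --tries=1 -q http://example.com/mypage || exit 1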


I prefer curl --head for this type of usage:

% curl --head http://stackoverflow.com/
HTTP/1.1 200 OK
Cache-Control: public, max-age=60
Content-Length: 359440
Content-Type: text/html; charset=utf-8
Expires: Tue, 05 Oct 2010 19:06:52 GMT
Last-Modified: Tue, 05 Oct 2010 19:05:52 GMT
Vary: *
Date: Tue, 05 Oct 2010 19:05:51 GMT

This will allow you to check the return status to make sure it's 200 (or whatever you're expecting it to be) and the Content-Length header to make sure it's the expected value (or at least not zero). And curl will exit non-zero if there's any problem with the connection.
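
To act on that status from a script, curl's --write-out option can print just the numeric code. A minimal sketch; the expected code, alert address, and message are placeholders of mine, not from the answer:

# -s silences progress output, --head sends a HEAD request, -o /dev/null
# discards the headers, and -w '%{http_code}' prints the status code alone.
status=$(curl -s --head -o /dev/null -w '%{http_code}' http://stackoverflow.com/)
if [ "$status" != "200" ] ; then
    echo "unexpected status $status" | mail -s "site alert" admin@example.com
fi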

If you want to check for changes in the page content, pipe the output through md5sum and then compare what you get to your pre-computed known value:

wget -q -O - http://stackoverflow.com | md5sum
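
Putting that together, a sketch of the comparison; the stored hash and the alert command are placeholders, not part of the answer:

# KNOWN is the pre-computed md5 of the good page (placeholder value here).
KNOWN="d41d8cd98f00b204e9800998ecf8427e"
CURRENT=$(wget -q -O - http://stackoverflow.com | md5sum | cut -d' ' -f1)
if [ "$CURRENT" != "$KNOWN" ] ; then
    echo "page content changed" | mail -s "content alert" admin@example.com
fi

Note that this only works for pages whose bytes are stable between fetches; anything with dynamic content will hash differently every time.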
