My Python program always brings down my internet connection after several hours of running. How do I debug and fix this problem?
I'm writing a Python script that checks/monitors the status of several servers/websites (response time and similar stuff). It's a GUI program, and I use a separate thread to check each server/website. The basic structure of each thread is an infinite while loop that requests the site at a random interval (every 15 to 30 seconds); once there's a change in the website/server, that thread starts a new thread to do a thorough check (requesting more pages and similar stuff).
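The structure of each checker thread is roughly like this (a simplified sketch; thorough_check and the exact details are made up for illustration):

import random
import threading
import time
import urllib2

def thorough_check(url):
    pass  # request more pages and similar stuff

def monitor(url):
    # Each checker thread runs a loop like this one.
    last_status = None
    while True:
        try:
            response = urllib2.urlopen(url, timeout=10)
            status = response.getcode()
            response.close()
        except urllib2.URLError:
            status = None
        if status != last_status:
            # On any change, spawn a one-off thread for a deeper check.
            threading.Thread(target=thorough_check, args=(url,)).start()
            last_status = status
        time.sleep(random.randint(15, 30))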
The problem is, my internet connection always gets blocked/jammed/messed up after the script has been running for several hours. From the script's side I get "urlopen error timed out" every time it requests a page, and from my Firefox browser I cannot open any site. But the weird thing is, the moment I close the script my internet connection comes back immediately, meaning I can surf any site through my browser again. So it must be the script causing the problem.
I've checked the program carefully and even use del to delete every connection object once it's used, but I still get the same problem. I only use urllib2, urllib, and mechanize to make network requests.
Does anybody know why this happens? How do I debug this problem? Is there a tool or something to check my network status when the situation occurs? It's really been bugging me for a while...
By the way, I'm behind a VPN; does that have something to do with this problem? I don't think so, because my network always comes back once the script is closed, and the VPN connection never drops (as far as I can tell) during the whole process.
[Updates:]
Just found more info about this problem. When my program brings down the internet connection it's not totally "down": I cannot open any site in my browser and the script always gets "urlopen error timed out", but I can still get replies from "ping google.com" on the command line. And when I manually drop the VPN connection and then redial, without closing my program, it starts working again and I can also surf the net through my browser. Why is this happening?
This may or may not be the problem, but it's a good idea to always use context managers when dealing with things that open resources, like files or URLs.
Since Python 2.5 you can do this with files:
with open('/tmp/filename', 'rt') as infile:
    data = infile.read()
    whatever(data)
And the file will be automatically closed at the end of the block.
urllib2 doesn't support this automatically, but you can use contextlib to help you:
>>> import contextlib
>>> import urllib2
>>> with contextlib.closing(urllib2.urlopen('http://www.python.org')) as page:
...     for line in page:
...         print(line)
<html> blablablabla</html>
This way the connection will be both closed and deleted at the end of the with-block, so you don't have to think about it. :-)
You could possibly be creating more threads than you expect; monitor the result of threading.active_count() to test this. If possible, try to rule out the VPN at your end (or post the relevant guts of the code so we can test it).
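A minimal sketch of such a monitor (it assumes nothing about your code, and the interval is arbitrary):

import threading
import time

def log_thread_count(interval=60):
    # Log the live thread count periodically; a thread leak shows up
    # as a steadily climbing number.
    while True:
        print('active threads: %d' % threading.active_count())
        time.sleep(interval)

watcher = threading.Thread(target=log_thread_count)
watcher.daemon = True  # don't keep the process alive just for logging
watcher.start()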
(Netiquette) If you're not doing so already, use at most network.http.max-connections-per-server threads per monitored site/host.
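For example, you could cap concurrent requests with a semaphore. This is a minimal sketch; the limit of 6 is just a placeholder for whatever your browser's setting actually is:

import threading
import urllib2

MAX_CONNECTIONS_PER_HOST = 6  # placeholder; match your browser's value
host_slots = threading.Semaphore(MAX_CONNECTIONS_PER_HOST)

def polite_fetch(url):
    # Blocks while MAX_CONNECTIONS_PER_HOST requests are already in flight.
    with host_slots:
        response = urllib2.urlopen(url, timeout=10)
        try:
            return response.read()
        finally:
            response.close()

(In a multi-host monitor you'd want one semaphore per host rather than a single global one.)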
(For reference) urlopen returns a file-like object; use .close() or del on this object, or the socket will sit in a CLOSE_WAIT state until a timeout.
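A minimal sketch of making the close explicit (the URL and timeout are arbitrary):

import urllib2

response = urllib2.urlopen('http://www.python.org', timeout=10)
try:
    data = response.read()
finally:
    # Release the socket instead of leaving it in CLOSE_WAIT.
    response.close()

While the script is running you can watch for lingering sockets with netstat -an and look for CLOSE_WAIT entries.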
Hopefully these points are, well, pointers.