python FancyURLopener timeout
Is there a way to set a connection timeout for FancyURLopener()? I'm using FancyURLopener.retrieve() to download a file, but sometimes it just gets stuck and that's all... I think this happens because it is still trying to connect and the connection isn't possible. So is there a way to set that timeout?
Thanks for any replies.
If you want to use retrieve() with a timeout, you can set it globally in the socket module:
import socket
socket.setdefaulttimeout(5)
Source: http://docs.python.org/py3k/howto/urllib2.html#sockets-and-layers
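For completeness, here is a minimal sketch of how that default timeout applies to FancyURLopener.retrieve() itself. This assumes Python 2's urllib module; the URL and filename are placeholders of my own, not from the original question.

import socket
import urllib

# Every socket opened after this call uses a 5-second timeout,
# including the ones FancyURLopener creates internally.
socket.setdefaulttimeout(5)

opener = urllib.FancyURLopener()
try:
    # placeholder URL and local filename
    opener.retrieve("http://example.com/file.zip", "file.zip")
except IOError as e:
    # socket errors (including timeouts) are IOError subclasses in Python 2.6+
    print("Download failed: %s" % e)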
Sorry, solved. I didn't realize that I could use something like this...
import os
import urllib2

fileName = url.split('/')[-1]
data = urllib2.urlopen(url, timeout=5)  # connection timeout set to 5 seconds
newF = open(os.path.join(os.getcwd(), fileName), "wb")
newF.write(data.read())
newF.close()
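If the timeout does fire, it surfaces as an exception rather than just returning, so it can be worth wrapping the download in a try/except. A rough sketch, assuming the same url variable as above (the exception handling is my addition, not part of the original answer):

import socket
import urllib2

url = "http://example.com/file.zip"  # placeholder
try:
    data = urllib2.urlopen(url, timeout=5)
    contents = data.read()
except urllib2.URLError as e:
    # a connect timeout is reported as URLError wrapping socket.timeout
    print("Could not connect: %s" % e)
except socket.timeout:
    # a timeout during read() is raised as socket.timeout directly
    print("Read timed out")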