Make urllib retry multiple times [duplicate]
My Python application makes a lot of HTTP requests using the urllib2
module. This application might be used over very unreliable networks where latencies could be high and dropped packets and network timeouts might be very common. Is it possible to override part of the urllib2
module so that each request is retried X number of times before raising any exceptions? Has anyone seen something like this?
Can I achieve this without modifying my whole application, just by creating a wrapper over the urllib2
module, so that any code making requests through it automatically gets the retry functionality?
Thanks.
Modifying parts of a library is never a good idea.
You can write wrappers around the methods you use to fetch data that provide the desired retry behavior; this is trivial to do.
For example, you can define functions with the same names as those in urllib2 in your own module called myurllib2, then just change the imports everywhere you use urllib2.
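A minimal sketch of such a module might look like the following (the module name myurllib2 comes from the suggestion above; MAX_RETRIES and RETRY_DELAY are assumed values you would tune for your network):

    # myurllib2.py -- hypothetical drop-in replacement for urllib2
    import time
    import urllib2
    from urllib2 import *  # re-export everything so existing names keep working

    MAX_RETRIES = 3    # assumed number of attempts before giving up
    RETRY_DELAY = 1.0  # assumed pause between attempts, in seconds

    def urlopen(*args, **kwargs):
        """Like urllib2.urlopen, but retries on URLError before raising."""
        last_error = None
        for attempt in range(MAX_RETRIES):
            try:
                return urllib2.urlopen(*args, **kwargs)
            except urllib2.URLError as e:  # covers timeouts and dropped connections
                last_error = e
                if attempt < MAX_RETRIES - 1:
                    time.sleep(RETRY_DELAY)
        raise last_error

Application code then only needs its import changed, e.g. import myurllib2 as urllib2, and every existing urllib2.urlopen(...) call picks up the retries. Note that urllib2.HTTPError is a subclass of URLError, so this sketch also retries on HTTP error statuses; catch more narrowly if that is not what you want.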