
Make urllib retry multiple times [duplicate]

This question already has answers here: Is there a pythonic way to try something up to a maximum number of times? (10 answers) Closed 7 months ago.

My Python application makes a lot of HTTP requests using the urllib2 module. This application might be used over very unreliable networks where latencies could be high and dropped packets and network timeouts might be very common. Is it possible to override a part of the urllib2 module so that each request is retried X number of times before raising any exceptions? Has anyone seen something like this?

Can I achieve this without modifying my whole application, just by creating a wrapper over the urllib2 module, so that any code making requests through this module automatically gets the retry functionality?

Thanks.


Modifying parts of a library is never a good idea.

You can write wrappers around the methods you use to fetch data that provide the desired behavior, which would be trivial.
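A minimal sketch of such a wrapper, assuming urllib2.urlopen is the call being wrapped; the function name, retry count, and delay below are illustrative choices, not something the answer specifies:

```python
import time
import urllib2


def urlopen_with_retry(url, retries=3, delay=1, **kwargs):
    """Call urllib2.urlopen, retrying on URLError up to `retries` times."""
    for attempt in range(retries):
        try:
            return urllib2.urlopen(url, **kwargs)
        except urllib2.URLError:
            if attempt == retries - 1:
                raise          # out of attempts: let the last error propagate
            time.sleep(delay)  # back off briefly before trying again
```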

For example, you can define functions with the same names as in urllib2 in your own module called myurllib2, then just change the imports everywhere you use urllib2.
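A sketch of such a drop-in module under the same assumptions (the module name myurllib2 comes from the answer; the retry defaults are illustrative):

```python
# myurllib2.py
from urllib2 import *            # re-export urllib2's public names (Request, URLError, ...)
import urllib2 as _urllib2
import time

_RETRIES = 3
_DELAY = 1


def urlopen(url, *args, **kwargs):
    """Drop-in replacement for urllib2.urlopen that retries on URLError."""
    for attempt in range(_RETRIES):
        try:
            return _urllib2.urlopen(url, *args, **kwargs)
        except _urllib2.URLError:
            if attempt == _RETRIES - 1:
                raise           # out of attempts: propagate the last error
            time.sleep(_DELAY)
```

Existing code then only needs its import changed, for example `import myurllib2 as urllib2`, and every urllib2.urlopen call in the application picks up the retry behaviour unmodified.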

