
Saving a file to the desktop from a given URL through a proxy server

My problem is that I want to save a file given by a URL, say something like 'http://www.somesitename.com/Something/filename.fileextension'. For example:

some_url = 'http://www.fordantitrust.com/files/python.pdf'
filename = 'myfile.pdf'

I want to download this file. I know I can do it easily with urllib.urlretrieve(some_url, filename) as long as there is no proxy between my system and the requested URL.
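Without a proxy, something along these lines works for me (the filename is just a local path I picked):

import urllib

some_url = 'http://www.fordantitrust.com/files/python.pdf'
filename = 'myfile.pdf'

# Works only when no proxy sits between my machine and the URL.
urllib.urlretrieve(some_url, filename)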

I am behind a proxy, so every time I want to download this file the request has to go through that proxy, and I don't know how to do this.

Any help is appreciated.


urllib.urlopen() has been deprecated since Python 2.6; use urllib2 instead. Generally, the proxy is handled transparently by urllib2 if a system-wide proxy is set. If not, use urllib2.ProxyHandler to set your proxy explicitly.
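For instance, if a system-wide proxy is exported through the environment, urllib2 picks it up on its own. This is only a sketch; proxy.example.com:3128 is a placeholder for your actual proxy address:

import os
import urllib2

# Placeholder proxy address; urllib2's default ProxyHandler reads http_proxy
# from the environment when no explicit handler has been installed.
os.environ['http_proxy'] = 'http://proxy.example.com:3128'

response = urllib2.urlopen('http://www.fordantitrust.com/files/python.pdf')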

Sample code from the Python docs:

import urllib2

# Route requests through the proxy and supply credentials for it.
proxy_handler = urllib2.ProxyHandler({'http': 'http://www.example.com:3128/'})
proxy_auth_handler = urllib2.ProxyBasicAuthHandler()
proxy_auth_handler.add_password('realm', 'host', 'username', 'password')

opener = urllib2.build_opener(proxy_handler, proxy_auth_handler)
# This time, rather than install the OpenerDirector, we use it directly:
opener.open('http://www.example.com/login.html')
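Applied to your URL, a minimal sketch could look like the following (the proxy address is a placeholder, and you can drop the auth handler if your proxy needs no login):

import urllib2

some_url = 'http://www.fordantitrust.com/files/python.pdf'
filename = 'myfile.pdf'

# Placeholder proxy address; replace with your proxy host and port.
proxy_handler = urllib2.ProxyHandler({'http': 'http://proxy.example.com:3128/'})
opener = urllib2.build_opener(proxy_handler)

# Install the opener so plain urllib2.urlopen() also goes through the proxy.
urllib2.install_opener(opener)

# Fetch the file and write it to disk.
response = urllib2.urlopen(some_url)
with open(filename, 'wb') as f:
    f.write(response.read())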