
How do I configure WWW::Mechanize to work behind a proxy and https?

I've written Perl code using WWW::Mechanize to retrieve a webpage. Retrieving http pages works fine, but it doesn't work for https. I've checked and I have the Crypt::SSLeay package installed. What else could be wrong?

The error message is:

Error GETing https://www.temp.com: Can't Connect to www.temp.com:443 <Bad hostname 'www.temp.com'> at scrape.pl line 8


I've seen in your related Mechanize question that you call the proxy method with only the http and ftp schemes. Try again with https included.
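Concretely, that call would look something like this (the proxy host and port are placeholders for your own):

```perl
use strict;
use warnings;
use WWW::Mechanize;

my $mech = WWW::Mechanize->new;

# Register the proxy for the https scheme as well,
# not only for http and ftp.
$mech->proxy( [qw(http https ftp)], 'http://proxy.example.com:8080/' );

$mech->get('https://www.temp.com');
```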

It's probably more useful to set the proxy environment variables, since then all programs can take advantage of that central configuration instead of each program being configured separately. Don't forget https_proxy. Then call the env_proxy method instead of proxy to pick them up.
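A minimal sketch of that approach (the variables would normally be exported in your shell; they're set inside the script here only to keep the example self-contained, and the proxy URL is a placeholder):

```perl
use strict;
use warnings;
use WWW::Mechanize;

# Normally done in the shell, e.g.
#   export http_proxy=http://proxy.example.com:8080/
#   export https_proxy=http://proxy.example.com:8080/
$ENV{http_proxy}  = 'http://proxy.example.com:8080/';
$ENV{https_proxy} = 'http://proxy.example.com:8080/';

my $mech = WWW::Mechanize->new;
$mech->env_proxy;    # read proxy settings from the environment

$mech->get('https://www.temp.com');
```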


Apparently, I needed to add the following to my script:

$ENV{'HTTPS_PROXY'} = 'http://proxy:port/';

so that Crypt::SSLeay would pick up the proxy.
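In context, a complete sketch looks like this (note that Crypt::SSLeay reads the uppercase HTTPS_PROXY variable itself, independently of env_proxy; the proxy URL is a placeholder):

```perl
use strict;
use warnings;
use WWW::Mechanize;

# Crypt::SSLeay consults HTTPS_PROXY at request time,
# so set it before the first https request.
$ENV{HTTPS_PROXY} = 'http://proxy.example.com:8080/';

my $mech = WWW::Mechanize->new;
$mech->get('https://www.temp.com');
```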


In case someone stumbles over this old question: the situation has changed in recent years.

  • Starting with version 6, LWP uses IO::Socket::SSL as its backend.
  • IO::Socket::SSL does not provide its own proxy hacks like Crypt::SSLeay does, and LWP's https proxy support as documented (e.g. using the proxy method or env_proxy) was broken.
  • With version 6.06 of both LWP::UserAgent and LWP::Protocol::https (which are now separate distributions!), https proxying works as expected and documented.
  • For older versions of LWP one can use Net::SSLGlue::LWP.
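For the older-LWP case, the module's intended use is simply to load it before making requests; a minimal sketch (the target URL and proxy configuration via the environment are assumed):

```perl
use strict;
use warnings;

# On LWP older than 6.06, loading Net::SSLGlue::LWP patches LWP
# so the documented proxy mechanisms (proxy/env_proxy) also work
# for https.
use Net::SSLGlue::LWP;
use WWW::Mechanize;

my $mech = WWW::Mechanize->new;
$mech->env_proxy;    # assumes https_proxy is set in the environment
$mech->get('https://www.temp.com');
```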