#3 by jaraco (2011-12-02 21:08:14 UTC)
On 2011/12/02 20:39:05, Augie fackler wrote:
> I believe that https should respect http_proxy if https_proxy isn't set, but I
> could also be making that up. Also, I've never actually heard of/seen anyone use
> https_proxy.
That seems like a good suggestion. My intention was to implement the behavior
consistent with the community conventions and with the implementation in
urllib2. I'll double-check the spec and urllib2 behavior and make it consistent
or comment any justifiable deviance.
We use https_proxy where I work, though it's always set to the same thing as the
http_proxy.
Thanks for the suggestions.
#4 (2011-12-02 22:45:17 UTC)
On 2011/12/02 21:08:14, jaraco wrote:
> On 2011/12/02 20:39:05, Augie fackler wrote:
> > I believe that https should respect http_proxy if https_proxy isn't set, but I
> > could also be making that up. Also, I've never actually heard of/seen anyone use
> > https_proxy.
>
> That seems like a good suggestion. My intention was to implement the behavior
> consistent with the community conventions and with the implementation in
> urllib2. I'll double-check the spec and urllib2 behavior and make it consistent
> or comment any justifiable deviance.
I couldn't find any examples online where HTTPS requests fall back to the
http_proxy. As best I can tell, https_proxy must be set for HTTPS requests to
go through the proxy. This behavior is consistent with my findings in urllib2:
>>> import os
>>> del os.environ['https_proxy']
>>> import urllib2
>>> res = urllib2.urlopen('https://www.jaraco.com')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python2.6/urllib2.py", line 126, in urlopen
    return _opener.open(url, data, timeout)
  File "/usr/lib/python2.6/urllib2.py", line 391, in open
    response = self._open(req, data)
  File "/usr/lib/python2.6/urllib2.py", line 409, in _open
    '_open', req)
  File "/usr/lib/python2.6/urllib2.py", line 369, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.6/urllib2.py", line 1169, in https_open
    return self.do_open(httplib.HTTPSConnection, req)
  File "/usr/lib/python2.6/urllib2.py", line 1136, in do_open
    raise URLError(err)
urllib2.URLError: <urlopen error [Errno 110] Connection timed out>
In the example above, http_proxy is still set, and if urllib2 were to fall back,
it would be able to request that URL just fine, but it does not. It tries to
connect directly and times out (due to firewall restrictions for direct
connections).
Unless someone can find an authoritative specification or widely-used
representative implementation that suggests falling back to http_proxy, I
believe the current implementation is correct.
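The no-fallback behavior argued above can also be observed in the modern urllib (the Python 3 successor to urllib2), which builds its proxy map strictly per scheme from the *_proxy environment variables. A minimal sketch; the proxy URL is a hypothetical placeholder, not a real host:

```python
import os
import urllib.request

# Hypothetical environment: only http_proxy is set, no https_proxy at all.
os.environ.pop("https_proxy", None)
os.environ.pop("HTTPS_PROXY", None)
os.environ.pop("REQUEST_METHOD", None)  # CGI guard: urllib ignores http_proxy when this is set
os.environ["http_proxy"] = "http://proxy.example.com:3128"  # placeholder value

# getproxies() creates one entry per scheme from the *_proxy variables;
# it does not copy the 'http' entry over to 'https'.
proxies = urllib.request.getproxies()
print(proxies)
```

The printed mapping contains an 'http' key but no 'https' key, so HTTPS requests made through this map would bypass the proxy, matching the urllib2 timeout shown above.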
Issue 5451071: Automatic proxy detection support
Created 12 years, 5 months ago by jaraco
Modified 12 years, 5 months ago
Reviewers: jcgregorio, Augie fackler
Base URL:
Comments: 1