
Python and urllib2: how to make a GET request with parameters

I'm building an "API API": basically a wrapper for an in-house REST web service that the web app will be making a lot of requests to. Some of the web service calls need to be GET rather than POST, but still need to pass parameters.

Is there a "best practice" way to encode a dictionary into a query string? e.g.: ?foo=bar&bla=blah

I'm looking at the urllib2 docs, and it looks like it decides by itself whether to use POST or GET based on whether you pass params or not, but maybe someone knows how to make it transform the params dictionary into a GET request.

Maybe there's a package for something like this out there? It would be great if it supported keep-alive, as the web server will be constantly requesting things from the REST service.

Ideally something that would also transform the XML into some kind of traversable python object.

Thanks!


Is urllib.urlencode() not enough?

>>> import urllib
>>> urllib.urlencode({'foo': 'bar', 'bla': 'blah'})
'foo=bar&bla=blah'

EDIT:

You can also update the existing url:

  >>> import urlparse, urllib
  >>> url_dict = urlparse.parse_qs('a=b&c=d')
  >>> url_dict
  {'a': ['b'], 'c': ['d']}
  >>> url_dict['a'].append('x')
  >>> url_dict
  {'a': ['b', 'x'], 'c': ['d']}
  >>> urllib.urlencode(url_dict, True)
  'a=b&a=x&c=d'

Note that the parse_qs function lived in the cgi module before Python 2.6.

EDIT 23/04/2012:

You can also take a look at python-requests - it should kill urllibs eventually :)
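A GET with parameters in requests looks something like the sketch below (the URL is a placeholder; a Session object reuses the underlying connection, which covers the keep-alive requirement):

  >>> import requests
  >>> s = requests.Session()          # connection pooling / keep-alive
  >>> r = s.get('http://example.com/api', params={'foo': 'bar', 'bla': 'blah'})
  >>> r.url                           # parameter order may vary
  'http://example.com/api?foo=bar&bla=blah'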


urllib.urlencode

And yes, the urllib / urllib2 division of labor is a little confusing in Python 2.x.


import urllib

data = {}
data["val1"] = "VALUE1"
data["val2"] = "VALUE2"
data["val3"] = "VALUE3"

# urlencode turns the dict into a query string such as "val1=VALUE1&val2=VALUE2&val3=VALUE3"
url_values = urllib.urlencode(data)
url = "https://www.python.org"
print url + "?" + url_values

url_values is the encoded query string; appended to the URL (url + "?" + url_values), it forms the full URL for a GET request.
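To actually perform the GET rather than just print the URL, the same string can be handed to urllib2 (a small sketch continuing the snippet above):

import urllib2
# uses url and url_values from the snippet above
response = urllib2.urlopen(url + "?" + url_values)
print response.read()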

