How to use the same cookies in multiple requests in Python?

I am using this code:

import urllib2

def req(url, postfields):
    # postfields is unused for now; this only makes GET requests.
    proxy_support = urllib2.ProxyHandler({"http": "127.0.0.1:8118"})
    opener = urllib2.build_opener(proxy_support)
    opener.addheaders = [('User-agent', 'Mozilla/5.0')]
    return opener.open(url).read()

to make a simple HTTP GET request (using Tor as the proxy).

Now I would like to know how to make multiple requests using the same cookies.

For example:

req('http://loginpage', 'postfields')
source = req('http://pageforloggedinonly', 0)
#do stuff with source
req('http://anotherpageforloggedinonly', 'StuffFromSource')

I know that my function req doesn't support POST (yet), but I have sent postfields using httplib before, so I can probably figure that out myself. What I don't understand is how to use cookies. The examples I've seen all make a single request, but I want to reuse the cookie from the first login request in the succeeding requests, or save/load the cookie from a file (like curl does), which would make everything easier.

The code I posted is only to illustrate what I am trying to achieve; I think I will use httplib(2) for the final app.
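For reference, the pattern you describe can be sketched like this in Python 3, where urllib2 and cookielib became urllib.request and http.cookiejar. The login URL and form field names below are placeholders, not real endpoints; the key idea is that one CookieJar shared by one opener carries the session across requests:

```python
import urllib.parse
import urllib.request
from http.cookiejar import CookieJar

# One jar + one opener, shared by every request: cookies set by any
# response are automatically sent back on all later requests.
cj = CookieJar()
opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(cj))
opener.addheaders = [('User-agent', 'Mozilla/5.0')]

def req(url, postfields=None):
    # Passing url-encoded bytes as data turns the request into a POST.
    data = urllib.parse.urlencode(postfields).encode() if postfields else None
    return opener.open(url, data).read()

# Hypothetical flow: the login response stores cookies in cj, and every
# later call through the same opener sends them back.
# req('http://loginpage', {'user': 'me', 'pass': 'secret'})
# source = req('http://pageforloggedinonly')
```

A proxy handler (like the ProxyHandler above) can simply be passed to build_opener alongside the HTTPCookieProcessor; handlers compose.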

UPDATE:

cookielib.LWPCookieJar worked fine; here's a sample I did for testing:

import urllib2, cookielib, os

def request(url, postfields, cookie):
    cj = cookielib.LWPCookieJar()

    # Reload previously saved cookies, if any.
    if os.path.isfile(cookie):
        cj.load(cookie)

    # The HTTPCookieProcessor makes the opener store and resend cookies.
    opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cj))
    urllib2.install_opener(opener)
    txheaders = {'User-agent': 'Mozilla/4.0 (compatible; MSIE 5.5; Windows NT)'}

    req = urllib2.Request(url, postfields, txheaders)
    handle = urllib2.urlopen(req)
    cj.save(cookie)  # persist cookies for the next call
    return handle.read()

print request('http://google.com', None, 'cookie.txt')
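One caveat with the file-based approach: session cookies (the kind most login systems use, with no expiry) are skipped by save() and load() by default. Passing ignore_discard=True keeps them, which is usually what you want for the curl-style persistence described above. A minimal sketch, shown with the Python 3 module name:

```python
from http.cookiejar import LWPCookieJar  # cookielib.LWPCookieJar in Python 2

cj = LWPCookieJar('cookie.txt')

# Session cookies carry no expiry, so save()/load() skip them by default;
# ignore_discard=True keeps them across runs.
cj.save(ignore_discard=True)
cj.load(ignore_discard=True)
```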


The cookielib module is what you need to do this. There's a nice tutorial with some code samples.
