Checking a lot of URLs to see if they return 200. What's the cleverest way?

I need to check a lot (~10 million) of URLs to see if they exist (return 200). I've written the following code to do this per-URL, but to do all of the URLs will take approximately forever.

from urllib.parse import urlparse
import http.client

def is_200(url):
    try:
        parsed = urlparse(url)
        conn = http.client.HTTPConnection(parsed.netloc, timeout=10)
        conn.request("HEAD", parsed.path or "/")
        res = conn.getresponse()
        return res.status == 200
    except KeyboardInterrupt:
        raise
    except Exception:
        return False

The URLs are spread across about a dozen hosts, so it seems like I should be able to take advantage of this to pipeline my requests and reduce connection overhead. How would you build this? I'm open to any programming/scripting language.
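Since the URLs cluster onto about a dozen hosts, one way to exploit that is to group them by host and reuse a single keep-alive connection per host. This is only a sketch using the standard library's `http.client`; the function names and the `urls` iterable are hypothetical, not from the question:

```python
from collections import defaultdict
from urllib.parse import urlparse
import http.client

def group_by_host(urls):
    """Map each netloc to the list of paths to check on it."""
    groups = defaultdict(list)
    for url in urls:
        parsed = urlparse(url)
        groups[parsed.netloc].append(parsed.path or "/")
    return groups

def check_host(netloc, paths):
    """Reuse one keep-alive connection for all paths on a single host."""
    results = {}
    conn = http.client.HTTPConnection(netloc, timeout=10)
    for path in paths:
        try:
            conn.request("HEAD", path)
            res = conn.getresponse()
            res.read()  # drain the response so the connection can be reused
            results[path] = res.status == 200
        except Exception:
            results[path] = False
            # The connection may be in a broken state; start a fresh one.
            conn.close()
            conn = http.client.HTTPConnection(netloc, timeout=10)
    conn.close()
    return results
```

Each host's batch is independent, so the per-host batches can then be farmed out to threads or processes.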


Have a look at urllib3. It supports per-host connection reuse. Additionally, using multiple processes/threads or async I/O would be a good idea.
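A minimal sketch of this suggestion, assuming urllib3 is installed (`pip install urllib3`); the pool size and timeout values are arbitrary choices, not something from the answer:

```python
import urllib3

# PoolManager keeps a connection pool per host and reuses connections
# across requests to the same host automatically.
http = urllib3.PoolManager(maxsize=10)

def is_200(url):
    try:
        # HEAD avoids transferring the body; don't follow redirects.
        resp = http.request("HEAD", url, redirect=False, timeout=5.0)
        return resp.status == 200
    except Exception:
        return False
```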


All of the code below is Python 3.x.

I would create worker threads that check for a 200 response. Here's an example. The thread pool (put it in threadpool.py):

# http://code.activestate.com/recipes/577187-python-thread-pool/

from queue import Queue
from threading import Thread

class Worker(Thread):
    """Pulls tasks off the shared queue until the program exits."""
    def __init__(self, tasks):
        Thread.__init__(self)
        self.tasks = tasks
        self.daemon = True  # don't keep the interpreter alive on exit
        self.start()

    def run(self):
        while True:
            func, args, kargs = self.tasks.get()
            try:
                func(*args, **kargs)
            except Exception as exception:
                print(exception)
            self.tasks.task_done()

class ThreadPool:
    def __init__(self, num_threads):
        # A bounded queue makes add_task block once every worker is busy,
        # keeping memory flat even with millions of queued URLs.
        self.tasks = Queue(num_threads)
        for _ in range(num_threads): Worker(self.tasks)

    def add_task(self, func, *args, **kargs):
        self.tasks.put((func, args, kargs))

    def wait_completion(self):
        # Blocks until every queued task has been marked done.
        self.tasks.join()

Now, if urllist contains your URLs, then your main file should be along these lines:

numconns = 40
workers = threadpool.ThreadPool(numconns)
results = [None] * len(urllist)

def check200(url, index):
    results[index] = is_200(url)

try:
    for index, url in enumerate(urllist):
        workers.add_task(check200, url, index)
except KeyboardInterrupt:
    print("Shutting down application, hang on...")

# Wait for all queued checks to finish before reading results.
workers.wait_completion()
Note that this program combines with the other suggestions posted here; its throughput depends only on is_200().
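For comparison, the hand-rolled pool above can also be expressed with the standard library's concurrent.futures (Python 3.2+). This sketch takes the check function as a parameter rather than assuming the question's is_200 is in scope:

```python
from concurrent.futures import ThreadPoolExecutor

def check_all(check, urllist, num_workers=40):
    """Run check(url) across a thread pool; results come back in input order."""
    with ThreadPoolExecutor(max_workers=num_workers) as pool:
        return list(pool.map(check, urllist))
```

pool.map preserves input order, so there is no need to carry an explicit index through each task.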
