
Threading in Python: retrieve return value when using target= [duplicate]

This question already has answers here (closed 10 years ago).

Possible Duplicate: Return value from thread

I want to get the "free memory" of a bunch of servers like this:

import os

def get_mem(servername):
    # read MemFree from /proc/meminfo on the remote host via ssh
    res = os.popen('ssh %s "grep MemFree /proc/meminfo | sed \'s/[^0-9]//g\'"' % servername)
    return res.read().strip()

Since this can be threaded, I want to do something like this:

import threading  
thread1 = threading.Thread(target=get_mem, args=("server01", ))  
thread1.start()

But now: how can I access the return value(s) of the get_mem function? Do I really need to go the full-fledged route of creating a MemThread(threading.Thread) subclass and overriding __init__ and run?
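(For reference, the subclass approach mentioned above would look roughly like the sketch below; MemThread comes from the question itself, while the result attribute is just an illustrative name.)

import threading

class MemThread(threading.Thread):
    """Thread that stores get_mem()'s return value on the instance."""

    def __init__(self, servername):
        super().__init__()
        self.servername = servername
        self.result = None  # filled in by run()

    def run(self):
        self.result = get_mem(self.servername)

# usage:
#   t = MemThread("server01")
#   t.start()
#   t.join()
#   print(t.result)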


You could create a synchronised queue, pass it to the thread function and have it report back by pushing the result into the queue, e.g.:

import os

def get_mem(servername, q):
    res = os.popen('ssh %s "grep MemFree /proc/meminfo | sed \'s/[^0-9]//g\'"' % servername)
    q.put(res.read().strip())

# ...

import threading, queue
q = queue.Queue()
threading.Thread(target=get_mem, args=("server01", q)).start()
result = q.get()  # blocks until the worker puts its value
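If you need to poll several servers, start one thread per host and drain the queue afterwards. The sketch below is a variation of the worker above that puts (servername, value) tuples, since results arrive in completion order; the host names are placeholders.

import os
import queue
import threading

def get_mem(servername, q):
    res = os.popen('ssh %s "grep MemFree /proc/meminfo | sed \'s/[^0-9]//g\'"' % servername)
    # tag the result with the host name: threads finish in arbitrary order
    q.put((servername, res.read().strip()))

servers = ["server01", "server02", "server03"]
q = queue.Queue()
threads = [threading.Thread(target=get_mem, args=(s, q)) for s in servers]
for t in threads:
    t.start()
for t in threads:
    t.join()

free_mem = dict(q.get() for _ in servers)  # e.g. {'server01': '123456', ...}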


For the record, this is what I finally came up with (derived from the multiprocessing examples):

from multiprocessing import Process, Queue

def execute_parallel(hostnames, command, max_processes=None):
    """
    run the command parallely on the specified hosts, returns output of the commands as dict

    >>> execute_parallel(['host01', 'host02'], 'hostname')
    {'host01': 'host01', 'host02': 'host02'}
    """
    NUMBER_OF_PROCESSES = max_processes if max_processes else len(hostnames)

    def worker(jobs, results):
        # pull (hostname, command) jobs until the 'STOP' sentinel arrives
        for hostname, command in iter(jobs.get, 'STOP'):
            results.put((hostname, execute_host_return_output(hostname, command)))

    job_queue = Queue()
    result_queue = Queue()

    for hostname in hostnames:
        job_queue.put((hostname, command))

    for i in range(NUMBER_OF_PROCESSES):
        Process(target=worker, args=(job_queue, result_queue)).start()

    # collect exactly one result per host (arrival order is arbitrary)
    result = {}
    for i in range(len(hostnames)):
        hostname, output = result_queue.get()
        result[hostname] = output

    # tell the processes to stop
    for i in range(NUMBER_OF_PROCESSES):
        job_queue.put('STOP')

    return result
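The helper execute_host_return_output is not shown above; a minimal stand-in that follows the same ssh-over-os.popen pattern as get_mem could look like this (purely illustrative, no error handling):

import os

def execute_host_return_output(hostname, command):
    # run `command` on `hostname` via ssh and return its stripped stdout
    # illustrative only: a real version would use subprocess and handle errors/timeouts
    return os.popen('ssh %s "%s"' % (hostname, command)).read().strip()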
