
How to share a dictionary between multiple processes in Python without locking

I need to share a huge dictionary (around 1 GB in size) between multiple processes. Since all processes will only ever read from it, I don't need locking.

Is there any way to share a dictionary without locking?

The multiprocessing module in Python provides an Array class which allows sharing without a lock by setting

lock=False

However, there is no such option for the dict provided by Manager in the multiprocessing module.
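For reference, the Array route mentioned above looks like this (a minimal sketch; the worker function name is mine). With lock=False you get a raw shared ctypes array with no synchronization wrapper at all, which is fine as long as every process only reads:

```python
from multiprocessing import Array, Process

def print_total(a):
    # Read-only access: summing the shared array needs no lock.
    print(sum(a))

if __name__ == '__main__':
    # lock=False returns a raw shared ctypes array (no lock object attached)
    shared = Array('i', range(5), lock=False)
    p = Process(target=print_total, args=(shared,))
    p.start()
    p.join()  # the child prints 10
```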


Well, in fact the dict you get from a Manager has no locks at all! I guess this is true for the other shared objects you can create through the manager too. How do I know this? I tried:

from multiprocessing import Process, Manager

def f(d):
    # each increment is a read followed by a write -- not atomic
    for i in range(10000):
        d['blah'] += 1

if __name__ == '__main__':
    manager = Manager()

    d = manager.dict()
    d['blah'] = 0
    procs = [ Process(target=f, args=(d,)) for _ in range(10) ]
    for p in procs:
        p.start()
    for p in procs:
        p.join()

    print(d)

If there were locks on d, the result would be 100000 (10 processes × 10000 increments each). Instead, the result is fairly random, which is a nice illustration of why locks are needed when you modify shared state ;-)

So just go ahead and use manager.dict().
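Putting that together for the read-only case in the question, a minimal sketch (the key names and the lookup helper are illustrative): each worker receives the dict proxy and only reads from it, so no extra locking is involved:

```python
from multiprocessing import Process, Manager

def lookup(shared, key, results, slot):
    # every worker only reads from the shared dict
    results[slot] = shared[key]

if __name__ == '__main__':
    manager = Manager()
    shared = manager.dict({'alpha': 1, 'beta': 2})
    results = manager.list([None, None])
    procs = [
        Process(target=lookup, args=(shared, 'alpha', results, 0)),
        Process(target=lookup, args=(shared, 'beta', results, 1)),
    ]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print(list(results))  # [1, 2]
```

One caveat: every access to a manager.dict() is a round trip to the manager process over IPC, so for a ~1 GB dictionary that is read heavily, lookups will be much slower than on a plain in-process dict.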

