
Multi-threading different scripts

I have a few scripts written in Python. I am trying to multi-thread them.

When Script A starts, I would like scripts B, C, and D to start. After A runs, I would like A2 to run. After B runs, I would like B2 to run, then B3. C and D have no follow-up scripts.

I have checked that the scripts are independent of each other.

I am planning on using "exec" to launch them, and would like to use this "launcher" on Linux and Windows.

I have other multi-threaded scripts, but they mainly do one procedure A with five threads. What's throwing me here is that all the procedures are different, but they could start and run at the same time.


OK, I'm still not sure where exactly your problem is, but this is the way I'd solve it:

#Main.py
from multiprocessing import Process
import ScriptA
# import all other scripts as well

def handle_script_a(*args):
    print("Call one or several functions from Script A or calculate some stuff beforehand")
    ScriptA.foo(*args)

if __name__ == '__main__':
    p = Process(target=handle_script_a, args=("Either so", ))
    p1 = Process(target=ScriptA.foo, args=("or so", ))
    p.start()
    p1.start()
    p.join()
    p1.join()

# ScriptA.py:
def foo(*args):
    print("Function foo called with args:")
    for arg in args:
        print(arg)

You can either call a function directly, or if you want to call several functions in one process, use a small wrapper for it. No platform-dependent code, no ugly execs, and you can create/join processes easily in whatever way you fancy.
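
For the chain you describe (A then A2, B then B2 then B3, C and D standalone), here's a small sketch of that wrapper idea, assuming each script exposes a run() function; the module and function names are made up:

# Chains.py - sketch of the question's dependency chains. Each chain gets
# its own process; scripts inside a chain run one after another.
# Assumes hypothetical modules ScriptA, ScriptA2, ... each with a run().
from multiprocessing import Process
import ScriptA, ScriptA2, ScriptB, ScriptB2, ScriptB3, ScriptC, ScriptD

def chain(*funcs):
    # Run the given functions sequentially inside one process.
    for func in funcs:
        func()

if __name__ == '__main__':
    procs = [
        Process(target=chain, args=(ScriptA.run, ScriptA2.run)),
        Process(target=chain, args=(ScriptB.run, ScriptB2.run, ScriptB3.run)),
        Process(target=ScriptC.run),
        Process(target=ScriptD.run),
    ]
    for p in procs:
        p.start()   # all four chains begin at the same time
    for p in procs:
        p.join()    # wait until every chain has finished

Because chain() and the run functions are all module-level, this also pickles cleanly on Windows, where multiprocessing spawns new interpreters instead of forking.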

And a small example of a queue for interprocess communication - pretty much stolen from the Python docs, but well ;)

from multiprocessing import Process, Queue

def f(q):
    q.put([42, None, 'hello'])

if __name__ == '__main__':
    q = Queue()
    p = Process(target=f, args=(q,))
    p.start()
    print(q.get())    # prints "[42, None, 'hello']"
    p.join()

Create the queue and hand it to one or more processes. Note that get() blocks; if you want non-blocking behaviour you can use get_nowait() or specify a timeout as the second argument. If you want shared objects, there are multiprocessing.Array and multiprocessing.Value; read the multiprocessing documentation for the specifics. If you've got more questions related to IPC, create a new question; it's an extremely large topic in itself.
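
As a quick illustration of both points (the worker and variable names here are just placeholders), a get() with a timeout plus a shared Value might look like this:

from multiprocessing import Process, Queue, Value
import queue  # provides the Empty exception raised on timeout

def worker(q, counter):
    q.put('done')
    with counter.get_lock():    # Value carries a lock for safe updates
        counter.value += 1

if __name__ == '__main__':
    q = Queue()
    counter = Value('i', 0)     # shared integer, starts at 0
    p = Process(target=worker, args=(q, counter))
    p.start()
    try:
        msg = q.get(True, 5)    # block for at most 5 seconds
    except queue.Empty:
        msg = None              # nothing arrived in time
    p.join()
    print(msg, counter.value)   # prints "done 1" if the worker ran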


So it doesn't have to be a Python launcher? Back when I was doing heavy sysadmin work, I wrote a Perl script using the POE framework to run scripts or whatever with limited concurrency. Worked great, for example when we had to run a script over a thousand user accounts or a couple of hundred databases. Limit it to just 4 jobs at a time on a 4-CPU box, 16 on a 16-way server, or any arbitrary number. POE does use fork() to create child procs, but on Windows boxes that works fine under Cygwin, FWIW.

A while back I was looking for an equivalent event framework for Python. Looking again today I see Twisted, and some posts indicating that it runs even faster than POE, but maybe Twisted is mostly for network client/server work? POE is incredibly flexible. It's tricky at first if you're not used to event-driven scripting, and even if you are, but events are a lot easier to grok than threads. (Maybe overkill for your needs? Years later, I'm still surprised there's not a simple utility to control throughput on multi-CPU machines.)
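
For what it's worth, the Python standard library can do that throughput limiting on its own nowadays; here's a rough sketch with multiprocessing.Pool, where the job function and account list are invented for illustration:

# Cap concurrency at the CPU count, like the POE launcher described above.
from multiprocessing import Pool, cpu_count

def run_job(account):
    # Placeholder for the real per-account script.
    return "processed " + account

if __name__ == '__main__':
    accounts = ["user%d" % i for i in range(1000)]
    with Pool(processes=cpu_count()) as pool:   # e.g. 4 jobs on a 4-CPU box
        for result in pool.imap_unordered(run_job, accounts):
            pass  # handle each result as it completes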
