Python multiprocessing cannot control multiple long-running console exes?
I am a newbie in Python. I recently tried to use a Python script to call a console exe, which is a long-running process. I want to launch the exe as many times as the CPU permits, and when an exe finishes its job it should release the CPU to new jobs. So I think I need some kind of multi-process control mechanism.
Since multiprocessing can only call a Python callable, it cannot run the console exe directly, so I wrapped subprocess.Popen(cmd) in a Python function. However, after doing so I found that the exe starts before multiprocessing.Process.start() is even called, and the program freezes waiting until the exe finishes (which takes a long time) instead of giving control back to me. That is not what I want.
My code is posted below:
import sys
import os
import multiprocessing
import subprocess

def subprocessExe(cmd):
    return subprocess.call(cmd, shell=False, stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE, creationflags=0x08000000)

if __name__ == '__main__':
    p = multiprocessing.Process(target=self.subprocessExe(exeFileName))
    p.start()
    print p, p.is_alive()
Thank you for your time!
You're calling subprocessExe when you create the multiprocessing.Process object. You should instead do this:
p = multiprocessing.Process(target=subprocessExe, args=(exeFileName,))
And then it should work.
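For reference, here is a minimal sketch of the question's code with that fix applied. The exe path is a placeholder (not from the question), and the final join() is optional, depending on whether you want to wait for the exe at that point:

import multiprocessing
import subprocess

def subprocessExe(cmd):
    # run the console exe inside this worker process;
    # call() blocks here, not in the parent
    return subprocess.call(cmd, shell=False,
                           stdout=subprocess.PIPE, stderr=subprocess.PIPE)

if __name__ == '__main__':
    exeFileName = 'myTool.exe'   # placeholder path, substitute your own exe
    # pass the function object and its arguments; do NOT call it here
    p = multiprocessing.Process(target=subprocessExe, args=(exeFileName,))
    p.start()
    print p, p.is_alive()        # the parent keeps running immediately
    p.join()                     # wait for the exe only when you choose to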
There are a number of things wrong in your test case. The following works for me:
import multiprocessing, subprocess

def subprocessExe(cmd):
    subprocess.call([cmd], shell=False)

p = multiprocessing.Process(target=subprocessExe, args=('/full/path/to/script.sh',))
p.start()
OR
subprocess.call([cmd], shell=True)
p = multiprocessing.Process(target=subprocessExe, args=('script.sh',))
OR
subprocess.Popen([cmd], shell=True, stdout=subprocess.PIPE,
                 stderr=subprocess.PIPE)
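Since the question also asks about running only as many copies of the exe as the CPU permits, one way to get that (not part of the answer above, just a suggestion) is a multiprocessing.Pool sized to the CPU count. This is a sketch; the command list below is made up and should be replaced with your real exe paths:

import multiprocessing
import subprocess

def subprocessExe(cmd):
    # each pool worker blocks on its own exe; a worker that finishes
    # automatically picks up the next command from the list
    return subprocess.call(cmd, shell=False)

if __name__ == '__main__':
    # hypothetical commands -- replace with the real exe paths/arguments
    cmds = ['tool.exe', 'tool.exe', 'tool.exe', 'tool.exe']
    pool = multiprocessing.Pool(processes=multiprocessing.cpu_count())
    exitCodes = pool.map(subprocessExe, cmds)   # blocks until all runs finish
    pool.close()
    pool.join()
    print exitCodes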