Is it possible to run a function in a subprocess without threading or writing a separate file/script?
import subprocess

def my_function(x):
    return x + 100

output = subprocess.Popen(my_function, 1)  # I would like to pass the function object and its arguments
print output
# desired output: 101
I have only found documentation on opening subprocesses using separate scripts. Does anyone know how to pass function objects or even an easy way to pass function code?
I think you're looking for something more like the multiprocessing module:
http://docs.python.org/library/multiprocessing.html#the-process-class
The subprocess module is for spawning processes and doing things with their input/output - not for running functions.
Here is a multiprocessing version of your code:
from multiprocessing import Process, Queue

# must be a global function
def my_function(q, x):
    q.put(x + 100)

if __name__ == '__main__':
    queue = Queue()
    p = Process(target=my_function, args=(queue, 1))
    p.start()
    p.join()  # this blocks until the process terminates
    result = queue.get()
    print result
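If all you need is the return value, a multiprocessing.Pool is a slightly shorter variant that avoids the explicit Queue. A minimal Python 3 sketch (not from the original answer, just the same idea with a different API):

from multiprocessing import Pool

# must be a global function
def my_function(x):
    return x + 100

if __name__ == '__main__':
    with Pool(processes=1) as pool:
        result = pool.apply(my_function, (1,))  # runs in a worker process and returns the value
        print(result)  # 101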
You can use the standard Unix fork system call, as os.fork(). fork() will create a new process, with the same script running. In the new process, it will return 0, while in the old process it will return the process ID of the new process.
import os

child_pid = os.fork()
if child_pid == 0:
    print "New proc"
else:
    print "Old proc"
For a higher-level library that provides a portable abstraction for using multiple processes, there's the multiprocessing module. There's an article on IBM DeveloperWorks, Multiprocessing with Python, with a brief introduction to both techniques.
Brian McKenna's post above about multiprocessing is really helpful, but if you want to go down the threaded route (as opposed to process-based), this example will get you started:
import threading
import time

def blocker():
    while True:
        print "Oh, sorry, am I in the way?"
        time.sleep(1)

t = threading.Thread(name='child procs', target=blocker)
t.start()
# Prove that we passed through the blocking call
print "No, that's okay"
You can also use the setDaemon(True) feature to background the thread immediately.
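A minimal sketch of that, assuming the same blocker function from above (a daemon thread is killed automatically when the main program exits, so it never blocks shutdown):

t = threading.Thread(name='child procs', target=blocker)
t.setDaemon(True)  # must be set before start(); t.daemon = True also works
t.start()
print "Main thread exits; the daemon thread dies with it"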
You can use concurrent.futures.ProcessPoolExecutor, which not only propagates the return value, but also any exceptions:
import concurrent.futures

# must be a global function
def my_function(x):
    if x < 0:
        raise ValueError
    return x + 100

with concurrent.futures.ProcessPoolExecutor() as executor:
    f = executor.submit(my_function, 1)
    ret = f.result()  # will rethrow any exceptions
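For example, the exception-propagation part works like this (a small sketch continuing the code above):

with concurrent.futures.ProcessPoolExecutor() as executor:
    f = executor.submit(my_function, -1)
    try:
        f.result()
    except ValueError:
        # the ValueError raised in the worker process is re-raised here
        print("caught ValueError from the worker")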