Remotely start a Python program in the background
I need to use a fabfile to remotely start a program on remote boxes from time to time and collect the results. Since the program takes a long while to finish, I want it to run in the background so I don't have to wait. I first tried os.fork(): when I ssh to the remote box and start the program with os.fork() there, it runs in the background fine, but when I start it remotely through fabfile's run or sudo, os.fork() does not work and the program just dies silently.

So I switched to python-daemon to daemonize the program. For a good while it worked perfectly. But now that my program reads some Python shelve dicts, python-daemon no longer works. It seems that under python-daemon the shelve dicts cannot be loaded correctly, and I don't know why. Besides os.fork() and python-daemon, what else can I try to solve my problem?
If I understand your question right, I think you're making this far too complicated. os.fork() is for multiprocessing, not for running a program in the background.

Let's say, for the sake of discussion, that you want to run program.sh and collect what it sends to standard output. To do this with fabric, create locally:
fabfile.py:

from fabric.api import run

def runmyprogram():
    run('./program.sh > output 2> /dev/null < /dev/null &')
Then, locally, run:
fab -H remotebox runmyprogram
The program will execute remotely, but fabric will not wait for it to finish. You'll need to harvest the output files later, perhaps using scp. The "&" makes this run in the background on the remote machine, and output redirection is necessary to avoid a hung fabric session.
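For the harvesting step, here is a minimal sketch using fabric's get() (assuming the same fabric 1.x API as above; the task name fetchoutput and the file paths are just illustrative):

from fabric.api import get

def fetchoutput():
    # copy the remote 'output' file back, one local copy per host
    get('output', 'output-%(host)s')

You would run it the same way: fab -H remotebox fetchoutput.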
If you don't need to use fabric, there are easier ways of doing this. You can ssh individually and run
nohup ./program.sh > output &
then come back later to check output.
If this is something that you'll do on a regular basis, this might be the better option, since you can just set up a cron job to run every so often, and then collect the output whenever you want.
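For example, a crontab entry along these lines (the paths and schedule are hypothetical) would launch the program nightly at 2 a.m. and append its output:

0 2 * * * /home/user/program.sh >> /home/user/output 2>&1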
If you'd rather not harvest the output files later, you can use:
fabfile.py:

from fabric.api import run

def runmyprogram():
    run('./program.sh')
Then, on your local machine:
fab -H remotebox runmyprogram > output &
The jobs will run remotely, and put all their output back into the local output file. This runs in the background on your local machine, so you can do other things. However, if the connection between your local and remote machines might be interrupted, it's better to use the first approach so the output is always safely stored on the remote machines.
For those who come across this post in the future: python-daemon can still work. You just have to be sure to open the shelve dicts within the same process that uses them. Previously the shelve dicts were opened in the parent process, and when python-daemon spawned the child process, the open shelve handle was not carried over correctly. Once we moved the shelve loading into the daemonized child, everything worked again.
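A minimal sketch of that fix (the filename and the work done on the shelf are illustrative, not the original code): open the shelf only after DaemonContext has forked, inside the child process.

import daemon
import shelve

def main():
    # open the shelf here, inside the daemonized child process,
    # not in the parent before daemonizing
    db = shelve.open('results.db')
    try:
        db['status'] = 'running'  # placeholder work
    finally:
        db.close()

with daemon.DaemonContext():
    main()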
Thanks to everyone who offered valuable comments on this thread!