
Task Scheduling Across a Network?

Can you recommend a Python tool/module that allows scheduling tasks on remote machines in a network?

Note that the solution must not only be able to run jobs/commands on remote machines, but also verify that those jobs are still running (for example, consider the case where a machine dies after a task has been assigned to it).


RPyC, or Remote Python Call, is a transparent and symmetric Python library for remote procedure calls, clustering, and distributed computing. Here is an example from Wikipedia:

import rpyc
conn = rpyc.classic.connect("hostname")  # assuming a classic server is running on 'hostname'

print(conn.modules.sys.path)
conn.modules.sys.path.append("lucy")
print(conn.modules.sys.path[-1])

# a version of 'ls' that runs remotely
def remote_ls(path):
    ros = conn.modules.os
    for filename in ros.listdir(path):
        stats = ros.stat(ros.path.join(path, filename))
        print "%d\t%d\t%s" % (stats.st_size, stats.st_uid, filename)

remote_ls("/usr/bin")

# and exceptions...
try:
    f = conn.builtin.open("/non/existent/file/name")
except IOError:
    pass

To check whether the remote server has died after assigning it a job, you can use the ping method of the Connection class. The complete API is described in the RPyC documentation.
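For example, a minimal sketch (assuming a classic RPyC server is reachable on "hostname"; the host name and timeout are illustrative):

import rpyc

conn = rpyc.classic.connect("hostname")

# ... assign work to the remote side here ...

# periodically verify that the other side still responds; ping() raises an
# exception (e.g. a timeout or EOFError) if the remote server has gone away
try:
    conn.ping(timeout=5)
    print("remote server is alive")
except Exception:
    print("remote server appears to have died")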


Fabric (http://docs.fabfile.org/en/1.0.1/index.html) is a pretty good toolkit for various sysadmin and deployment tasks. It comes with a few predefined tasks but also gives you the flexibility to add what you need.

I highly recommend it.
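As a rough sketch of how starting and checking a remote job could look with the Fabric 1.x API linked above (the host name, script path, and task names below are hypothetical):

# fabfile.py
from fabric.api import env, run, settings

env.hosts = ["user@remote-host"]

def start_job():
    # launch a long-running command in the background on the remote host
    run("nohup /opt/jobs/long_task.sh > /tmp/long_task.log 2>&1 &", pty=False)

def check_job():
    # verify the job is still running; warn_only keeps Fabric from aborting
    # when pgrep finds no matching process
    with settings(warn_only=True):
        result = run("pgrep -f long_task.sh")
    if result.failed:
        print("job is no longer running on %s" % env.host_string)

You would run these with, for example, fab start_job and later fab check_job.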


On Windows you should be able to use Python WMI; for *NIX-based systems it would be a wrapper around SSH and cron.
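For the Windows/WMI side, a minimal sketch using the third-party wmi package (the host name and command are hypothetical, and remote WMI/DCOM access is assumed to be configured):

import wmi

# connect to the WMI service on the remote Windows machine
c = wmi.WMI("remote-windows-host")

# start a task remotely; Create returns the new process id and a result code
process_id, return_code = c.Win32_Process.Create(CommandLine="notepad.exe")

# later: check whether that process is still running
still_running = any(p.ProcessId == process_id for p in c.Win32_Process())
print("task alive:", still_running)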
