Paramiko and "remote python"
I would like to be able to control a remote Python interpreter over an SSH connection, and drive it from Python itself.
I've got a basic template:
import select
import socket
import sys

import paramiko

ssh = paramiko.SSHClient()
ssh.load_system_host_keys()  # assumed client setup; omitted in the original
ssh.connect(servername, serverport, username, key_filename=key_filename)
transport = ssh.get_transport()
channel = transport.open_session()
channel.exec_command(PATH_TO_EXEC)
while True:
    r, w, e = select.select([channel], [], [], 1)
    if channel in r:
        try:
            if channel.recv_ready():
                x = channel.recv(64)
            elif channel.recv_stderr_ready():
                x = channel.recv_stderr(64)
            else:
                continue
            if len(x) == 0:
                print '\r\n*** EOF\r\n',
                break
            sys.stdout.write(x)
            sys.stdout.flush()
        except socket.timeout:
            pass
which allows me to talk to the remote application (pdb, in this case) with channel.send("command\n").
It works perfectly with bash and with gdb, but nothing I try gets an output stream back from python (v2). How does Python handle its output stream, and why doesn't my code work with it?
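I suspect output buffering may be involved: when its stdout is a pipe rather than a terminal, Python switches from line buffering to block buffering, so nothing arrives until the buffer fills or the process exits. A minimal sketch (local, using subprocess to stand in for the SSH channel) showing what a child interpreter sees when its stdout is a pipe:

```python
import subprocess
import sys

# When stdout is a pipe (as with exec_command and no pty),
# isatty() is False and Python block-buffers its output.
probe = "import sys; sys.stdout.write(str(sys.stdout.isatty()))"
out = subprocess.check_output([sys.executable, "-c", probe])
print(out)  # the child saw a pipe, not a terminal
```

If that is the cause, requesting a pseudo-terminal with channel.get_pty() before exec_command, or launching the interpreter as python -u, should restore line-by-line output.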
Unless this is an academic exercise, or you have some specific requirement to use SSH, have a look at pushy. I've never used it, but it seems mature.
Depending on what your goal is, you could follow one of these two ways (I'm sure there are several other alternatives, though!).
If you want to control execution of scripts on remote machines via python you could try Fabric. From their website:
Fabric is a Python (2.5 or higher) library and command-line tool for streamlining the use of SSH for application deployment or systems administration tasks. It provides a basic suite of operations for executing local or remote shell commands (normally or via sudo) and uploading/downloading files, as well as auxiliary functionality such as prompting the running user for input, or aborting execution.
If you want to control remote processes and integrate their output into the flow of your main program, you could use the multiprocessing module. From PEP 371:
The package also provides server and client functionality (processing.Manager) to provide remote sharing and management of objects and tasks so that applications may not only leverage multiple cores on the local machine, but also distribute objects and tasks across a cluster of networked machines.
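As a hedged sketch of that remote-objects facility (the modern module spells it multiprocessing.managers rather than processing.Manager, and names like get_queue are mine): a server process exposes a queue over TCP, and a client on another machine connects to it by address and shared authkey.

```python
from multiprocessing.managers import BaseManager
from queue import Queue  # Python 2: from Queue import Queue

task_queue = Queue()

class QueueManager(BaseManager):
    pass

# Expose the queue under a name remote clients can look up.
QueueManager.register('get_queue', callable=lambda: task_queue)

# Server side: bind to a TCP address; port 0 picks a free port,
# server.address then holds the actual (host, port) pair.
manager = QueueManager(address=('127.0.0.1', 0), authkey=b'secret')
server = manager.get_server()
# server.serve_forever()  # blocks; run this on the machine owning the queue

# Client side (typically another process or host):
# client = QueueManager(address=server_address, authkey=b'secret')
# client.connect()
# client.get_queue().put('task')
```

The authkey acts as a shared secret for HMAC authentication of connections, so both ends must agree on it; there is no encryption, which is one reason people tunnel this over SSH anyway.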