Read a file from a server over SSH using Python
I am trying to read a file from a server over SSH using Python, with Paramiko to connect. I can connect to the server and run a command like cat filename and get the data back, but some of the files I am trying to read are around 1 GB or more in size.
How can I read the file on the server line by line using Python?
Additional info: What I regularly do is run a cat filename command, store the result in a variable, and work off that. But since the file here is quite big, I am looking for a way to read the file line by line off the server.
EDIT: I can read a bunch of data and split it into lines, but the problem is that the data received in the buffer does not always include complete lines. For example, if the buffer has 300 lines, the last line may only be the first half of a line on the server, and the other half would be fetched in the next call to the server. I want complete lines.
EDIT 2: What command can I use to print lines in a file in a certain range, like the first 100 lines, then the next 100, and so on? This way the buffer will always contain complete lines.
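(For what EDIT 2 asks, one possibility is running sed over the existing connection. This is a sketch only; the host and file path are placeholders, and note that each call rescans the file from the beginning, so the SFTP answers below are cheaper for very large files.)

import paramiko

# Hypothetical host and file path, for illustration only.
client = paramiko.SSHClient()
client.load_system_host_keys()
client.connect('yourhost.com')

start, chunk = 1, 100
while True:
    # sed -n 'M,Np' prints only lines M through N of the remote file.
    cmd = "sed -n '{},{}p' /path/to/your/file".format(start, start + chunk - 1)
    stdin, stdout, stderr = client.exec_command(cmd)
    lines = stdout.readlines()
    if not lines:
        break  # no more lines: we have read past the end of the file
    for line in lines:
        pass  # process line (each element ends with a newline)
    start += chunk
client.close()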
Paramiko's SFTPClient class allows you to get a file-like object to read data from a remote file in a Pythonic way.

Assuming you have an open SSHClient:
sftp_client = ssh_client.open_sftp()
remote_file = sftp_client.open('remote_filename')
try:
    for line in remote_file:
        pass  # process line
finally:
    remote_file.close()
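(If you don't already have an SSHClient open, a minimal setup might look like the following; the host name is a placeholder and authentication is assumed to come from your keys or agent.)

import paramiko

ssh_client = paramiko.SSHClient()
ssh_client.load_system_host_keys()
# 'yourhost.com' is a placeholder host.
ssh_client.connect('yourhost.com')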
Here's an extension to @Matt Good's answer, using fabric:

from fabric.connection import Connection

with Connection(host, user) as c, c.sftp() as sftp, \
        sftp.open('remote_filename') as file:
    for line in file:
        process(line)
old Fabric 1 answer:

from contextlib import closing
from fabric.network import connect

with closing(connect(user, host, port)) as ssh, \
        closing(ssh.open_sftp()) as sftp, \
        closing(sftp.open('remote_filename')) as file:
    for line in file:
        process(line)
#!/usr/bin/env python
import paramiko
import select

client = paramiko.SSHClient()
client.load_system_host_keys()
client.connect('yourhost.com')
transport = client.get_transport()
channel = transport.open_session()
channel.exec_command("cat /path/to/your/file")

while True:
    # Wait briefly for the channel to become readable.
    rl, wl, xl = select.select([channel], [], [], 1.0)
    if rl:
        # Must be stdout
        print(channel.recv(1024).decode())
    if channel.exit_status_ready() and not channel.recv_ready():
        break  # command finished and all output drained
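Note that recv() returns arbitrary chunks, not whole lines, which is exactly the problem described in the question's EDIT. The usual fix is to keep the trailing partial line in a buffer and only yield complete ones. A sketch of that idea (buffer handling only; connection setup as above):

def iter_lines(channel, bufsize=4096):
    """Yield complete lines from a paramiko channel, holding back partial ones."""
    leftover = b''
    while True:
        data = channel.recv(bufsize)
        if not data:  # channel closed: EOF
            break
        leftover += data
        # Everything before the last newline is a run of complete lines.
        lines = leftover.split(b'\n')
        leftover = lines.pop()  # possibly incomplete final piece
        for line in lines:
            yield line.decode()
    if leftover:
        yield leftover.decode()  # file did not end with a newline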
It looks like back in September 2013 paramiko added the ability for these objects to support context managers natively, so if you want to combine Matt's clean answer with jfs's context managers, all you need now is:

with ssh_client.open_sftp() as sftp_client:
    with sftp_client.open('remote_filename') as remote_file:
        for line in remote_file:
            pass  # process line
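One aside that isn't in the original answers: iterating an SFTP file line by line issues many small synchronous reads, which can be slow on a 1 GB file. Recent paramiko versions offer SFTPFile.prefetch() to pipeline the read requests; a sketch:

with ssh_client.open_sftp() as sftp_client:
    with sftp_client.open('remote_filename') as remote_file:
        remote_file.prefetch()  # pipeline read requests for throughput
        for line in remote_file:
            pass  # process line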
What do you mean by "line by line"? There are lots of data buffers between network hosts, and none of them are line-oriented.

So you can read a bunch of data and then split it into lines at the near end.
ssh otherhost cat somefile | python process_standard_input.py | do_process_locally
Or you can have a process at the far end read a bunch of data, break it into lines, and send them to you.
scp process_standard_input.py otherhost
ssh otherhost python process_standard_input.py somefile | do_process_locally
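If it helps, a minimal sketch of what that hypothetical process_standard_input.py could look like (the script name and usage are assumed from the commands above):

#!/usr/bin/env python
# Hypothetical process_standard_input.py: read lines from stdin or a named file.
import fileinput

for line in fileinput.input():  # uses sys.argv[1:] if given, else stdin
    # Do the real work here; printing keeps complete lines flowing downstream.
    print(line.rstrip('\n'))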
The only difference I would care about is which way reduces the volume of data over a limited network pipe. In your situation it may or may not matter.
There is nothing wrong in general with using cat over an SSH pipe to move gigabytes of data.
I lost almost half a day of work trying to use paramiko and fabric to do this, but thanks to this answer I was able to come up with the following:
from ftplib import FTP_TLS

source = '/file/path/in/FTP/server.txt'
destination = '/file/path/in/local/machine.txt'

# host, port, username, and password are assumed to be defined elsewhere.
with FTP_TLS() as ftps:
    ftps.connect(host, port)
    ftps.sendcmd(f'USER {username}')
    ftps.sendcmd(f'PASS {password}')
    with open(destination, 'wb') as file:
        ftps.retrbinary(f'RETR {source}', file.write)
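As a side note (not part of the original answer), ftplib also offers login() and prot_p(), which are the more conventional way to authenticate and encrypt the data channel with FTP_TLS:

with FTP_TLS() as ftps:
    ftps.connect(host, port)
    ftps.login(username, password)  # standard ftplib authentication
    ftps.prot_p()  # switch the data connection to TLS as well
    with open(destination, 'wb') as file:
        ftps.retrbinary(f'RETR {source}', file.write)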