
Copying data from STDOUT to a remote machine using SFTP

In order to back up large database partitions to a remote machine using SFTP, I'd like to take the database's dump command and send its output directly over SFTP to a remote location.

This is useful for dumping large data sets when there isn't enough local disk space to first create the backup file and then copy it to a remote location.

I've tried using Python + paramiko, which provides this functionality, but its performance is much worse than using the native OpenSSH sftp binary to transfer files.

Does anyone have any ideas on how to do this, either with the native sftp client on Linux or with a library like paramiko (but one that performs close to the native sftp client)?


If you have remote shell access (ssh), you can do something like the following:

fancy-sql-dump-command --to-stdout | ssh me@remotehost "cat > my-sql-dump.sql"
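If bandwidth is the bottleneck, the stream can also be compressed in the same pipeline. A minimal sketch reusing the placeholder dump command above:

fancy-sql-dump-command --to-stdout | gzip | ssh me@remotehost "cat > my-sql-dump.sql.gz"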

Google "pipe over ssh" for more examples, e.g. this example using tar.
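For instance, a minimal sketch that archives a directory on the fly and stores it remotely without creating an intermediate archive locally (hostname and paths are placeholders, not from the original post):

tar czf - /path/to/dir | ssh me@remotehost "cat > dir-backup.tar.gz"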


I'd recommend sshfs, which operates over the SFTP protocol.

Some OS distributions have this packaged; for others you'll need to compile it yourself, for example on Red Hat Enterprise Linux 5.4+ or its clones like CentOS:

# install the build dependencies and allow your user to use FUSE mounts
sudo yum install fuse-devel glib-devel
sudo usermod -a -G fuse "$USER"

# build and install sshfs from source
# (assumes the sshfs-fuse-2.2 tarball has already been downloaded to /tmp)
cd /tmp
tar xzf sshfs-fuse-2.2.tar.gz
cd sshfs-fuse-2.2
./configure
make
sudo make install

# relogin so the new "fuse" group membership takes effect

# mount the remote directory over SFTP and verify it is visible
mkdir /tmp/servername
sshfs servername:directory /tmp/servername/
ls /tmp/servername/
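Once the mount is up, the dump can be streamed straight through it, so the data goes over SFTP without ever touching local disk as a separate copy step. A minimal sketch reusing the placeholder dump command from the first answer:

fancy-sql-dump-command --to-stdout > /tmp/servername/my-sql-dump.sql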
