
Alternative to scp for transferring files between Linux machines by opening parallel connections

Is there an alternative to scp for transferring a large file from one machine to another that opens parallel connections and can also pause and resume the download?

Please don't migrate this to serverfault.com. I am not a system administrator; I am a developer trying to transfer past database dumps between backup hosts and servers.

Thank you


You could try using split(1) to break the file apart and then scp the pieces in parallel. The pieces can then be combined back into a single file on the destination machine with cat(1).

# on local host
split -b 1M large.file large.file.   # split into 1MiB chunks named large.file.aa, large.file.ab, ...
for f in large.file.*; do scp "$f" remote_host: & done
wait   # block until all background scp jobs have finished

# on remote host
cat large.file.* > large.file
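
To confirm the reassembled file is intact, compare checksums on both ends (a minimal sketch using sha256sum from GNU coreutils; the filename is the same as above):

# on local host
sha256sum large.file

# on remote host, after reassembly
sha256sum large.file   # the two hashes should match

Once the hashes match, the large.file.* chunks can be deleted on both hosts.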


Take a look at rsync to see if it will meet your needs. It does not open parallel connections, but it does support resuming interrupted transfers.
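
A minimal sketch, assuming a single dump file and a hypothetical remote path; --partial keeps a partially transferred file, so rerunning the same command after an interruption picks up where it left off:

rsync --partial --progress -z large.file user@remote.host:/backups/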

The correct placement of a question is based not on your role but on the type of question. Since this one is not strictly programming related, it is likely to be migrated.


Similar to Mike K's answer, check out https://code.google.com/p/scp-tsunami/ - it handles splitting the file, starts several scp processes to copy the parts, and then joins them again. It can also copy to multiple hosts:

 ./scpTsunami.py -v -s -t 9 -b 10m -u dan bigfile.tar.gz /tmp -l remote.host

That splits the file into 10MB chunks and copies them using 9 scp processes.


The program you are after is lftp. It supports sftp and parallel transfers via its pget command. It is available on Ubuntu (sudo apt-get install lftp), and you can read a review of it here:

http://www.cyberciti.biz/tips/linux-unix-download-accelerator.html
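
As a minimal sketch (the host and file path are hypothetical): pget's -n flag sets the number of parallel connections, and -c continues a previously interrupted download:

lftp -e 'pget -n 4 -c /backups/large.file; quit' sftp://dan@remote.host

Rerunning the same command after an interruption resumes the transfer instead of starting over.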
