
File Replication/Synchronization between multiple sites using BitTorrent

I need to build a distributed system which relies on replication of large files between the sites.

I thought of using P2P technology like BitTorrent to save on bandwidth and increase reliability.

Am I terribly wrong?

Has anyone ever architected such a solution?

What libraries do you recommend?


A promising new solution from the developers of BitTorrent: BitTorrent Sync.

It has the following features:

  • Unlimited and free!
  • Currently supports Windows, Mac and Linux. Mobile platforms are in the works.
  • Specifically designed to handle large files.
  • Private and secure: all traffic is encrypted.
  • Peer discovery protocols.
  • Supports traffic relay for disconnected nodes.


I just found this open-source project from Twitter which hits the nail perfectly:

https://github.com/lg/murder

From the docs:

Murder is a method of using BitTorrent to distribute files to a large number of servers within a production environment. This allows for scalable and fast deploys in environments of hundreds to tens of thousands of servers where centralized distribution systems wouldn't otherwise function. A "Murder" is normally used to refer to a flock of crows, which in this case applies to a bunch of servers doing something.


If you have more than 2 sites, then P2P is a better solution IMHO.

Just install rtorrent, Deluge, or any other high-performance torrent client at every site. Then you can distribute only the .torrent files with scp/sftp and enjoy.
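The .torrent files you'd be copying around are just small bencoded metainfo dictionaries, so you can generate them yourself rather than depending on a GUI client. Here's a minimal stdlib-only sketch for a single file; the tracker URL and file names are hypothetical placeholders:

```python
# Sketch: build a single-file .torrent (bencoded metainfo) in pure Python.
# Tracker URL and paths below are hypothetical placeholders.
import hashlib
import os

def bencode(value):
    """Minimal bencoder for ints, str/bytes, lists and dicts."""
    if isinstance(value, int):
        return b"i%de" % value
    if isinstance(value, str):
        value = value.encode("utf-8")
    if isinstance(value, bytes):
        return b"%d:%s" % (len(value), value)
    if isinstance(value, list):
        return b"l" + b"".join(bencode(v) for v in value) + b"e"
    if isinstance(value, dict):
        # Dictionary keys must be serialized in sorted order.
        return b"d" + b"".join(
            bencode(k) + bencode(value[k]) for k in sorted(value)) + b"e"
    raise TypeError("cannot bencode %r" % type(value))

def make_torrent(path, tracker_url, piece_length=256 * 1024):
    """Return bencoded metainfo bytes for a single file at `path`."""
    pieces = bytearray()
    with open(path, "rb") as f:
        while True:
            chunk = f.read(piece_length)
            if not chunk:
                break
            pieces += hashlib.sha1(chunk).digest()  # one SHA-1 per piece
    info = {
        "name": os.path.basename(path),
        "length": os.path.getsize(path),
        "piece length": piece_length,
        "pieces": bytes(pieces),
    }
    return bencode({"announce": tracker_url, "info": info})
```

Write the returned bytes to `replica.torrent`, scp that tiny file to each site, and let the local client fetch the payload over the swarm.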

In order to secure the content from third-party torrent clients, set the private flag when producing the .torrent file and use your own tracker. opentracker is a good choice.
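One detail worth knowing: the private flag (BEP 27) must live *inside* the info dictionary, which is the part that gets hashed into the infohash. That's why it has to be set when the .torrent is produced, not bolted on afterwards: adding it yields a different infohash, i.e. a separate swarm. A small sketch using a toy info dict (the names and sizes are made up for illustration):

```python
# Sketch: the "private" key sits inside the info dict, so setting it
# changes the infohash. Info-dict contents here are illustrative only.
import hashlib

def bencode(v):
    """Minimal bencoder, sufficient for an info dict."""
    if isinstance(v, int):
        return b"i%de" % v
    if isinstance(v, str):
        v = v.encode("utf-8")
    if isinstance(v, bytes):
        return b"%d:%s" % (len(v), v)
    if isinstance(v, dict):
        return b"d" + b"".join(
            bencode(k) + bencode(v[k]) for k in sorted(v)) + b"e"
    raise TypeError(type(v))

info = {"name": "backup.tar", "length": 1024,
        "piece length": 262144, "pieces": b"\x00" * 20}
public_hash = hashlib.sha1(bencode(info)).hexdigest()

info["private"] = 1  # what the torrent creator's "private" option does
private_hash = hashlib.sha1(bencode(info)).hexdigest()

assert public_hash != private_hash  # two distinct swarms
```

Compliant clients seeing `private = 1` will only talk to peers learned from your tracker, disabling DHT and peer exchange, which is exactly what you want for internal replication.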

One more hint: if your torrent client supports super-seeding mode (aka BEP 16 or initial seeding), enable it. It will help distribute the content with minimal duplicate uploads from the initial seeding node.
