File server distributed caching

We have a very large file server (HTTP and/or FTP). Some files are used by around 5 systems. For example, system A will use files A and B, while system B will use files A and C.

Are there applications, preferably free or open source, that can cache those commonly used files inside the system?

I'm looking for Squid alternatives. Thanks.


Have you looked at Hadoop? I haven't used it myself but it seems to do exactly what you want.


If we are talking about storing files in the hundreds of MBs, then Hadoop would be my recommended choice to solve this problem. But according to the community, Hadoop is not well suited to files in the kB range or under 200-300MB.

For such cases, most recommend HBase, which is built on top of Hadoop. The combination provides high availability and scalability at the same time. That said, a Hadoop setup might be bigger than one wants: a development/test cluster is recommended to consist of 4-5 servers, while a production environment needs a minimum of 10+ servers.

An effective alternative to the Squid web cache is Varnish.
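If Varnish fits your setup, a minimal VCL sketch might look like the following. This is only an illustration: the backend hostname `files.example.com`, the port, the file extensions, and the one-day TTL are all assumptions you would adapt to your file server.

```vcl
vcl 4.0;

# Hypothetical origin file server; replace host/port with your own.
backend fileserver {
    .host = "files.example.com";
    .port = "80";
}

sub vcl_backend_response {
    # Cache large downloadable files for a day (example extensions).
    if (bereq.url ~ "\.(iso|zip|tar\.gz|bin)$") {
        set beresp.ttl = 1d;
    }
}
```

Each system would then fetch files through its local Varnish instance, so files A, B, and C are served from the cache after the first download instead of hitting the central file server again.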

