
Best tech to use for a database that stores large files

I am interested in finding candidate software that can help me build a program that will do this:

  • simple key-value store, with the key being a hash and the value being a potentially large file (10–100 MB each; the total dataset can easily reach 200 GB and up)
  • very low request volume: maybe 1,000 per hour, probably less
  • between 2× and 5× more reads than writes
  • automatically remove data that hasn't been queried for a while, to keep disk space under control
  • it's OK for the system to lose data
  • easy to install / few dependencies / easy to make cross-platform

Software like Redis and MongoDB seems like an interesting candidate, but those systems largely aim to solve the problem of efficiently handling many requests per second, usually for powering websites. That is a requirement I do not have at all.

I am wondering if you know of a tool that would be a better match for the specific problem I am trying to solve.


Based on your requirements, the simplest solution is to use the file system itself as the store: use the hash key as the file name.

Lookups will be efficient, and the OS page cache will keep recently used data in memory for you automatically.
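A minimal sketch of that approach in Python, assuming SHA-256 content hashes and a hypothetical STORE directory (the put/get names are made up for illustration):

```python
import hashlib
from pathlib import Path

# Hypothetical storage root; adjust to your environment.
STORE = Path("/var/data/blobstore")

def put(data: bytes) -> str:
    """Store a blob under its content hash and return the key."""
    key = hashlib.sha256(data).hexdigest()
    STORE.mkdir(parents=True, exist_ok=True)
    # Write to a temp name first, then rename, so readers never
    # see a half-written file.
    tmp = STORE / (key + ".tmp")
    tmp.write_bytes(data)
    tmp.rename(STORE / key)
    return key

def get(key: str) -> bytes | None:
    """Return the blob for a key, or None if it has been evicted."""
    try:
        return (STORE / key).read_bytes()
    except FileNotFoundError:
        return None  # acceptable: losing data is allowed here
```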

If your file system records access times, you can run a regular cleanup pass based on each file's last access time (atime).
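A sketch of that eviction pass, reusing the same hypothetical STORE and a made-up 30-day threshold:

```python
import time
from pathlib import Path

STORE = Path("/var/data/blobstore")  # same hypothetical root as above
MAX_AGE = 30 * 24 * 3600             # hypothetical: evict after 30 days unread

def evict_stale() -> None:
    """Delete files whose last access time is older than MAX_AGE."""
    cutoff = time.time() - MAX_AGE
    for path in STORE.iterdir():
        try:
            if path.is_file() and path.stat().st_atime < cutoff:
                path.unlink()
        except FileNotFoundError:
            pass  # raced with another eviction pass; ignore
```

You could run this from cron or a background timer. One caveat: filesystems mounted with noatime never update access times, and the common relatime default updates them at most once a day, which is still more than enough for eviction at this granularity.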
