
Database choice for a 30,000 x 5,000 table where each item may be 100 MB in size

I have a 30,000 x 5,000 table, and each item in the table may be 100 MB (or even more) in size. Can anyone give me some advice on choosing a database?


13 Petabytes of data? I'm impressed.
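The 13-petabyte figure is easy to verify with a quick back-of-envelope calculation (assuming exactly 100 MB per cell and a fully populated table, both of which the question only loosely implies):

```python
# Back-of-envelope size check for the table in the question.
rows, cols = 30_000, 5_000
cell_bytes = 100 * 10**6            # assume 100 MB per item (decimal megabytes)

total_bytes = rows * cols * cell_bytes
pib = total_bytes / 2**50           # binary petabytes (PiB)

print(f"{total_bytes} bytes ~ {pib:.1f} PiB")   # roughly 13.3 PiB
```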

Without knowing how you're going to query the data it is hard to say what would work, but a standard filesystem can handle 100-megabyte objects: duplicate objects can be shared with hard or soft links, 'sparse' entries can simply be left unpopulated, and 30,000 directories in a directory should be fine in ext3 with htree turned on (the tune2fs dir_index option).

But your SAN vendor will probably have strong opinions about what sorts of systems work well once you've scaled to 13 petabytes. I suggest talking with your system vendor's sales engineers; the sales engineers I have known have been scary good.


If you're really serious about this, your best bet is Cassandra. Can't help you much more though.
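For what a Cassandra data model might look like here: values of 100 MB are far beyond comfortable cell sizes in Cassandra, so large objects are typically split into chunks under one partition key per cell. A hedged, illustrative CQL sketch; every name and the chunking scheme are assumptions, not something the answer specifies:

```sql
-- Illustrative only: one partition per (row, col) cell,
-- with the 100 MB value stored as ordered blob chunks.
CREATE TABLE bigtable.cells (
    row_id   int,
    col_id   int,
    chunk_no int,
    data     blob,
    PRIMARY KEY ((row_id, col_id), chunk_no)
);
```

Reading a cell then means selecting all chunks for one partition and concatenating them in `chunk_no` order.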
