
Database solution for large infrequently-accessed data sets

We use MongoDB to store daily logs of statistics about tens of thousands of items in our database; the collection is currently approaching 100 million records. This data is useful for data mining, but is accessed infrequently. We recently moved it from our main MySQL database to a Mongo database; this turned out not to be ideal: Mongo is optimized for fast reads and keeps all of its indexes in memory, and the index on this collection is very large.

What is a good way to store large amounts of data with heavy daily writes but infrequent reads? We are considering a separate MySQL installation on a separate system. Another possibility might be a NoSQL solution that does not require keeping its indexes in memory.


You are correct that NoSQL is good for fast reads of simple data. Since you will need to query and possibly perform relational operations on this data, I'd recommend a separate MySQL installation for this.

You will want to minimize the SQL indexes to keep writes fast.
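A minimal sketch of that idea, using SQLite as a stand-in for a separate MySQL instance (the table and column names here are hypothetical, not from the question): the log table carries no primary key or secondary indexes, so the daily bulk load is a cheap append; an index is built only later, on demand, for the infrequent data-mining reads.

```python
import sqlite3

# Stand-in for a dedicated MySQL instance; schema is illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE item_stats_log (
        item_id  INTEGER NOT NULL,
        log_date TEXT    NOT NULL,  -- one row per item per day
        views    INTEGER NOT NULL,
        clicks   INTEGER NOT NULL
    )
""")
# Deliberately no PRIMARY KEY or secondary indexes: writes stay append-only.

# Daily bulk load: executemany batches all inserts into one transaction.
rows = [(item_id, "2011-06-01", item_id * 10, item_id)
        for item_id in range(1, 1001)]
with conn:
    conn.executemany(
        "INSERT INTO item_stats_log (item_id, log_date, views, clicks) "
        "VALUES (?, ?, ?, ?)",
        rows,
    )

# For the rare analytical reads, add an index only when it is needed,
# then run the relational queries (aggregates, joins) that motivated MySQL.
conn.execute(
    "CREATE INDEX IF NOT EXISTS idx_stats_date ON item_stats_log (log_date)"
)
total_views = conn.execute(
    "SELECT SUM(views) FROM item_stats_log WHERE log_date = '2011-06-01'"
).fetchone()[0]
```

The same pattern applies to MySQL proper: load with batched multi-row inserts into an index-free (or minimally indexed) table, and defer index creation to the analysis side.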

