
How to efficiently store a large number of rows in a database

What's the best way to store a large amount of data in a database? I need to store values from various environmental sensors together with timestamps. I have done some benchmarks with SQL CE; it works fine for a few hundred thousand rows, but once the row count gets into the millions, the SELECTs become horribly slow. My current tables:

Datapoint:  [DatastreamID:int, Timestamp:datetime, Value:float]  
Datastream: [ID:int {unique index}, Unit:nvarchar, Tag:nvarchar]

If I query for the Datapoints of a specific Datastream within a date range, it takes ages, especially when run on an embedded Windows CE device; and that is the main problem. On my development machine the query takes about 1 second, but on the CE device it takes about 5 minutes.
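
The query looks roughly like this (a sketch; the parameter names stand in for whichever Datastream and date range is selected on the device):

    SELECT Timestamp, Value
    FROM Datapoint
    WHERE DatastreamID = @DatastreamID
      AND Timestamp >= @RangeStart
      AND Timestamp <  @RangeEnd
    ORDER BY Timestamp;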

Every 5 minutes I log 20 sensors. That's 12 readings per hour × 24 hours × 365 days = 105,120 rows per sensor per year, × 20 sensors = 2,102,400 rows per year.

But there could be even more sensors!

I thought about some kind of web service backend, but the device may not always have a connection to the internet / server.

The data must be viewable on the device itself.

How can I speed things up? Choose another table layout, use another database (SQLite)? At the moment I use .NET CF 2.0 and SQL CE 3.5.

Any advice?


I'm sure any relational database would suit your needs: SQL Server, Oracle, etc. The important thing is to create good indexes so that your queries are efficient. If the database has to do a full table scan just to find a single record, it will be slow no matter which engine you use.

If you always find yourself querying for a specific DatastreamID and a Timestamp range, create an index covering both. That way the query will do an index seek instead of a scan.
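
For example (a sketch against the schema in the question; the index name is made up):

    -- Composite index matching the query's WHERE clause
    CREATE INDEX IX_Datapoint_Stream_Time
    ON Datapoint (DatastreamID, Timestamp);

Putting DatastreamID first lets the engine seek straight to the requested stream and then read only the rows inside the date range. SQL CE 3.5 supports a plain CREATE INDEX like this, as does full SQL Server.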


The key to quick access is using one or more indexes.

A database of two million rows per year is very manageable.

Adding indexes will slow the INSERTs somewhat, but your data isn't coming in all that quickly, so it should not be an issue. If the data were arriving faster you might have to be more careful, but it would take far more data at a far higher rate than you have now for this to be a concern.
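
For scale, the write side is one small insert per reading (a sketch; the literal values are made up):

    INSERT INTO Datapoint (DatastreamID, Timestamp, Value)
    VALUES (1, '2010-06-01 12:05:00', 21.4);

Twenty of these every 5 minutes is a trivial load, so the extra index maintenance will not be noticeable.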

Do you have access to SQL Server, or even MySQL?


Your design must have a primary key on the table. An integer PK is faster.

You need to analyze your SELECT queries to see what is going on behind the scenes. The SELECT must do a SEEK instead of a scan.

If 100K rows already make it slow, you must look at the query through the analyzer.

It might get a little slow once you have 100M rows, but not at 100K rows.
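
On full SQL Server you can dump the estimated plan as text to check for a seek (a sketch; SET SHOWPLAN_ALL is a SQL Server feature and GO is the batch separator used by tools like SSMS and sqlcmd, while SQL CE 3.5 instead shows its plans graphically through SQL Server Management Studio):

    -- Ask the server for the estimated plan instead of executing the query
    SET SHOWPLAN_ALL ON;
    GO
    SELECT Timestamp, Value
    FROM Datapoint
    WHERE DatastreamID = 1
      AND Timestamp >= '2010-01-01'
      AND Timestamp <  '2010-02-01';
    GO
    SET SHOWPLAN_ALL OFF;
    GO

Look for an Index Seek rather than a Table Scan in the output.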

Hope this helps.


Can you use SQL Server Express Edition instead? You can create indexes on it just like in the full version. I've worked with SQL Server databases of well over 100 million rows just fine. SQL Server Express Edition limits your database size to 10 GB, so as long as that's acceptable, the free edition should work for you.

http://www.microsoft.com/express/Database/
