
Fast interaction with a database

I'm developing a project which gets some data from a hardware device every 100 milliseconds. I'm using Visual Studio 2010 and C#. The data size is about 50 KB in each round. The customer wants all the data logged to a database for statistical purposes.

I prefer using SQL Server 2005+ since I'm familiar with it, and the project should be done in about 15 days; it's a small project.

  • Is this a reasonable rate and data size to insert into the database? Do you suggest any general approaches to speed up the interaction (raw SQL commands, EF, or other technologies that could have a positive effect on speed)?
  • If this is way too fast for SQL Server to handle, what do you suggest I use instead that:

    1. has a quick learning curve,

    2. can answer queries for statistical data, and

    3. can keep up with my interaction speed?

I'm thinking about System.Data.SQLite if SQL Server turns out to be a no-go, but I don't know about its learning curve or whether it would actually be faster.


500 KB per second is nothing. I work with SQL databases that do gigabytes per second; it all depends on the hardware and server configuration underneath, but even if you were to run this on a standard office desktop you would be fine. I would only start thinking about new hardware once you are looking at 20 MB per second or more.

Second part of your question: since you are using C#, I suggest you use SQL Server 2008 with a table-valued parameter (TVP). Buffer the data in the application, in a DataSet or DataTable, until you have say 10K rows, then call a stored procedure to do the insert, passing it the DataTable as a parameter. This saves hundreds or thousands of ad-hoc inserts.
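For illustration, here is a minimal sketch of what that buffered TVP insert could look like. The table type dbo.SampleRowType, the stored procedure dbo.InsertSamples, the destination table dbo.PacketLog, the column names, and the 10K-row threshold are all assumptions for the example, not something from the question.

    // Sketch only: dbo.SampleRowType, dbo.InsertSamples and dbo.PacketLog are hypothetical names.
    // On the SQL side you would need something like:
    //   CREATE TYPE dbo.SampleRowType AS TABLE (SampleTime datetime2, Payload varbinary(max));
    //   CREATE PROCEDURE dbo.InsertSamples @Samples dbo.SampleRowType READONLY AS
    //     INSERT INTO dbo.PacketLog (SampleTime, Payload)
    //     SELECT SampleTime, Payload FROM @Samples;
    using System;
    using System.Data;
    using System.Data.SqlClient;

    class TvpLogger
    {
        private readonly DataTable buffer = new DataTable();
        private readonly string connectionString;

        public TvpLogger(string connectionString)
        {
            this.connectionString = connectionString;
            buffer.Columns.Add("SampleTime", typeof(DateTime));
            buffer.Columns.Add("Payload", typeof(byte[]));
        }

        // Call this every 100 ms; the database is only hit once per batch.
        public void Add(byte[] payload)
        {
            buffer.Rows.Add(DateTime.UtcNow, payload);
            if (buffer.Rows.Count >= 10000)   // pick whatever threshold fits your memory budget
                Flush();
        }

        public void Flush()
        {
            if (buffer.Rows.Count == 0) return;

            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand("dbo.InsertSamples", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                var p = cmd.Parameters.AddWithValue("@Samples", buffer);
                p.SqlDbType = SqlDbType.Structured;   // this is what makes it a TVP
                p.TypeName = "dbo.SampleRowType";
                conn.Open();
                cmd.ExecuteNonQuery();
            }
            buffer.Clear();
        }
    }

In practice you would also call Flush() on shutdown so the tail of the buffer isn't lost.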

Hope this is clear; if not, ask and I will try to explain further.


50 KB every 100 milliseconds is 500 KB a second. These days networks run at gigabit speeds (many megabytes per second) and hard drives can cope with hundreds of MB per second. 500 KB/s is a tiny amount of data, so I'd be most surprised if SQL Server couldn't handle it.

If you have a slow network connection to the server or some other problem that means it struggles to keep up, then you can try various strategies to improve things. Ideas might be:

  • Buffer the data locally (and/or on the server) and write it into the database with a separate thread/process (a rough sketch follows after this list). If you're not logging continuously 24 hours a day, a slow server can catch up once you stop logging. Even if you are logging continuously, buffering smooths out any bumps (e.g. periods of "busy time" where the server is doing so much else that it struggles to keep up with the data from your logger).

  • Compress the data that is going to the server so there's less data to send/store (a small compression sketch is at the end of this answer). If the packets are similar you may find you can get huge compression ratios.

  • If you don't need everything in each packet, strip out anything "uninteresting" from the data before uploading it.

  • Possibly batching the data might help - by collecting several packets and sending them all at once you might be able to minimise transmission overheads.

  • Store the data directly to disk and use the database only to index it.
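As a concrete (hypothetical) illustration of the first point, here is a rough sketch of buffering packets in memory and writing them from a background thread with SqlBulkCopy. The class name, the table dbo.PacketLog, its column layout, and the batch size of 100 are all assumptions for the example.

    // Sketch: the acquisition loop calls Enqueue() every 100 ms and never blocks on the database;
    // a background thread drains the queue and bulk-inserts batches.
    using System;
    using System.Collections.Concurrent;
    using System.Data;
    using System.Data.SqlClient;
    using System.Threading.Tasks;

    class BufferedDbWriter : IDisposable
    {
        private readonly BlockingCollection<byte[]> queue = new BlockingCollection<byte[]>();
        private readonly Task worker;
        private readonly string connectionString;

        public BufferedDbWriter(string connectionString)
        {
            this.connectionString = connectionString;
            worker = Task.Factory.StartNew(Consume, TaskCreationOptions.LongRunning);
        }

        // Called from the acquisition loop; returns immediately.
        public void Enqueue(byte[] packet)
        {
            queue.Add(packet);
        }

        private void Consume()
        {
            var batch = new DataTable();
            batch.Columns.Add("LoggedAt", typeof(DateTime));
            batch.Columns.Add("Payload", typeof(byte[]));

            foreach (var packet in queue.GetConsumingEnumerable())
            {
                batch.Rows.Add(DateTime.UtcNow, packet);

                // Flush when a full batch has accumulated, or when the queue is momentarily empty.
                if (batch.Rows.Count >= 100 || queue.Count == 0)
                {
                    using (var bulk = new SqlBulkCopy(connectionString))
                    {
                        bulk.DestinationTableName = "dbo.PacketLog";   // assumed table
                        bulk.WriteToServer(batch);
                    }
                    batch.Clear();
                }
            }
        }

        public void Dispose()
        {
            queue.CompleteAdding();   // lets Consume() finish draining, then the loop exits
            worker.Wait();
        }
    }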

So I'd advise writing a prototype and seeing how much data you can chuck at your database over your network before it struggles.
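For the compression idea mentioned above, a minimal sketch using GZipStream; whether it pays off depends entirely on how similar and how compressible your packets are.

    // Sketch: compress a raw packet before sending/storing it.
    // Store the result in e.g. a varbinary(max) column and decompress when reading it back.
    using System.IO;
    using System.IO.Compression;

    static byte[] Compress(byte[] raw)
    {
        using (var output = new MemoryStream())
        {
            using (var gzip = new GZipStream(output, CompressionMode.Compress))
            {
                gzip.Write(raw, 0, raw.Length);
            }
            return output.ToArray();   // the GZipStream must be closed before reading the buffer
        }
    }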
