
Inserting a massive number of records into a database

I need to run some algorithms over data a massive number of times and store each result in a database.

The number of algorithm runs is 80,000-90,000, and each cycle takes about 2 seconds (just the algorithm), so it's very time-consuming.

My database is SQL Server 2008. I want to use ADO.NET Entity Framework (is it a good fit for this task, or not?).

Right now the output data (that needs to be stored in the DB) is plain and raw (not very big), plus some maintenance columns like date and time.

What is the best practice for that?

Should I insert row by row, as each algorithm run completes? Or store the results in memory and insert the data after the work is finished?


Could you not try BULK INSERT after running your algorithm against all records first? It is very efficient at getting data into the database.

http://msdn.microsoft.com/en-us/library/ms188365.aspx
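
A minimal sketch of that approach, assuming the results are first dumped to a tab-delimited file that the SQL Server machine can read; the file path, table name, and connection string below are placeholders, not anything from your setup:

    using System.Data.SqlClient;

    // Assumes the 80,000-90,000 results were already written to a flat file,
    // one tab-separated row per result, at a path visible to the server.
    string connectionString = "Server=.;Database=Results;Integrated Security=true";
    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        var bulkInsert = new SqlCommand(
            @"BULK INSERT dbo.AlgorithmResults
              FROM 'C:\data\results.txt'
              WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')", connection);
        bulkInsert.ExecuteNonQuery();  // one statement loads the whole file
    }

Note that the FROM path is resolved on the database server, not the client, so this works best when the app and SQL Server share a machine or a network share.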


You could use the SqlBulkCopy class with a DataTable as the source data. It's really fast compared with multiple INSERTs.
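
A rough sketch of what that could look like, collecting the results into an in-memory DataTable and pushing them to the server in one go; the table name, column layout, and RunAlgorithm call are assumptions for illustration:

    using System;
    using System.Data;
    using System.Data.SqlClient;

    // Build a DataTable whose columns match the target table's schema.
    var table = new DataTable();
    table.Columns.Add("Result", typeof(double));
    table.Columns.Add("CreatedAt", typeof(DateTime));

    for (int i = 0; i < 90000; i++)
    {
        double result = RunAlgorithm(i);  // your algorithm; hypothetical signature
        table.Rows.Add(result, DateTime.Now);
    }

    string connectionString = "Server=.;Database=Results;Integrated Security=true";
    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        using (var bulkCopy = new SqlBulkCopy(connection))
        {
            bulkCopy.DestinationTableName = "dbo.AlgorithmResults";
            bulkCopy.BatchSize = 5000;    // stream rows in chunks, not one giant batch
            bulkCopy.WriteToServer(table);
        }
    }

Setting a BatchSize also keeps memory and transaction log pressure down compared with sending all 90,000 rows as a single batch.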


If you don't use SqlBulkCopy, you could do the following:

  1. Store the data in local variables.
  2. After all the data is collected, begin a SQL transaction and insert each row into the DB. Once all the INSERT queries have executed, commit (see the sketch after this list).
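
A minimal sketch of that pattern, assuming the results were collected into some in-memory list first; collectedResults, the table, and the column names are hypothetical:

    using System;
    using System.Data;
    using System.Data.SqlClient;

    string connectionString = "Server=.;Database=Results;Integrated Security=true";
    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        using (var transaction = connection.BeginTransaction())
        {
            // One prepared command reused for every row, inside one transaction.
            var insert = new SqlCommand(
                "INSERT INTO dbo.AlgorithmResults (Result, CreatedAt) " +
                "VALUES (@result, @createdAt)",
                connection, transaction);
            insert.Parameters.Add("@result", SqlDbType.Float);
            insert.Parameters.Add("@createdAt", SqlDbType.DateTime);

            foreach (var row in collectedResults)  // hypothetical in-memory collection
            {
                insert.Parameters["@result"].Value = row.Result;
                insert.Parameters["@createdAt"].Value = row.CreatedAt;
                insert.ExecuteNonQuery();
            }

            transaction.Commit();  // a single commit avoids a log flush per row
        }
    }

Wrapping all the inserts in one transaction is much faster than 90,000 autocommitted statements, though still slower than the bulk approaches above.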