
Fast insert of raw binary data from multiple files into an Access database

I'm looking for the fastest method to read from multiple binary files and write everything into a single MS Access database across multiple tables. SQL Server or any other database is not an option. I'm using C#.

Let's say I have only one file containing 1 GB of data.

  1. The file starts with a header (containing column name codes) followed by more than 700 000 blocks of measurement data. The header is less than 100 bytes long. One data block is 1500 bytes long (372 single floats + 12 bytes for date and time).

  2. The first 84 single values must be inserted into the first database table.

  3. The remaining 288 single values are divided equally among 9 tables, 32 values each.
  4. Every table row has the date and time as its first column (see the parsing sketch below).
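
For illustration, a minimal sketch of reading this layout with a BinaryReader; the header size, the timestamp-first ordering inside a block, and the file name are assumptions, not from the post:

    using System;
    using System.IO;

    // Sketch of reading the layout described above: a small fixed-size
    // header, then 1500-byte blocks of 12 bytes of date/time plus
    // 372 single floats (12 + 372 * 4 = 1500). The header size, the
    // timestamp-first ordering, and the file name are assumptions.
    class BlockReaderSketch
    {
        const int HeaderSize = 100;     // adjust to the real header length
        const int FloatsPerBlock = 372;

        static void Main()
        {
            using (var reader = new BinaryReader(File.OpenRead("measurements.bin")))
            {
                reader.BaseStream.Seek(HeaderSize, SeekOrigin.Begin); // skip header

                long blocks = (reader.BaseStream.Length - HeaderSize) / 1500;
                for (long b = 0; b < blocks; b++)
                {
                    byte[] stamp = reader.ReadBytes(12);  // date and time, format left opaque here

                    var values = new float[FloatsPerBlock];
                    for (int i = 0; i < FloatsPerBlock; i++)
                        values[i] = reader.ReadSingle();

                    // values[0..83]   -> first table
                    // values[84..371] -> 9 tables of 32 values each
                }
            }
        }
    }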

For now, I'm using a BinaryReader. I read block by block and execute an INSERT statement (ADO.NET) 10 times per block. That adds up to more than 7 million INSERTs.

Execution time for 1.5 MB is about 1 minute, so 1 GB would take more than 10 hours.

Is there any way to do this much faster? What performance is it reasonable to expect?


You should use the SqlBulkCopy class. See http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.aspx
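
A minimal sketch of that class for illustration; note that SqlBulkCopy writes only to SQL Server, which the question rules out, so this applies only if that constraint can be relaxed:

    using System.Data;
    using System.Data.SqlClient;

    // Minimal SqlBulkCopy sketch. SqlBulkCopy targets SQL Server only;
    // it cannot write to an Access database.
    class BulkCopySketch
    {
        static void BulkInsert(DataTable table, string connectionString)
        {
            using (var bulk = new SqlBulkCopy(connectionString))
            {
                bulk.DestinationTableName = table.TableName;
                bulk.BatchSize = 10000;   // rows per round trip
                bulk.WriteToServer(table);
            }
        }
    }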


You are probably committing on every insert. It is better to begin a transaction, process some fixed number of data blocks, say 100 or 500, and then commit the transaction and begin another. Keep a text file log of which records have been committed so that if you need to restart for some reason, you know where to start.
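
A minimal sketch of that batching pattern over OleDb/Jet; the table, columns, and log file name are placeholders, not from the original post:

    using System;
    using System.Data.OleDb;
    using System.IO;

    // Wrap a fixed number of blocks in one explicit transaction instead
    // of letting every INSERT commit on its own. "Table1", "Stamp",
    // "V1", and "progress.log" are hypothetical placeholders.
    class BatchedInsertSketch
    {
        const int BlocksPerTransaction = 500; // "say 100 or 500", per the answer

        // conn must already be open.
        static void InsertAll(OleDbConnection conn, float[][] blocks)
        {
            int committed = 0;
            while (committed < blocks.Length)
            {
                int upper = Math.Min(committed + BlocksPerTransaction, blocks.Length);
                using (OleDbTransaction tx = conn.BeginTransaction())
                using (var cmd = new OleDbCommand(
                    "INSERT INTO Table1 (Stamp, V1) VALUES (?, ?)", conn, tx))
                {
                    cmd.Parameters.Add("@stamp", OleDbType.Date);
                    cmd.Parameters.Add("@v1", OleDbType.Single);
                    for (int i = committed; i < upper; i++)
                    {
                        cmd.Parameters[0].Value = DateTime.Now; // real timestamp from the block goes here
                        cmd.Parameters[1].Value = blocks[i][0]; // first measured value
                        cmd.ExecuteNonQuery();
                    }
                    tx.Commit();
                }
                committed = upper;
                // checkpoint for restarts, as suggested above
                File.AppendAllText("progress.log", committed + Environment.NewLine);
            }
        }
    }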

You could try changing the Jet engine properties for syncing, but it is simpler to begin a SQL transaction on the connection and commit after a large chunk of records. You will also need to turn off auto-commit, as in this VB example.

In addition, if you specified sorted keys (indexes) up front, undo that; use ADOX to create the indexes after you have loaded all the data.
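
If ADOX interop is inconvenient, plain Jet DDL over the same connection achieves the same thing; a sketch with a hypothetical table and column:

    using System.Data.OleDb;

    // Create the index only after the bulk load has finished, so the
    // engine doesn't maintain it on every insert. Names are hypothetical.
    class IndexAfterLoadSketch
    {
        static void CreateIndexes(OleDbConnection conn)
        {
            using (var cmd = new OleDbCommand(
                "CREATE INDEX idxStamp ON Table1 (Stamp)", conn))
            {
                cmd.ExecuteNonQuery();
            }
        }
    }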
