
Insert records into a database from a hashtable

I'm looking for any advice on the best way to insert a large number of records into a database (SQL Server 2000 onwards) from a collection of objects.

Currently my code looks something like the snippet below: each record is inserted with its own simple SQL INSERT command, opening and closing the database connection each time the function is called (I'm sure this must slow things down?).

The routine needs to cope with routinely inserting up to 100,000 records, and I was wondering if there is a faster way (I'm sure there must be?). I've seen a few posts mentioning XML-based data and bulk copy routines - is this something I should consider, or can anyone provide any simple examples I could build upon?

foreach (DictionaryEntry de in objectList)
{
  eRecord record = (eRecord)de.Value;

  if (!record.Deleted)
  {
    createDBRecord(record.Id,
                   record.Index,
                   record.Name,
                   record.Value);
  }
}

Thanks for any advice,

Paul.


Doing it that way will be relatively slow. You need to consider a bulk insert technique, either using bcp or BULK INSERT, or, if you are using .NET 2.0, the SqlBulkCopy class. Here's an example of using SqlBulkCopy: SqlBulkCopy - Copy Table Data Between SQL Servers at High Speeds
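Below is a rough sketch of what the SqlBulkCopy route could look like for the Hashtable in your question. It assumes the eRecord class from your snippet; the destination table name eRecords, its column names and the column types are only illustrative and would need to match your actual schema. The whole batch is sent over one connection in a single bulk operation instead of one INSERT (and one connection open/close) per record:

using System.Collections;
using System.Data;
using System.Data.SqlClient;

public static void BulkInsertRecords(Hashtable objectList, string connectionString)
{
    // Build an in-memory table whose columns line up with the destination table.
    // Column names and types here are assumptions - adjust to your schema.
    DataTable table = new DataTable();
    table.Columns.Add("Id", typeof(int));
    table.Columns.Add("Index", typeof(int));
    table.Columns.Add("Name", typeof(string));
    table.Columns.Add("Value", typeof(string));

    foreach (DictionaryEntry de in objectList)
    {
        eRecord record = (eRecord)de.Value;
        if (!record.Deleted)
        {
            table.Rows.Add(record.Id, record.Index, record.Name, record.Value);
        }
    }

    using (SqlConnection connection = new SqlConnection(connectionString))
    {
        connection.Open();
        using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
        {
            bulkCopy.DestinationTableName = "eRecords"; // assumed table name
            bulkCopy.BatchSize = 10000;                 // optional: send in chunks
            bulkCopy.WriteToServer(table);
        }
    }
}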

Here's a simple BULK INSERT example for SQL Server 2000:

If you have a CSV file, csvtest.txt, in the following format:

1,John,Smith
2,Bob,Hope
3,Kate,Curie
4,Peter,Green

This SQL script will load the contents of the CSV file into a database table (if a row contains errors it will not be inserted, but the other rows will be):

USE myDB
GO

CREATE TABLE CSVTest
(
    ID INT,
    FirstName VARCHAR(60),
    LastName VARCHAR(60)
)
GO

BULK INSERT 
CSVTest
FROM 'c:\temp\csvtest.txt'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
)
GO


SELECT *
FROM CSVTest
GO

You could write out the dictionary contents to a CSV file and then BULK INSERT that file, along the lines of the sketch below.
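For example, something like this (the file path and table name are only illustrative, and it assumes Name and Value never contain commas or line breaks - otherwise you'd need quoting or a different field terminator; note also that BULK INSERT reads the file on the SQL Server machine, so the path must be visible to the server):

using System.Collections;
using System.Data.SqlClient;
using System.IO;

public static void BulkInsertViaCsv(Hashtable objectList, string connectionString)
{
    string path = @"c:\temp\records.csv"; // must be readable by the SQL Server service

    // Dump the non-deleted records to a comma-separated file.
    using (StreamWriter writer = new StreamWriter(path))
    {
        foreach (DictionaryEntry de in objectList)
        {
            eRecord record = (eRecord)de.Value;
            if (!record.Deleted)
            {
                writer.WriteLine("{0},{1},{2},{3}",
                                 record.Id, record.Index, record.Name, record.Value);
            }
        }
    }

    // Ask SQL Server to bulk load the file into the (assumed) eRecords table.
    using (SqlConnection connection = new SqlConnection(connectionString))
    using (SqlCommand command = new SqlCommand(
        "BULK INSERT eRecords FROM 'c:\\temp\\records.csv' " +
        "WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n')", connection))
    {
        connection.Open();
        command.ExecuteNonQuery();
    }
}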

See also: Using bcp and BULK INSERT

If you implement your own IDataReader over the collection, you can avoid writing to an intermediate file altogether. See ADO.NET 2.0 Tutorial : SqlBulkCopy Revisited for Transferring Data at High Speeds
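As a very rough illustration of that idea (again assuming the eRecord class from the question; the column names, the assumed column types and which members are stubbed out are all things you'd tailor to your own data), a minimal reader over the Hashtable might look like this:

using System;
using System.Collections;
using System.Data;

public class RecordDataReader : IDataReader
{
    private static readonly string[] Names = { "Id", "Index", "Name", "Value" };
    // Assumed column types - adjust to match eRecord.
    private static readonly Type[] Types = { typeof(int), typeof(int), typeof(string), typeof(string) };

    private readonly IEnumerator _entries;
    private eRecord _current;

    public RecordDataReader(Hashtable objectList)
    {
        _entries = objectList.Values.GetEnumerator();
    }

    // Advance to the next non-deleted record, mirroring the original loop.
    public bool Read()
    {
        while (_entries.MoveNext())
        {
            eRecord record = (eRecord)_entries.Current;
            if (!record.Deleted)
            {
                _current = record;
                return true;
            }
        }
        return false;
    }

    public int FieldCount { get { return Names.Length; } }
    public string GetName(int i) { return Names[i]; }
    public int GetOrdinal(string name) { return Array.IndexOf(Names, name); }
    public Type GetFieldType(int i) { return Types[i]; }
    public string GetDataTypeName(int i) { return Types[i].Name; }

    public object GetValue(int i)
    {
        switch (i)
        {
            case 0: return _current.Id;
            case 1: return _current.Index;
            case 2: return _current.Name;
            case 3: return _current.Value;
            default: throw new IndexOutOfRangeException();
        }
    }

    public int GetValues(object[] values)
    {
        int count = Math.Min(values.Length, FieldCount);
        for (int i = 0; i < count; i++) values[i] = GetValue(i);
        return count;
    }

    public bool IsDBNull(int i) { return GetValue(i) == null; }
    public object this[int i] { get { return GetValue(i); } }
    public object this[string name] { get { return GetValue(GetOrdinal(name)); } }

    public void Close() { }
    public void Dispose() { }
    public bool IsClosed { get { return false; } }
    public int Depth { get { return 0; } }
    public int RecordsAffected { get { return -1; } }
    public bool NextResult() { return false; }

    // Remaining members are not expected to be needed by SqlBulkCopy here and are stubbed out.
    public DataTable GetSchemaTable() { throw new NotSupportedException(); }
    public bool GetBoolean(int i) { throw new NotSupportedException(); }
    public byte GetByte(int i) { throw new NotSupportedException(); }
    public long GetBytes(int i, long fieldOffset, byte[] buffer, int bufferOffset, int length) { throw new NotSupportedException(); }
    public char GetChar(int i) { throw new NotSupportedException(); }
    public long GetChars(int i, long fieldOffset, char[] buffer, int bufferOffset, int length) { throw new NotSupportedException(); }
    public IDataReader GetData(int i) { throw new NotSupportedException(); }
    public DateTime GetDateTime(int i) { throw new NotSupportedException(); }
    public decimal GetDecimal(int i) { throw new NotSupportedException(); }
    public double GetDouble(int i) { throw new NotSupportedException(); }
    public float GetFloat(int i) { throw new NotSupportedException(); }
    public Guid GetGuid(int i) { throw new NotSupportedException(); }
    public short GetInt16(int i) { throw new NotSupportedException(); }
    public int GetInt32(int i) { throw new NotSupportedException(); }
    public long GetInt64(int i) { throw new NotSupportedException(); }
    public string GetString(int i) { return (string)GetValue(i); }
}

It could then be passed to SqlBulkCopy in place of the DataTable above, e.g. bulkCopy.WriteToServer(new RecordDataReader(objectList));, so the records stream straight from memory to the server.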

Related SO question: how to pass variables like arrays / datatable to SQL server?
