
Import DataSet into SQL Server 2008 Express

I have a very large DataSet containing about 160,000 records. If I loop through the DataSet and import every record one by one, it can take about 20 minutes before the complete DataSet is imported into SQL Server.

Isn't there a faster way for importing the dataset at once in the database?

The DataSet is created from a file the user provides, which I process. I have one table, let's say "ImportTable", containing about 14 columns. The columns correspond with the columns in the DataSet.

I use Visual Studio 2010 Professional with C#.

Thanks in advance!


You should take a close look at the SqlBulkCopy class.

It's a plain .NET class (no external app needed); it takes a DataTable (or a data reader) as input and copies the data into SQL Server in a bulk fashion. It should be significantly faster than doing a row-by-agonizing-row (RBAR) insert operation...

Better yet: you don't even need to import your entire data set into memory - you can define a SqlDataReader on your base data, and pass that to SqlBulkCopy to read from the SqlDataReader and bulk insert into SQL Server.
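A minimal sketch of the DataTable approach (assuming the destination table is dbo.ImportTable from the question, the DataSet's first table holds the imported rows with matching column names, and connectionString is a valid connection string for your SQL Server Express instance):

using System.Data;
using System.Data.SqlClient;

// dataSet is the DataSet you already build from the user's file
DataTable table = dataSet.Tables[0];

using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connectionString))
{
    bulkCopy.DestinationTableName = "dbo.ImportTable";
    bulkCopy.BatchSize = 5000;      // commit in batches instead of one huge transaction
    bulkCopy.BulkCopyTimeout = 0;   // no timeout for large loads

    // Map columns by name so ordinal differences between the DataTable
    // and the target table don't matter
    foreach (DataColumn column in table.Columns)
        bulkCopy.ColumnMappings.Add(column.ColumnName, column.ColumnName);

    bulkCopy.WriteToServer(table);
}

With this approach, loading the 160,000 rows should take a fraction of the time of row-by-row inserts.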


You may want to take a look at the bcp command-line utility. It lets you load data directly from a file into a table in the database. Depending on how the user-generated file looks, you may need to re-format it, but if it has a simple delimited format you can probably use it as-is with bcp.
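For example, assuming a comma-delimited file at C:\data\import.csv, a database named MyDatabase on a local SQL Server Express instance, and Windows authentication (all of these names are placeholders), the call might look like:

bcp MyDatabase.dbo.ImportTable in C:\data\import.csv -S .\SQLEXPRESS -T -c -t,

Here -c loads the file as character data, -t sets the field terminator, -S names the server, and -T uses a trusted (Windows) connection.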


You can convert the DataSet to XML using the DataSet.GetXml() method and pass the result as an input parameter to a stored procedure.

For example:

Create PROCEDURE dbo.MyInsertSP
(
    @strXML varchar(max)
)
AS
Begin
    Declare @intPointer int

    -- Parse the XML and get a document handle
    Exec sp_xml_preparedocument @intPointer OUTPUT, @strXML

    Insert into publishers
    Select * from OpenXml(@intPointer, '/root/publisher', 2)
    With (pub_id char(4), pub_name varchar(40), city varchar(20),
          state char(2), country varchar(20))

    -- Release the parsed document
    Exec sp_xml_removedocument @intPointer
RETURN
End
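To call the procedure from C#, a minimal sketch (assuming a connection string named connectionString and a DataSet named ds built from the user's file; the procedure and parameter names match the example above) could be:

using System.Data;
using System.Data.SqlClient;

string xml = ds.GetXml();

using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand("dbo.MyInsertSP", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.AddWithValue("@strXML", xml);
    conn.Open();
    cmd.ExecuteNonQuery();
}

Keep in mind that shipping 160,000 rows as one XML string still means a lot of parsing on the server, so SqlBulkCopy is usually the faster option.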

Hope this makes sense.
