Segmented data load: table records to XML
I have a sequence of SQL queries that I have to run against a database; they produce very large result sets that I need to write to files. There are about 80 queries, and each one returns anywhere from 1,000 to 10,000,000 records. I cannot change the queries themselves. What I'm trying to do is read 500,000 records at a time for each query and write them to a file. Here's what I have so far:
void WriteXml(string tableName, string queryString)
{
    int pageSize = 500000;
    int currentIndex = 0;
    using (SqlConnection connection = new SqlConnection(CONNECTION_STRING))
    {
        using (SqlCommand command = new SqlCommand(queryString, connection))
        {
            try
            {
                connection.Open();
                SqlDataAdapter dataAdapter = new SqlDataAdapter(command);
                int rowsRead = 0, count = 0, index = 0;
                do
                {
                    DataSet dataSet = new DataSet("SomeDatasetName");
                    rowsRead = dataAdapter.Fill(dataSet, currentIndex, pageSize, tableName);
                    currentIndex += rowsRead;
                    if (dataSet.Tables.Count > 0 && rowsRead > 0)
                    {
                        dataSet.Tables[0].WriteXml(
                            string.Format(@"OutputXml\{0}_{1}.xml", tableName, index++),
                            XmlWriteMode.WriteSchema);
                    }
                }
                while (rowsRead > 0);
            }
            catch (Exception e)
            {
                Log(e);
            }
        }
    }
}
This works, but it's very, very slow. I'm pretty sure I'm doing something wrong here, because when I run it the application hogs most of my memory (I have 6 GB) and it takes forever. I started it last night and it is still running. I understand I'm dealing with a lot of records, but I don't think it's something that should take this many hours to run.
Is this the right way to do a paged/segmented read from a database? Is there any way this method could be optimized, or is there another way I can approach this?
Do let me know if I'm not clear on anything and I'll try to provide clarification.
The paging overloads of DataAdapter.Fill still retrieve the entire result set under the covers. Read here:
http://msdn.microsoft.com/en-us/library/tx1c9c2f%28vs.71%29.aspx
The part that pertains to your question:
The DataAdapter provides a facility for returning only a page of data, through overloads of the Fill method. However, this might not be the best choice for paging through large query results because, while the DataAdapter fills the target DataTable or DataSet with only the requested records, the resources to return the entire query are still used. To return a page of data from a data source without using the resources required to return the entire query, specify additional criteria for your query that reduces the rows returned to only those required.
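One way to specify that additional criteria, assuming your result set has a unique, indexed key you can order by (the table and column names below are illustrative, not from your schema), is keyset paging: remember the highest key from the previous page and ask only for rows after it, so the server never re-scans the rows you've already read. A minimal sketch of building such a query:

```csharp
using System;

static class KeysetPaging
{
    // Builds a parameterized "seek" page query: instead of counting and
    // discarding skipped rows, it jumps straight past the last key seen.
    // Pure string construction, so it can be checked without a database.
    public static string BuildSeekQuery(string table, string keyColumn)
    {
        return "SELECT TOP (@pageSize) * FROM [" + table + "] " +
               "WHERE [" + keyColumn + "] > @lastKey " +
               "ORDER BY [" + keyColumn + "]";
    }
}
```

On each iteration you bind @pageSize and @lastKey, write the page out, then set @lastKey to the last key value in the page you just read; when a page comes back empty, you're done. TOP (@pageSize) as a parameter assumes SQL Server 2005 or later.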
In Linq2Sql, there are convenient Skip and Take methods for paging through data. You could roll your own by using a parameterized query constructed to do the same thing. Here is an example that skips 100 rows and takes 20:
SELECT TOP 20 [t0].[CustomerID], [t0].[CompanyName]
FROM [Customers] AS [t0]
WHERE (NOT (EXISTS(
SELECT NULL AS [EMPTY]
FROM (
SELECT TOP 100 [t1].[CustomerID]
FROM [Customers] AS [t1]
WHERE [t1].[City] = @p0
ORDER BY [t1].[CustomerID]
) AS [t2]
WHERE [t0].[CustomerID] = [t2].[CustomerID]
))) AND ([t0].[City] = @p1)
ORDER BY [t0].[CustomerID]
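Wired up in ADO.NET, the same skip/take shape can be parameterized and run in a loop, writing one file per page. This is a sketch under stated assumptions, not a drop-in replacement: it assumes the result comes from a single table with a unique ordering key (CustomerID here is illustrative), and SQL Server 2005+ for TOP (@n). Only one page at a time crosses the wire, so memory stays bounded:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

static class PagedExport
{
    // Builds a parameterized skip/take query around an ordering key,
    // following the TOP / NOT EXISTS shape above.
    // Pure string construction, so it can be checked without a database.
    public static string BuildPageQuery(string table, string keyColumn)
    {
        return
            "SELECT TOP (@take) * FROM [" + table + "] AS [t0] " +
            "WHERE NOT EXISTS (" +
            "  SELECT NULL FROM (" +
            "    SELECT TOP (@skip) [" + keyColumn + "] FROM [" + table + "] AS [t1] " +
            "    ORDER BY [t1].[" + keyColumn + "]" +
            "  ) AS [t2] WHERE [t0].[" + keyColumn + "] = [t2].[" + keyColumn + "]) " +
            "ORDER BY [t0].[" + keyColumn + "]";
    }

    public static void WriteXmlPaged(string connectionString, string table,
                                     string keyColumn, int pageSize)
    {
        string sql = BuildPageQuery(table, keyColumn);
        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            connection.Open();
            int skip = 0, index = 0, rowsRead;
            do
            {
                using (SqlCommand command = new SqlCommand(sql, connection))
                {
                    command.Parameters.AddWithValue("@take", pageSize);
                    command.Parameters.AddWithValue("@skip", skip);
                    DataTable page = new DataTable(table);
                    using (SqlDataAdapter adapter = new SqlDataAdapter(command))
                        adapter.Fill(page);  // only this page is materialized
                    rowsRead = page.Rows.Count;
                    if (rowsRead > 0)
                        page.WriteXml(
                            string.Format(@"OutputXml\{0}_{1}.xml", table, index++),
                            XmlWriteMode.WriteSchema);
                }
                skip += pageSize;
            } while (rowsRead > 0);
        }
    }
}
```

Note that each page still forces the server to walk the skipped rows, so pages get progressively slower as skip grows; that is why the query needs an index on the ordering key to be tolerable at millions of rows.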