"Thread was being aborted" 0n large dataset
I am trying to process 114,000 rows in a dataset (populated from an oracle database). I am hitting an error at around the 600 mark - "Thread was being aborted".
All I am doing is reading the dataset, and I still hit the issue. Is this too much data for a dataset? It seems to load into the dataset ok, though. I welcome any better ways to process this amount of data.

rootTermsTable = entKw.GetRootKeywordsByCategory(catID);
for (int k = 0; k < rootTermsTable.Rows.Count; k++)
{
string keywordID = rootTermsTable.Rows[k]["IK_DBKEY"].ToString();
...
}
public DataTable GetKeywordsByCategory(string categoryID)
{
DbProviderFactory provider = DbProviderFactories.GetFactory(connectionProvider);
DbConnection con = provider.CreateConnection();
con.ConnectionString = connectionString;
DbCommand com = provider.CreateCommand();
com.Connection = con;
com.CommandText = string.Format("Select * From icm_keyword WHERE (IK_IC_DBKEY = {0})",categoryID);
com.CommandType = CommandType.Text;
DataSet ds = new DataSet();
DbDataAdapter ad = provider.CreateDataAdapter();
ad.SelectCommand = com;
con.Open();
ad.Fill(ds);
con.Close();
DataTable dt = new DataTable();
dt = ds.Tables[0];
return dt;
//return ds.Tables[0].DefaultView;
}
A few thoughts:
- Depending on what you are doing, a DataReader may be faster than a DataTable. A DataReader allows you to process one row at a time as it is read from the database. If you are doing a fair bit of processing on each row, then your current method (a DataTable) is likely the better approach, as it allows you to pull all rows into memory and process them without keeping the connection open.
- Consider "SELECT IK_DBKEY" instead of "SELECT *", as you will be pulling back less data over the wire and into memory.
- Finally, you might put a try/catch around your code to ensure it is the actual source of the error message. If you are using ASP.NET, "Thread was being aborted" is a pretty common error, which usually indicates that a processing request was interrupted (e.g. someone navigated away from a long-running request).
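A minimal sketch of that last point, reusing the loop from the question (`entKw`, `catID`, and the `LogError` helper are assumptions; `LogError` is hypothetical and stands in for whatever logging you already have). In ASP.NET, `ThreadAbortException` is what `Response.Redirect`, `Server.Transfer`, and request timeouts raise, so catching and logging it tells you whether the abort comes from your loop or from the request pipeline:

```csharp
try
{
    DataTable rootTermsTable = entKw.GetRootKeywordsByCategory(catID);
    for (int k = 0; k < rootTermsTable.Rows.Count; k++)
    {
        string keywordID = rootTermsTable.Rows[k]["IK_DBKEY"].ToString();
        // ... per-row processing ...
    }
}
catch (System.Threading.ThreadAbortException ex)
{
    // The request was aborted (timeout, redirect, or the user navigating
    // away). Log the stack trace so the real source is visible.
    LogError(ex); // hypothetical logging helper
    throw;        // the runtime re-raises ThreadAbortException regardless
}
```

If the exception shows up here, look at the ASP.NET request timeout (`httpRuntime executionTimeout`) rather than at the data access code itself.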
Is your SqlCommand possibly timing out? Try playing around with the CommandTimeout property to see if that's possibly a cause:
com.CommandTimeout = 300; // sets a timeout to complete the command of five minutes
First, reduce the amount of data that you retrieve by not using SELECT * in the query. Specify only the fields you actually need from the table. At least one field is not needed at all, as you already know its value.
Instead of using a DataTable, you can use a DataReader. That way you can process the records one by one, so you don't have to hold all of the records in memory at once.
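Putting both suggestions together, here is a sketch of the question's method rewritten around a `DbDataReader` with a narrower SELECT list (the `connectionProvider` and `connectionString` fields are assumed from the question; the `:categoryID` bind-parameter prefix is an assumption that holds for the Oracle providers but may differ for others). It also binds the category ID as a parameter instead of concatenating it into the SQL, which avoids SQL injection:

```csharp
using System.Data.Common;

public void ProcessKeywordsByCategory(string categoryID)
{
    DbProviderFactory provider = DbProviderFactories.GetFactory(connectionProvider);
    using (DbConnection con = provider.CreateConnection())
    {
        con.ConnectionString = connectionString;
        using (DbCommand com = con.CreateCommand())
        {
            // Only the column we need, with a bind parameter instead of
            // string.Format concatenation.
            com.CommandText =
                "SELECT IK_DBKEY FROM icm_keyword WHERE IK_IC_DBKEY = :categoryID";
            DbParameter p = com.CreateParameter();
            p.ParameterName = ":categoryID";
            p.Value = categoryID;
            com.Parameters.Add(p);

            con.Open();
            using (DbDataReader reader = com.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Only the current row is held in memory.
                    string keywordID = reader["IK_DBKEY"].ToString();
                    // ... process one row at a time ...
                }
            }
        } // the using blocks close and dispose everything even on error
    }
}
```

Note the trade-off the first answer mentions: the connection stays open for the whole loop, so this fits best when the per-row work is quick.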