Suggestions for optimizing C# code
I have written a generic database helper method which returns the records of a particular entity. Here is how I do it:
I have a class called Customer with 10 properties, plus a public field called TableName.
There is a method which takes a Type parameter and returns an array of the passed type.
The method works like this: using reflection it gets the table name and fires a select statement, then on the basis of the DataReader it loops through each column and property of the passed type.
So the problem is, suppose there are 1 million records and 10 properties. It loops 10 (properties) * 1,000,000 (records) = 10,000,000 times.
Is there an optimized way to do this, something like using LINQ against a DataReader?
Here is the code:
object[] LoadAll(Type type)
{
    // Read the table name off a throwaway instance of the type.
    object obj = Activator.CreateInstance(type);
    string tableName = type.GetField("TableName").GetValue(obj) as string;

    SqlConnection conn = new SqlConnection("connection string");
    SqlCommand cmd = conn.CreateCommand();
    cmd.CommandText = string.Format("select * from {0}", tableName);
    conn.Open();

    List<object> list = new List<object>();
    SqlDataReader reader = cmd.ExecuteReader();
    while (reader.Read())
    {
        object obj1 = Activator.CreateInstance(type);
        // This inner loop runs once per property for every record.
        foreach (PropertyInfo propertyInfo in type.GetProperties())
        {
            obj.GetType().GetProperty(propertyInfo.Name).SetValue(obj1, reader[propertyInfo.Name], null);
        }
        list.Add(obj1);
    }
    return list.ToArray();
}
Thanks.
Try an object-relational mapper like NHibernate.
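For illustration, loading every Customer with NHibernate looks roughly like this, assuming the Customer class has been mapped and an ISessionFactory (here called sessionFactory) has already been configured:
// Assumes Customer is mapped and sessionFactory is configured elsewhere.
using (ISession session = sessionFactory.OpenSession())
{
    IList<Customer> customers = session
        .CreateCriteria<Customer>()
        .List<Customer>();
}
The mapper generates the SQL and materializes the objects for you, so none of the per-row reflection code above is needed.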
Sounds like you are recalculating the entity's metadata for every instance. You should have a meta-data controller which caches it for each table name. Assuming I understand your problem right, for the same table name it will always be the same.
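A minimal sketch of that idea applied to the reflection cost above: cache the PropertyInfo array once per type, so GetProperties() is not re-evaluated for every record (the helper name here is made up for illustration):
static readonly Dictionary<Type, PropertyInfo[]> propertyCache =
    new Dictionary<Type, PropertyInfo[]>();

static PropertyInfo[] GetCachedProperties(Type type)
{
    PropertyInfo[] properties;
    if (!propertyCache.TryGetValue(type, out properties))
    {
        properties = type.GetProperties(); // reflect once per type only
        propertyCache[type] = properties;
    }
    return properties;
}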
If I understand the problem correctly, it sounds like you could simply make the DB do the work for you. You say "and fire a select statement": can you not fire a smarter select statement that does what you are explaining?
I don't fully understand what you are trying to do when you say that you are looping through each column. But look into "GROUP BY" and the aggregate operators and see if any of those can help you out.
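For example, rather than select * followed by client-side looping, a narrower or aggregating statement lets the server do the work (the table and column names here are illustrative):
// fetch only the columns you actually need
cmd.CommandText = "select Name, City from Customer";

// or let the server aggregate instead of looping client-side
cmd.CommandText = "select City, count(*) as CustomerCount from Customer group by City";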
From an optimization point of view, there is no need to keep a connection to the database open for 1 million records; as written, you are interacting with the database until your loop ends. Instead, cache the whole table in a DataSet and then iterate over that, rather than holding the connection to the database live for a long time. Hope this answers your question.
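A minimal sketch of that approach using SqlDataAdapter, which opens the connection, fills a DataTable, and closes the connection for you (the table name is illustrative):
DataTable table = new DataTable();
using (SqlDataAdapter adapter =
    new SqlDataAdapter("select * from Customer", "connection string"))
{
    adapter.Fill(table); // connection is opened and closed inside Fill
}
foreach (DataRow row in table.Rows)
{
    // map row["..."] values onto your object here
}
Note that pulling 1 million rows into memory at once has its own cost; whether this beats streaming through a DataReader depends on how you use the results.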
You could probably tighten up the loop a bit to reduce the calls involving reflection. You don't need to create that initial obj either:
PropertyInfo[] properties = type.GetProperties(); // reflect once, outside the loop
while (reader.Read())
{
    object obj = Activator.CreateInstance(type);
    foreach (PropertyInfo propertyInfo in properties)
    {
        propertyInfo.SetValue(obj, reader[propertyInfo.Name], null);
    }
    list.Add(obj);
}
But to get it even faster you could pass the LoadAll() function a way to map the row to a new object, something along the lines of:
IEnumerable<T> LoadAll<T>(Func<IDataReader, T> map) where T : new()
{
    string tableName = (string)typeof(T).GetField("TableName").GetValue(new T());
    using (SqlConnection conn = new SqlConnection("connection string"))
    using (SqlCommand cmd = conn.CreateCommand())
    {
        cmd.CommandText = string.Format("select * from {0}", tableName);
        conn.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
            while (reader.Read())
                yield return map(reader);
    }
}
// use it like:
var kittens = LoadAll<Kitten>(reader => new Kitten
{
    Name = (string)reader["Name"],
    Colour = (string)reader["Colour"]
});
This also gives you more control over the mapping from the data layer to your domain object. For example, the reflection-based method would take a lot of modification to handle an enum property, which is straightforward to code in an explicit map function.
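For instance, if Kitten had a hypothetical Mood enum property stored as a text column, the explicit map handles it in one line:
// Mood is a hypothetical enum stored as a string column
var kittens = LoadAll<Kitten>(reader => new Kitten
{
    Name = (string)reader["Name"],
    Mood = (Mood)Enum.Parse(typeof(Mood), (string)reader["Mood"])
});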
You could try installing the free trial of ReSharper; its inspection tools can suggest a variety of ways to optimize your code from the ground up.