SqlConnection open/close inside a loop: does it slow down or improve application performance?
I am querying a table containing 500,000 records for specific rows in my application. Every time I query it looks like this:
eg:

    // ... some code
    for i = 0; i < 60000
        // ... some code
        connection.Open();
        SELECT * FROM ERet WHERE [Key] = Variable + i;
        connection.Close();
        // ... some calculation
        // ... some code
    next
Will this connection.Open()/Close() inside the loop affect application performance? Should I open the connection before the loop and close it after the loop for better performance?
In this particular case, because you have a tight loop (or at least a loop performing a large number of operations in that scope), you should open the connection before you enter the loop and make sure to close it when the loop is done. For example:
using (var connection = new SqlConnection("connection string"))
{
    connection.Open();

    foreach (...)
    {
        // Do your work here.
    }
}
While connections might or might not be recycled/pooled (depending on your settings), there is still some overhead in pooling connections (they need to be reset when you pull them from a pool), and doing anything 60,000 times is going to have some overhead; you might as well take it out where you can and where you know it won't impact you negatively.
Also, as Mitch Wheat points out in his answer, an important question to ask is whether you have to perform 60,000 queries at all; from your code it appears you are performing the exact same query over and over, when just once might suffice, or you might be able to collect the conditions you need to query on into one query and then process your data.
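To illustrate collecting the conditions into one query: since each iteration looks up [Key] = Variable + i for consecutive values of i, the 60,000 lookups collapse into a single range query. This is only a sketch — the column SomeColumn is a placeholder, and it assumes [Key] is an integer column in the ERet table from the question:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;

static class ERetLoader
{
    // Sketch: replace 60,000 single-row queries with one range query.
    // Assumes the keys being looked up form the contiguous range
    // [variable, variable + 60000). "SomeColumn" is hypothetical.
    public static Dictionary<int, string> LoadRange(string connectionString, int variable)
    {
        var results = new Dictionary<int, string>();

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT [Key], SomeColumn FROM ERet " +
            "WHERE [Key] >= @low AND [Key] < @high", connection))
        {
            command.Parameters.AddWithValue("@low", variable);
            command.Parameters.AddWithValue("@high", variable + 60000);

            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                    results[reader.GetInt32(0)] = reader.GetString(1);
            }
        }
        return results;
    }
}
```

One round trip to the server instead of 60,000, and the per-iteration calculation can then run against the in-memory dictionary.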
You're creating 60,000 connections, and it doesn't appear you need to.
Do a

    using (open connection)
    {
        for (i = 1; i < 60000; ++i)
        {
            query
        }
    }
Now you've got one connection open for the life of your loop, and it will dispose nicely when it's done.
Yes, open the connection before the loop (in a using block) and close it after the loop completes.
On the other hand, why are you doing 60,000 SELECTs that way?
My recommendation is usually to measure what's going on, and see if you have a problem. It's entirely possible that the overhead of creating and closing the connection is trivial compared to the other things going on, and by refactoring your code to optimize for connections, you're missing (or even making worse) bigger issues.
For instance, if you can do your SELECT in a single statement and iterate over the result set, your application should go a lot faster. If you're worried about holding a recordset with 500,000 records in memory, test it.
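As a sketch of that single-statement approach (column names are placeholders, not from the original code): a SqlDataReader is forward-only and streams rows from the server, so the full 500,000-row result never has to sit in memory at once.

```csharp
using System.Data.SqlClient;

static class ERetProcessor
{
    // Sketch: one query, one pass over the rows, flat memory use.
    // "SomeColumn" is a hypothetical column name.
    public static void ProcessAll(string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT [Key], SomeColumn FROM ERet ORDER BY [Key]", connection))
        {
            connection.Open();

            // The reader streams rows as they arrive; memory stays
            // flat regardless of how many rows the query returns.
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    int key = reader.GetInt32(0);
                    // ... some calculation per row ...
                }
            }
        }
    }
}
```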
Optimizing without testing "before" and "after" is pointless, and often counterproductive.
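For that before/after measurement, something as simple as System.Diagnostics.Stopwatch will do; the method names below are placeholders for whichever two variants you are comparing:

```csharp
using System;
using System.Diagnostics;

static class Benchmark
{
    // Sketch: time a candidate implementation. The Action passed in
    // stands in for whatever variant is being measured.
    public static TimeSpan Measure(Action runQueries)
    {
        var stopwatch = Stopwatch.StartNew();
        runQueries();
        stopwatch.Stop();
        return stopwatch.Elapsed;
    }

    // Usage: compare the loop-per-query version against the
    // single-query version, and only keep the refactoring if the
    // numbers justify it.
    // Console.WriteLine(Measure(() => RunQueriesOldWay()));
    // Console.WriteLine(Measure(() => RunQueriesNewWay()));
}
```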