Most efficient way to run several SQL statements
I need to run a stored procedure for every item in a list (from an ASP.NET app). I have a framework of services which allows me to do this with minimal lines of code, but I'm afraid using that method will be too inefficient. Each time I call the service method, a database connection is opened, the stored procedure is run, and then the database connection is closed. It's safe and all, but calling that for 20 items might be a bit slow, and I need this to be as fast as possible.
What would be a more efficient way of doing this?
Well, if you are using ADO.NET, then even though you are "closing" the connection, the ADO.NET engine by default only puts the connection back into its cache (the connection pool) to be reused, so it isn't quite as expensive as you think...
But, if you want to avoid the round trip for each item, pass all the items in a pipe-delimited list (or use some other technique), and write a stored procedure that processes them all at once in a single set-based operation. That will get you the best performance.
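For example, the client side of that might look roughly like this (dbo.ProcessItems, connectionString, and items are made-up names, and the procedure itself would split @ItemList and do the work in one set-based statement):

// Sketch only: assumes a procedure dbo.ProcessItems that accepts a single
// pipe-delimited @ItemList parameter and splits it server-side.
// Requires System.Data and System.Data.SqlClient.
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand("dbo.ProcessItems", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;

    // Join the items into one delimited string so everything goes over in one call
    cmd.Parameters.AddWithValue("@ItemList", string.Join("|", items));

    conn.Open();
    cmd.ExecuteNonQuery();   // one round trip instead of one per item
}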
Do you know that there is a performance issue here at all?
I wouldn't worry about it unless you've actually found it to be a bottleneck. By default, connections to SQL Server from the System.Data.SqlClient.SqlConnection class (and, I'd imagine, other providers like Oracle) are pooled, so there's not a 1:1 correspondence between opening a SqlConnection and an actual database connection.
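To illustrate (a rough sketch; connectionString, items, and the procedure/parameter names are assumptions), even this naive open-per-item loop will normally keep reusing the same pooled physical connection:

// Each iteration "opens" and "closes" a SqlConnection, but with pooling
// enabled (the default) the same physical connection is typically reused.
foreach (string item in items)
{
    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand("YourProcedure", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.AddWithValue("@param1", item);   // hypothetical parameter name

        conn.Open();
        cmd.ExecuteNonQuery();
    }
}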
Strictly speaking, though, it does take some time (even if it might be negligible), so the only way to make it faster would be to either:
- Leave the connection open and reuse it (easy, but likely not going to make much of a difference)
- Prepare a single SQL batch that calls the procedure multiple times, reducing your round trips to the server. (This might actually have an impact, though probably not for only 20 items, but it's more difficult.)
For example:
// conn is assumed to be an open SqlConnection
using (SqlCommand cmd = conn.CreateCommand())
{
    cmd.CommandType = CommandType.Text;
    cmd.CommandText = @"
        exec YourProcedure @param1;
        exec YourProcedure @param2;
        exec YourProcedure @param3;";

    // each placeholder in the batch needs a matching parameter value
    // (value1..value3 stand in for your actual values)
    cmd.Parameters.AddWithValue("@param1", value1);
    cmd.Parameters.AddWithValue("@param2", value2);
    cmd.Parameters.AddWithValue("@param3", value3);

    cmd.ExecuteNonQuery();
}
Given that your number of items is likely dynamic, though, you'd essentially have to generate this SQL dynamically (though still parameterize all of your user-supplied values!).
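A rough sketch of that generation, assuming items is a List&lt;string&gt; of user-supplied values (the parameter naming scheme here is just an example):

// Build one batch with a uniquely named parameter per item,
// so every value still goes through a SqlParameter.
// Requires System.Text, System.Data and System.Data.SqlClient.
StringBuilder batch = new StringBuilder();
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = conn.CreateCommand())
{
    cmd.CommandType = CommandType.Text;

    for (int i = 0; i < items.Count; i++)
    {
        string paramName = "@p" + i;
        batch.AppendLine("exec YourProcedure " + paramName + ";");
        cmd.Parameters.AddWithValue(paramName, items[i]);
    }

    cmd.CommandText = batch.ToString();

    conn.Open();
    cmd.ExecuteNonQuery();   // one round trip for all items
}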
Twenty items really isn't that many even if you were working hard to do it inefficiently. If you're using SQL server and the ADO.NET provider, connection pooling will help you out automatically. I don't think you have too much to worry about.
- You can write the data to a table and process all of it with one call of the procedure.
- You can alter the client code to reuse the connection.
- If you retrieve the data via SQL, replace the procedure with a function (which can be invoked inline in the SQL) so the database engine can pipeline the calls.
- You can alter the procedure to take all the data as a single argument, so only one call is needed (see the sketch after this list).
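For that last option, one common approach on SQL Server 2008+ is a table-valued parameter. A minimal sketch, assuming you've created a table type dbo.ItemListType and a procedure dbo.ProcessItems that accepts it (both names are made up here):

// Sketch only: assumes on the server
//   CREATE TYPE dbo.ItemListType AS TABLE (Value nvarchar(100));
//   CREATE PROCEDURE dbo.ProcessItems @Items dbo.ItemListType READONLY ...
DataTable table = new DataTable();
table.Columns.Add("Value", typeof(string));
foreach (string item in items)
    table.Rows.Add(item);

using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand("dbo.ProcessItems", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;

    SqlParameter p = cmd.Parameters.AddWithValue("@Items", table);
    p.SqlDbType = SqlDbType.Structured;
    p.TypeName = "dbo.ItemListType";

    conn.Open();
    cmd.ExecuteNonQuery();   // all items processed in a single call
}

This keeps everything to one round trip and lets the procedure treat the incoming rows like an ordinary table.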