Why is this sql script so slow in SQL Server Management Studio?

I have a SQL Script that inserts about 8000 rows into a TABLE variable.

After inserting into that variable, I use a WHILE loop to loop over that table and perform other operations. That loop is perhaps 60 lines of code.

If I run the TABLE variable insert part of the script, without the while loop, it takes about 5 seconds. That's great.
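For reference, a minimal sketch of the pattern being described (all table and column names are hypothetical):

```sql
-- Hypothetical sketch of the script's structure
DECLARE @Work TABLE (
    RowId     INT IDENTITY(1,1) PRIMARY KEY,
    SomeValue INT
);

-- The fast part: ~8000 rows into the table variable
INSERT INTO @Work (SomeValue)
SELECT SomeColumn FROM dbo.SourceTable;

DECLARE @i INT = 1, @max INT;
SELECT @max = MAX(RowId) FROM @Work;

WHILE @i <= @max
BEGIN
    -- ~60 lines of per-row work here
    SET @i = @i + 1;
END
```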

However, if I run the entire script, it takes about 15 minutes.

Here's what is interesting and what I can't figure out:

When I run the entire script, I don't see any print statements until many minutes into the script.

Then, once it figures out what to do (presumably), it runs the inserts into the table var, does the loop, and that all goes rather fast.

Then, toward the end of the loop, or even after it, it sits and hangs for many more minutes. Finally, it chugs through the last few lines or so of the script that come after the loop.

I can account for all the time taken during the insert, and then all the time taken in the loop. But I can't figure out why it appears to be hanging for so many minutes before and at the end of the script.

For kicks, I added a GO statement after the insert into the table variable, and everything up to that point ran as you'd expect; however, I can't do that because I need that variable, and the GO statement obviously kills it.
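A sketch of why GO doesn't work here: GO is a client-side batch separator, and local variables (including table variables) do not survive across batches.

```sql
-- Hypothetical illustration; @Work stands in for the real table variable
DECLARE @Work TABLE (Id INT);
INSERT INTO @Work (Id) VALUES (1);
GO  -- ends the batch; @Work goes out of scope here

SELECT * FROM @Work;  -- fails: "Must declare the table variable @Work"
```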

I believe I'm going to stop using the table variable and go with a real table so that I can issue the GO, but I would really like to know what's going on here.

Any thoughts on what SQL Server is up to during that time?

Thanks!


You can always check what a script is doing from the Activity Monitor or from the sys.dm_exec_requests view. The script will be blocked by something, and you'll be able to see what it is that's blocking it in the wait_type and wait_resource columns.
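For example, a query along these lines, run from a second connection while the script is executing, shows the wait information for active requests:

```sql
-- Inspect what currently running requests are waiting on
SELECT r.session_id,
       r.status,
       r.command,
       r.wait_type,
       r.wait_time,
       r.wait_resource,
       r.blocking_session_id
FROM sys.dm_exec_requests AS r
WHERE r.session_id <> @@SPID;  -- exclude this monitoring query itself
```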

There are several likely culprits, like waits on row locks or table locks, but from the description of the problem I suspect it is a database or log file growth event. Those tend to be very expensive once the database is big enough, because the default 10% increment then means growth of GBs at a time. If that's the case, try to pre-size the database at the required size and make sure Instant File Initialization is enabled for the data files.
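Checking and fixing the growth settings might look roughly like this (database and file names are placeholders):

```sql
-- Inspect current file sizes and autogrowth settings for the current database
SELECT name,
       size * 8 / 1024 AS size_mb,   -- size is in 8 KB pages
       growth,
       is_percent_growth
FROM sys.database_files;

-- Pre-size the log and switch to a fixed-MB growth increment
ALTER DATABASE MyDb
MODIFY FILE (NAME = MyDb_log, SIZE = 4096MB, FILEGROWTH = 512MB);
```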


PRINT output is buffered, so you can't judge progress or performance from it.
Use RAISERROR ('Message', 0, 1) WITH NOWAIT to see the output immediately.
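For example:

```sql
-- Buffered: may not appear in the Messages tab until much later in the batch
PRINT 'Finished inserting rows';

-- Severity 0 is informational; WITH NOWAIT flushes the message immediately
RAISERROR ('Finished inserting rows', 0, 1) WITH NOWAIT;
```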

To understand what the process is doing, I'd begin by calling sp_who2 a few times and looking at the values for the process of interest: is it being blocked, what are the wait types, if any, and so on. Also, just looking at the server hardware load (CPU, disk activity) might help (unless there are other active processes).
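For instance, from another query window:

```sql
EXEC sp_who2;   -- check the BlkBy column for the SPID running the script

-- A more targeted view of current waits (session IDs above 50 are
-- usually user sessions, though that cutoff is only a convention)
SELECT session_id, wait_type, wait_duration_ms, blocking_session_id
FROM sys.dm_os_waiting_tasks
WHERE session_id > 50;
```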

And please post some code. The table variable definition and the loop will be enough, I believe; no need for the INSERT stuff.


If you are using the table variable, can you try substituting it with a temp table and see if there is any change in performance?
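The substitution is largely mechanical; roughly (schema hypothetical):

```sql
-- Before: table variable
-- DECLARE @Work TABLE (Id INT, SomeValue INT);

-- After: temp table (can be indexed, has statistics, and survives GO)
CREATE TABLE #Work (Id INT, SomeValue INT);

INSERT INTO #Work (Id, SomeValue)
SELECT Id, SomeValue FROM dbo.SourceTable;

-- ... same WHILE loop, reading from #Work instead of @Work ...

DROP TABLE #Work;
```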

And if possible, please post the code so that it can be analysed for possible areas of interest.


From the wording of your question, it sounds like you're using a cursor to loop through the table. If this is the case, issuing a "SET NOCOUNT ON" command before starting the loop will help.
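For example, at the top of the script:

```sql
-- Suppresses the "(1 row affected)" message after every statement; without
-- this, a loop over ~8000 rows sends thousands of such messages to SSMS
SET NOCOUNT ON;
```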

The table variable was mentioned in a previous answer, but as a general rule, you should really use a temp table if you have more than a few rows.
