How can I avoid SQL Server's statistics rebuild while doing performance testing?
I'm doing some SQL tuning these days and found one odd query during the test:
SELECT StatMan([SC0],[SC1], [SB0000])
FROM (SELECT TOP 100 PERCENT [SC0],[SC1], step_direction([SC0]) over (order by NULL) AS [SB0000]
FROM (SELECT [tableA] AS [SC0],[tableB] AS [SC1]
FROM [dbo].[url] WITH (READUNCOMMITTED,SAMPLE 3.408654e+000 PERCENT)
) AS _MS_UPDSTATS_TBL_HELPER
ORDER BY [SC0],[SC1], [SB0000]
) AS _MS_UPDSTATS_TBL
OPTION (MAXDOP 1)
It looks like SQL Server is rebuilding some index or index statistics here. My question is how we can avoid this during a long load test, other than rebuilding the indexes on each table before testing.
This one query takes 16862 ms because my table contains enough rows, and my test performs many inserts.
This seems to be from updating statistics.
Will updating statistics happen in a normal production environment? If so, shouldn't a load test, to reflect a production environment, update statistics as well?
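To see whether automatic statistics updates are in fact firing on the table, you can check when each statistic was last refreshed. A minimal sketch, assuming the `dbo.url` table from the question:

```sql
-- List each statistics object on dbo.url and when it was last updated.
-- A recent timestamp during the load test points at AUTO_UPDATE_STATISTICS.
SELECT s.name                               AS stats_name,
       STATS_DATE(s.object_id, s.stats_id)  AS last_updated
FROM sys.stats AS s
WHERE s.object_id = OBJECT_ID('dbo.url');
```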
To turn off the AUTO_UPDATE_STATISTICS behavior for specific tables, use sp_autostats on the desired table(s) (see http://msdn.microsoft.com/en-us/library/ms188775.aspx).
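For example, a sketch using the `dbo.url` table from the question:

```sql
-- Disable automatic statistics updates for this one table only,
-- so the load test is not interrupted by StatMan queries.
EXEC sp_autostats 'dbo.url', 'OFF';

-- Re-enable it once the test is finished.
EXEC sp_autostats 'dbo.url', 'ON';
```

This is narrower than turning AUTO_UPDATE_STATISTICS off for the whole database, so other tables keep their normal behavior.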
Before changing anything, I would let the test run as is with SQL Profiler, then have the Database Engine Tuning Advisor analyze the results. It can find inefficiencies in table design. Some pretty innocent-looking things can also hurt: e.g. having too many indexes, or using a GUID as the PK on a table that has more inserts than selects...
If all inefficiencies are resolved and you still see a performance hit, then yes, you can turn auto-stats off and update statistics at off-peak times. But then you must do the same in production, as the previous answer said.
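A sketch of the off-peak manual update, again assuming the `dbo.url` table from the question (typically run from a scheduled job):

```sql
-- Manually refresh all statistics on the table during an off-peak window,
-- so the optimizer still gets fresh statistics without the mid-test sampling.
UPDATE STATISTICS dbo.url WITH FULLSCAN;
```

WITH FULLSCAN reads the whole table instead of sampling (the SAMPLE ... PERCENT seen in the question's StatMan query), which is slower but acceptable off-peak and gives more accurate statistics.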