Why does the duration of a unit test result not equal the difference between start and end time?

I've written a test for a long-running process. When it completes, I get the following displayed in the test results:

Test Run: [blah]
Test Name: PopulateDataTest
Result: Passed
Duration: 00:03:17.0017261
Computer Name: [name here]
Start Time: 3/8/2011 12:54:18 PM
End Time: 3/8/2011 1:02:31 PM

Doing some math on the start and end times, I get about 8 minutes (1:02:31 PM − 12:54:18 PM = 00:08:13), not 3. What am I missing?


In addition to what @Matt Spinelli said, I suspect the reported Duration reflects the time the CPU actually spent executing the test, rather than End Time minus Start Time. That is, I think CPU time is the metric being reported, since it's the more meaningful one for a test. If your computer happens to start updating Adobe Acrobat (or whatever) during the test, that delay would show up in End Time minus Start Time even though it has nothing to do with the test itself.
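
To illustrate the distinction I mean, here's a rough sketch in plain C# (a hypothetical console program, not the test framework itself) that measures the same work two ways: wall-clock time with a Stopwatch, and CPU time from Process.TotalProcessorTime:

    using System;
    using System.Diagnostics;
    using System.Threading;

    class TimingSketch
    {
        static void Main()
        {
            var process = Process.GetCurrentProcess();
            TimeSpan cpuBefore = process.TotalProcessorTime;
            var wallClock = Stopwatch.StartNew();

            // Simulate a long-running operation that spends part of its time
            // waiting (e.g. on I/O or a database) rather than burning CPU.
            Thread.Sleep(2000);                      // wall-clock time, almost no CPU
            long sum = 0;
            for (int i = 0; i < 50_000_000; i++)     // CPU-bound work
            {
                sum += i;
            }

            wallClock.Stop();
            process.Refresh();
            TimeSpan cpuUsed = process.TotalProcessorTime - cpuBefore;

            // The wall-clock figure includes the sleep; the CPU figure mostly
            // does not. That is the kind of gap I suspect is at play.
            Console.WriteLine("Wall clock: " + wallClock.Elapsed);
            Console.WriteLine("CPU time:   " + cpuUsed);
            Console.WriteLine("(checksum " + sum + ")"); // keeps the loop from being optimized away
        }
    }

For CPU-bound work on an idle machine the two figures come out close; any time spent sleeping, waiting on I/O, or pre-empted by other processes widens the gap.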

On the other hand, I have little knowledge of the Visual Studio testing framework, and I don't know how the test mechanism would account for the execution time of multi-threaded code, for example. So, coming from me, this is all just speculation.


MSTest has some initialization that goes on each time you run the test suite (e.g. creating folders for the test run, copying files/assemblies, starting and updating the unit testing pane, etc.).

I agree that this is irritating, as I've seen this behavior too. I presume you have a fairly large number of tests, files, and/or assemblies. If you're using Microsoft Moles, that also seems to slow down test-run startup.
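
To see where the time is going, you could time the test body yourself and compare it with what MSTest reports. A rough sketch follows (the attributes are standard MSTest, but the class and timings are hypothetical); I'd expect run-level setup such as deployment or [ClassInitialize] to sit outside an individual test's Duration, though I haven't verified exactly how MSTest attributes it:

    using System;
    using System.Diagnostics;
    using System.Threading;
    using Microsoft.VisualStudio.TestTools.UnitTesting;

    [TestClass]
    public class PopulateDataTests
    {
        // Runs once before any test in this class. Hypothetical stand-in for
        // the kind of per-run setup (deployment, fixture creation) that happens
        // outside the body of an individual test.
        [ClassInitialize]
        public static void ClassSetup(TestContext context)
        {
            Thread.Sleep(5000); // pretend this is expensive one-time setup
        }

        [TestMethod]
        public void PopulateDataTest()
        {
            // Time the body yourself and compare it with the Duration that
            // MSTest reports for this test.
            var sw = Stopwatch.StartNew();

            Thread.Sleep(1000); // pretend this is the real long-running work

            sw.Stop();
            Console.WriteLine("Measured inside the test: " + sw.Elapsed);
        }
    }

If the Stopwatch figure roughly matches the reported Duration but not End Time minus Start Time, the difference is coming from the run-level overhead rather than the test itself.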
