How slow is too slow for unit tests?
Michael Feathers, in Working Effectively with Legacy Code (pages 13-14), writes:
A unit test that takes 1/10th of a second to run is a slow unit test... If [unit tests] don't run fast, they aren't unit tests.
I can understand why 1/10th of a second is too slow if one has 30,000 tests, as a full run would take close to an hour (30,000 × 0.1 s = 3,000 s, or 50 minutes). However, does this mean 1/11th of a second is any better? Not really: that only shaves about four and a half minutes off the total. So a hard-and-fast rule probably isn't perfect.
Thus, when considering how slow is too slow for a unit test, perhaps I should rephrase the question: how long is too long for a developer to wait for the unit test suite to complete?
To give an example of test speeds, take a look at the durations (in seconds) of several MSTest unit tests:
0.2637638
0.0589954
0.0272193
0.0209824
0.0199389
0.0088322
0.0033815
0.0028137
0.0027601
0.0008775
0.0008171
0.0007351
0.0007147
0.0005898
0.0004937
0.0004624
0.00045
0.0004397
0.0004385
0.0004376
0.0003329
The average across all 21 of these unit tests comes to 0.019785 seconds. Note that the slowest test is slow because it uses Microsoft Moles to mock/isolate the file system.
So with this example, if my unit test suite grows to 10,000 tests, it could take over 3 minutes to run.
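Spelling out the arithmetic behind that estimate: 10,000 tests × 0.019785 s/test ≈ 198 seconds, or roughly 3.3 minutes.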
I've looked at one such project where the sheer number of unit tests made a full run take too long. "Too long" meaning that you basically didn't do it as part of your normal development routine.
However, what they had done was to categorize the unit tests into two groups: critical tests, and "everything else".
Critical tests took just a few seconds to run, and tested only the most critical parts of the system, where "critical" here meant "if something is wrong here, everything is going to be wrong".
Tests that made the entire run take too long were relegated to the "everything else" section and were only run on the build server.
Whenever someone committed code to the source control repository, the critical tests would run first, and then a "full run" was scheduled a few minutes into the future. If nobody checked in code during that interval, the full tests were run. Granted, they didn't take 30 minutes, more like 8-10.
This was done using TeamCity, so even if one build agent was busy with the full unit test suite, the other build agents could still pick up normal commits and run the critical unit tests as often as needed.
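For what it's worth, here is a minimal sketch of how such a split could be expressed today with JUnit 5 tags. The tag name, test class, and test bodies are my own illustration, not the project described above:

```java
import org.junit.jupiter.api.Tag;
import org.junit.jupiter.api.Test;

import static org.junit.jupiter.api.Assertions.assertEquals;

class AccountParsingTest {

    @Tag("critical") // fast suite: run on every commit
    @Test
    void parsesAccountNumber() {
        assertEquals(42, Integer.parseInt("42"));
    }

    @Test // untagged: "everything else", run only on the build server
    void exhaustiveSlowScenario() throws InterruptedException {
        Thread.sleep(500); // stand-in for genuinely slow work
        assertEquals(4, 2 + 2);
    }
}
```

With Maven Surefire, the fast suite can then be selected with something like mvn test -Dgroups=critical, while the build server runs everything.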
I've only ever worked on projects where the test suite took at least ten minutes to run. On the bigger ones, it was more like hours. And we sucked it up and waited, because the tests were pretty much guaranteed to find at least one problem in anything you threw at them. The projects were that big and hairy.
I wanna know what these projects are that can be tested comprehensively in seconds.
(The secret to getting things done when your project's unit tests take hours is to have four or five things you're working on at the same time. You throw one set of patches at the test suite and you task-switch, and by the time you're done with the thing you switched to, maybe your results have come back.)
I've got unit tests that take a few seconds to execute. I've got a method which does very complicated computing with billions and billions of operations. There are a few known good values that we use as the basis for unit testing when we refactor this tricky and uber-fast method (which we must optimize the crap out of because, as I said, it performs billions and billions of computations).
Rules don't adapt to every domain / problem space.
We can't "divide" this method into smaller methods that we could unit test: it is a tiny but very complicated method (making use of insanely huge precomputed tables that can't be re-created fast enough on the fly etc.).
We have unit tests for that method. They are unit tests. They take seconds to execute. It is a Good Thing [TM].
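To illustrate the known-good-values approach (every name and value below is a made-up stand-in, not the poster's actual routine), such a test might look like this:

```java
import org.junit.jupiter.api.Test;

import static org.junit.jupiter.api.Assertions.assertEquals;

class GoldenValueTest {

    // Stand-in for the tiny, heavily optimized numeric routine.
    static double compute(double x) {
        return Math.expm1(x) / x; // placeholder computation
    }

    @Test
    void matchesKnownGoodValues() {
        // Golden values recorded from a trusted earlier version of the routine.
        double[][] golden = {
            {0.5, 1.2974425414002564},
            {1.0, 1.718281828459045},
            {2.0, 3.194528049465325},
        };
        for (double[] pair : golden) {
            assertEquals(pair[1], compute(pair[0]), 1e-12,
                "regression at x=" + pair[0]);
        }
    }
}
```

The point is that the golden values pin down the observable behavior, so the implementation can be optimized aggressively and the tests will catch any regression.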
Now of course I don't dispute that you can use unit testing libraries like JUnit for things that aren't unit testing: for example, we also use JUnit to test complex multi-threaded scenarios. Those aren't "unit tests", but you can bet that JUnit still rules the day :)
EDIT: See my comment to another answer (Link). Please note that there was a lot of back and forth about unit testing, so before you decide to upvote or downvote this answer, please read all the comments on that answer.
Next, use a tool like Mighty-Moose (Mighty-Moose itself was abandoned, but there are other such tools) that runs only the tests affected by your code change (instead of your entire test suite) every time you check in a file.
So what's your question? :-) I agree, the true metric here is how long developers have to wait for a complete run of the unit tests. Too long and they'll start cutting corners before committing code. I'd like to see a complete commit build take less than a minute or two, but that's not always possible. At my work, a commit build used to take 8 minutes and people just started only running small parts of it before committing - so we bought more powerful machines :-)
How long is too long for a developer to wait for the unit test suite to complete? It really depends on how long the devs are happy to wait for feedback on their change. I'd say if you start talking minutes then it's too slow, and you should probably break up the test suite into individual test projects and run them separately.