Finding patterns of failure in a Unit Test
I'm new to unit testing, and I'm only just getting into the routine of building test suites. I have what is going to be a rather large project, and I want to build tests for it from the start.
I'm trying to figure out general strategies and patterns for building test suites. When you look at a class, many tests suggest themselves simply from the nature of the class. Take a "user account" class with basic CRUD operations, backed by a database table: we will want to test, well, the CRUD (see the sketch after this list):
- creating an object and seeing whether it exists
- querying its properties
- changing some properties
- changing some properties to incorrect values
- and deleting it again.
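A minimal PHPUnit sketch of those CRUD checks; `UserAccount` and its `create()`/`find()`/`save()`/`delete()` methods are hypothetical stand-ins for whatever your class actually exposes:

```php
<?php
use PHPUnit\Framework\TestCase;

// UserAccount is a hypothetical CRUD class backed by a database table.
class UserAccountCrudTest extends TestCase
{
    public function testCreateAndRead(): void
    {
        $account = UserAccount::create(['name' => 'alice']);
        $this->assertNotNull($account->getId());          // it exists
        $this->assertSame('alice', $account->getName());  // query a property
    }

    public function testUpdate(): void
    {
        $account = UserAccount::create(['name' => 'alice']);
        $account->setName('bob');                         // change a property
        $account->save();
        $reloaded = UserAccount::find($account->getId());
        $this->assertSame('bob', $reloaded->getName());
    }

    public function testDelete(): void
    {
        $account = UserAccount::create(['name' => 'alice']);
        $id = $account->getId();
        $account->delete();
        $this->assertNull(UserAccount::find($id));        // gone again
    }
}
```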
As for how to break things, there are "fail" tests common to most CRUD classes (see the data-provider sketch after this list), like:
- Invalid input data types
- A number as the ID key that exceeds the range of the chosen data type
- Input in an incorrect character encoding
- Input that is too long
And so on and so on.
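PHPUnit's data providers are a natural fit for tables of breaking input like this: the same test body runs once per row. A sketch, again against the hypothetical `UserAccount` class (on PHPUnit 10+ you would declare the provider with the `#[DataProvider]` attribute instead of the annotation):

```php
<?php
use PHPUnit\Framework\TestCase;

class UserAccountFailureTest extends TestCase
{
    /**
     * Each row is one "breaking pattern" applied to the ID key.
     * The expected exception type is an assumption; use whatever
     * your class actually throws.
     *
     * @dataProvider invalidIdProvider
     */
    public function testFindRejectsInvalidIds($invalidId): void
    {
        $this->expectException(InvalidArgumentException::class);
        UserAccount::find($invalidId); // hypothetical lookup method
    }

    public static function invalidIdProvider(): array
    {
        return [
            'wrong data type'      => ['not-a-number'],
            'exceeds column range' => [PHP_INT_MAX],      // larger than e.g. a 32-bit ID column
            'negative id'          => [-1],
            'overlong input'       => [str_repeat('9', 10000)],
        ];
    }
}
```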
For a unit test concerned with file operations, the list of "breaking things" could be (again, sketched below):
- Invalid characters in file name
- File name too long
- File name uses incorrect protocol or path
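The same data-provider pattern transfers directly to file operations; `FileStore::open()` here is a hypothetical wrapper that is expected to reject bad names:

```php
<?php
use PHPUnit\Framework\TestCase;

class FileStoreFailureTest extends TestCase
{
    /**
     * @dataProvider badFileNameProvider
     */
    public function testOpenRejectsBadFileNames(string $name): void
    {
        $this->expectException(InvalidArgumentException::class);
        FileStore::open($name); // hypothetical file wrapper
    }

    public static function badFileNameProvider(): array
    {
        return [
            'invalid character'  => ["report\0.txt"],                  // NUL byte in name
            'name too long'      => [str_repeat('a', 5000) . '.txt'],
            'wrong protocol'     => ['ftp://example.com/report.txt'],
            'path traversal'     => ['../../etc/passwd'],
        ];
    }
}
```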
I'm pretty sure similar patterns, applicable beyond the unit one is currently testing, can be found for most units under test.
Now my question is:
Am I correct in seeing such "breaking patterns"? Or am I getting something completely wrong about unit testing, so that if I did it right this wouldn't be an issue at all? Is approaching unit testing as a process of finding as many ways to break the unit as possible the right way to go?
If I am correct: Are there existing definitions, lists, cheat sheets for such patterns?
Are there any provisions (mainly in PHPUnit, as that's the framework I'm working in) to automate such patterns?
Is there any assistance (in the form of checklists or software) to aid in writing complete tests?
You are basically right. Looking for ways that could possibly break your code is a key part of unit testing, and a key skill. However, unit testing as applied in TDD works a bit differently: in TDD you first write a test for a new piece of functionality, then write the code to make that test pass. So the emphasis is different, even though the end result is similar.
In TDD, you "change hats" constantly: a little bit of testing, a little bit of coding. In this method, testing is not an automatable afterthought; one could almost say it is the key to the creative process. While writing tests, you are also designing the interface of your unit and thinking from the point of view of its (future) clients: what can they expect, and what are they supposed to provide? Then you switch hats and go inside the unit to fulfill those expectations.
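To make the hat-changing concrete, a sketch of one red-green step; `lock()` and `canLogIn()` are hypothetical methods that do not exist yet at the moment the test is written:

```php
<?php
use PHPUnit\Framework\TestCase;

// Step 1 (red): write the test first. It fails, because UserAccount
// has no lock() or canLogIn() yet - the test is designing that interface.
class UserAccountLockTest extends TestCase
{
    public function testLockedAccountRefusesLogin(): void
    {
        $account = UserAccount::create(['name' => 'alice']);
        $account->lock();
        $this->assertFalse($account->canLogIn());
    }
}

// Step 2 (green): switch hats and implement just enough inside
// UserAccount (a lock() method and a canLogIn() check) to make it pass.
```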
So I don't feel this can be replaced by simply checking items off a list. Of course, once you run out of ideas for testing the actual unit, it never hurts to consult such a list. However, such sheets by their nature can only contain generalizations, which may or may not apply to a specific project and a specific class under test. But you apparently have the experience and the mindset to find good test cases for your specific units anyway :-)