Update, 30 Nov: Microsoft has removed their guidelines, saying, "It's a bug and we're fixing it."
Sam Gentile pointed me in the direction of Microsoft's new Guidelines for Test-Driven Development (TDD). Apparently there's already been quite a reaction to it. Sam references some blog entries and I want to make the point even more clear.
Microsoft has completely missed the point of TDD. They got it wrong. Do not follow their guidelines: they will decrease productivity. You'll find that the process they've described doesn't work. If you stick with it, you'll find yourself writing increasingly bad code to work around its problems.
A year or two ago, my brother told me that he tried TDD and "it didn't work." Naturally, I had to find out more. "What did you do?" I asked.
"First I designed my class," he said, "then I wrote a bunch of tests, then I tried to write my class to pass the tests." (Does this sound familiar? It's the process Microsoft is recommending.)
I could see the problem already. "That doesn't sound like TDD to me. What happened? Why do you say it didn't work?"
"It was a complete waste of time," he said, frustrated. "I spent days writing the tests when I could have been productively writing code. Once I finally started writing code, the tests didn't fit quite right and I had to change them. Or I would have if I didn't have real work to do. I ended up throwing away a bunch of tests and the end result didn't really test anything important. I wasted nearly a week on just one class and it didn't get me anything useful."
I started to explain how TDD really works, but my brother didn't want to hear it. He had already decided that TDD was a waste of time. And he was right, in a way. The thing he was doing, that he called TDD, was a waste of time. It decreased his productivity and didn't work. Good thing he didn't stick with it.
Real TDD is well documented, but I'll describe it anyway. It's a highly iterative process in which you design, test, and code more or less at the same time. You start out by thinking about what features your class should support and what tests you need to get there, true. Then, in very rapid and short cycles, you write tests, code, and improve the design. This is famously described as "red-green-refactor." I call out another step at the beginning: "think."
- Think: Figure out what test will best move your code towards completion. (Take as much time as you need. This is the hardest step for beginners.)
- Red: Write a very small amount of test code. Only a few lines... usually no more than five. Run the tests and watch the new test fail: the test bar should turn red. (This should only take about 30 seconds.)
- Green: Write a very small amount of production code. Again, usually no more than five lines of code. Don't worry about design purity or conceptual elegance. Sometimes you can just hardcode the answer. This is okay because you'll be refactoring in a moment. Run the tests and watch them pass: the test bar will turn green. (This should only take about 30 seconds, too.)
- Refactor: Now that your tests are passing, you can make changes without worrying about breaking anything. Pause for a moment. Take a deep breath if you need to. Then look at the code you've written, and ask yourself if you can improve it. Look for duplication and other "code smells." If you see something that doesn't look right, but you're not sure how to fix it, that's okay. Take a look at it again after you've gone through the cycle a few more times. (Take as much time as you need on this step.) After each little refactoring, run the tests and make sure they still pass.
- Repeat: Do it again. You'll repeat this cycle dozens of times in an hour. Typically, you'll run through several cycles (three to five) very quickly, then find yourself slowing down and spending more time on refactoring. Then you'll speed up again. 20-40 cycles in an hour is not unreasonable.
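To make the cycle concrete, here's a minimal sketch in Python of what a few of those cycles might leave behind. The `Stack` class and its tests are invented for this illustration; the comments mark which cycle each piece came from. In real TDD you'd see each test fail (red) before writing the few lines that make it pass (green), then pause to refactor.

```python
# Hypothetical result of four think/red/green/refactor cycles,
# growing a tiny Stack class test-first, a few lines per cycle.
import unittest


class Stack:
    """Production code, accumulated one small green step at a time."""

    def __init__(self):
        # Refactor step: an earlier hardcoded version was replaced
        # with a real list once a second test forced it.
        self._items = []

    def is_empty(self):
        return len(self._items) == 0

    def push(self, item):
        self._items.append(item)

    def pop(self):
        if self.is_empty():
            raise IndexError("pop from empty stack")
        return self._items.pop()


class StackTest(unittest.TestCase):
    # Cycle 1 -- think: what's the simplest useful behavior? Emptiness.
    def test_new_stack_is_empty(self):
        self.assertTrue(Stack().is_empty())

    # Cycle 2 -- red: this failed until push() existed.
    def test_stack_with_one_item_is_not_empty(self):
        stack = Stack()
        stack.push("x")
        self.assertFalse(stack.is_empty())

    # Cycle 3 -- red: forced pop() to return the last item pushed.
    def test_pop_returns_last_pushed_item(self):
        stack = Stack()
        stack.push("a")
        stack.push("b")
        self.assertEqual("b", stack.pop())

    # Cycle 4 -- red: forced the empty-stack error case.
    def test_pop_on_empty_stack_raises(self):
        with self.assertRaises(IndexError):
            Stack().pop()


if __name__ == "__main__":
    unittest.main(exit=False)
```

Each test above was written *before* the handful of production lines that satisfies it... which is the whole point. Writing all four tests up front against a guessed design is the mistake described next.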
This process works well for two reasons. First, you're working in baby steps, constantly forming hypotheses and checking them. ("The bar should turn red now... now it should turn green... now it should still be green... now it should be red...") Whenever you make a mistake, you catch it right away. It's only been a few lines of code since you made the mistake, which makes the mistake very easy to find and fix. We all know that finding mistakes, not fixing them, is the most expensive part of programming.
The other reason this process works well is that you're always thinking about design. Either you're deciding which test you're going to write next, which is an interface design process, or you're deciding how to refactor, which is a code design process. All of this thought on design is immediately tested by turning it into code, which very quickly shows you if the design is good or bad.
Microsoft's process, on the other hand, won't work well. On the practical front, it's boring: nobody wants to spend hours or days writing tests. It also requires too much accuracy. You'll learn new things about your design as you write code, and if you've already written all of the tests, you'll have to keep changing them. Either you'll give up on the tests or you'll waste extra time up front trying to predict every detail of your design... essentially duplicating the effort of coding, only in pictures.
Worse, though, Microsoft's process misses the two biggest advantages of TDD. It doesn't work in baby steps, so you won't catch and fix mistakes quickly. And it doesn't increase the amount of time you spend thinking about design. It doesn't even include refactoring!
The saddest thing about Microsoft's TDD foul-up is that TDD is well understood and very well documented. Anybody who understands TDD could tell you what I've written above. A minimum amount of research would have allowed Microsoft to learn it, too. Whoever wrote those guidelines should be ashamed of themselves. Propagating ignorance on such a grand scale is a travesty. Microsoft, if you're going to provide guidance to millions of programmers, you should at least recognize when you're clueless... and keep your damn mouth shut. I'm appalled.
PS: When you're writing unit tests, be sure to follow Michael Feathers' guidelines.