I got some pushback from a commenter on my suggestion that debugging skills are more important than unit tests. He disagreed, suggesting that by writing unit tests you can all but avoid the use of a debugger.
This is the original post.
I graciously accept that we might not come to agreement on this.
Writing unit tests does not mean you can get by with weaker debugging skills. In fact, what if the error is in your test?
What about operations driven by meta-data created on the fly at run time, altering program flow and logic? Or by third-party inputs?
What if you use composition patterns and allow several layers of dynamic composition? Would you write a test for every mutation? Good job security, but not practical.
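To make the job-security joke concrete, here's a toy sketch (hypothetical wrapper names) of why "a test for every mutation" blows up: with N interchangeable layers composed dynamically, every ordering is a distinct behaviour, and the count grows factorially.

```python
from itertools import permutations

# Three interchangeable text-processing layers.
def trim(s):      return s.strip()
def lower(s):     return s.lower()
def collapse(s):  return " ".join(s.split())

def compose(*fns):
    """Chain functions left to right into one pipeline."""
    def pipeline(x):
        for fn in fns:
            x = fn(x)
        return x
    return pipeline

wrappers = [trim, lower, collapse]
# Each ordering is a different composition you could, in theory, test.
orderings = list(permutations(wrappers))
print(len(orderings))  # 3! = 6; with 8 layers it's already 40,320
```

And that's before the layers are chosen at run time from meta-data, where the set of compositions isn't even known when you're writing the tests.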
What happens when a library you've been given to use isn't working per the documentation?
What about when a library is being used for something or in a way that wasn't conceived in the original design?
What if you have to maintain a code base that wasn’t developed with TDD in mind and is next-to-impossible to write unit tests for?
While I do mocking and unit tests and am a staunch supporter of behaviour-based design and TDD, you cannot predict or test for every pre-condition that might crop up. Even if you're really good and you write LOTS of tests, you might miss some.
And when that happens, you'd better know your head from your buttocks in your debugger!
It is these conditions that may, down the road, warrant more complex tests covering such scenarios.
I would argue, however, that there are times – especially in a smaller shop – when you need to pull out a Swiss Army knife and leave the combat tank at home.