My suggestion originally was to write tests for classes "when touched" -- if you change the code... you must write a unit test as part of that change.
And I think that's a fine suggestion. You might expand on that to ensure that the new unit test covers the whole class being modified instead of just the local change, but the idea of moving forward on an incremental, as-needed basis is dead on.
How do you explain that it's OK for this class to have no test, but you MUST write a test for your new code?
It ain't a perfect world.
Unless there are huge advantages to stopping new development in favor of retrofitting unit tests (maybe due to government regulations, or an impending lawsuit, or something like that), it's probably not cost-effective to do so. So you do the next best thing: approach the problem gradually and incrementally.
Also, after some initial looks at the code, some of it seems very difficult to go back and write tests for. Is there anyone who has done this and has some words of wisdom to offer?
That's very often the case. Unit testing improves the design of code, period. You realize this once you start comparing the design of code written with and without unit tests.
Now this brings up an additional problem. Do you make the investment to redesign the legacy code while you're adding tests? It may or may not make sense to do that depending on your project. If it's possible to refactor the legacy code gradually, "just to make it easier to test", then that might be the most practical way forward.
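To make that concrete, here's a minimal sketch (with hypothetical names -- `Invoice`, `TaxRateSource` -- invented for illustration) of the kind of small refactoring that makes legacy code testable: pulling a hard-coded dependency out behind an interface so a test can substitute a stub.

```java
// Hypothetical example: imagine a legacy Invoice class that fetched
// its tax rate from a live service, so no test could run without the
// network. Extracting the lookup behind an interface gives the test
// a "seam" to inject a stub through.
interface TaxRateSource {
    double rateFor(String region);
}

class Invoice {
    private final TaxRateSource taxRates;

    // The dependency is now passed in rather than hard-coded.
    Invoice(TaxRateSource taxRates) {
        this.taxRates = taxRates;
    }

    double totalWithTax(double subtotal, String region) {
        return subtotal * (1.0 + taxRates.rateFor(region));
    }
}

public class InvoiceDemo {
    public static void main(String[] args) {
        // In a test, a stub stands in for the real service:
        TaxRateSource stub = region -> 0.10; // flat 10% for the test
        Invoice invoice = new Invoice(stub);
        System.out.println(invoice.totalWithTax(100.0, "US"));
    }
}
```

Production code would construct `Invoice` with the real service; nothing else about the legacy logic has to change up front, which is what keeps the refactoring gradual.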
[ February 18, 2004: Message edited by: Andy Hunt ]