
Employing TDD and unit testing with Waterfall methodologies

One of the things I've encountered lately is large organizations (thousands of developers) trying to get "Agile" while still trying to retain some sort of sanity in the ranks. I do believe that such moves should be made not all at once (unless you're pretty small) but in small incremental steps.
One of the easiest steps almost any organization can take without taking on too much burden or changing existing processes too much is Test Driven Development. You get many of the benefits of unit tests (decoupled interfaces, a safety net for maintenance, thinking through spec requirements) but can still leave yourself using good ol' waterfall to do things. Because most of the changes occur at the individual developer level and not at the project management level, you can slowly train your troops to write good tests, and perhaps even write them before they write the actual production code.
 
One of the things I was asked by such an organization was "How do you enforce such Agile rules as 'Write unit tests for your code'?"
While it may sound a little counter-intuitive (enforcing rules goes against the grain of Agile development practices - the team should want to accomplish these tasks by themselves), we need to realize that with every large organizational change there are "stakeholders" that drive that change from within. A CTO, project managers, developers and others may be single-handedly responsible for somebody up there making the decision to actually change something. Usually, these stakeholders need to be backed up by processes that the organization can easily digest so the change can occur in a way that fits its way of work. Traceability and accountability in waterfall models are what we're talking about here, and we need to realize that in those regards it makes perfect sense to have such demands as being able to oversee from above that people are doing "the right thing".
 
There are several ways to incorporate TDD, and unit testing in particular, into a non-agile process:
  • Specs and design documents contain up-front descriptions of several key unit tests that must pass in order for the spec to be considered complete. For example, a spec for a class in charge of logins can contain an appendix listing 5-20 unit tests with required criteria (asserts) that need to pass. The spec does NOT contain the actual tests, but a short 1-2 sentence description of each test, perhaps required parameters, and the expected outcome. A developer coding to the spec can then use TDD or simple test writing to fulfill the spec and add their own unit tests to cover other sequences of actions not specified in the descriptions (the first sketch after this list shows what such a test might look like).
  • An automated tool can be run on the compiled source code using reflection (or other introspective engines in the language) with specific search criteria for tests. This tool can alert us if there are public classes in the production code that have methods without at least one matching test method in the "Testing" project. For example, for a class named "LoginManager" with a method "IsLoginOk", the tool would look for a class named "LoginManagerTests" that has at least one method beginning with "testIsLoginOk" (the second sketch after this list shows one way such a scan might work). We need to remember that such a tool would only provide us with important information and red flags on tests that might be missing. It does NOT tell us whether the tests are good enough or whether there are enough of them, only whether there are too few. Again - consider this more of a "test coverage" tool to spot potential problems and raise red flags automatically rather than a quality control tool. Large organizations making the move might need this to give the stakeholders something to report on and to provide at least some deterministic numbers on the state of unit testing adoption in the organization. Again - if the developers truly don't want to do testing they'll simply write such methods with no implementation. So the motivation to do testing should still be there; this is just another way to make upper management more relaxed by keeping things at least somewhat "visible".
  • Coding guidelines that contain specific instructions on the use of TDD methodologies. That means all developers must explicitly follow a common way of doing things, which again might not be the best way to do this, but it's still better than not doing it at all. At least this way unit tests and TDD gain some sort of recognition, and slowly people will adapt to doing them correctly. This is more of a "foot in the door" sort of way and will probably fail if the developers don't have proper training and an understanding of why this is a good way of doing things (at least, it's better than the old way).
  • Have a group of skilled "code warriors" who have undergone extensive training in the method you'd like to promote, and have them spread the word in the organization. These are your code gurus, your team leaders, your highly motivated and skilled developers who love to try out new things. They'll be the best ambassadors of all to do this.
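To make the first option a bit more concrete, here is a minimal sketch of what a developer might write from a spec appendix entry such as "IsLoginOk returns false for an unknown user". The LoginManager class, its isLoginOk method and the hard-coded credentials are assumptions made purely for illustration; the test class uses the classic JUnit "test" prefix naming so it matches the convention from the second bullet.

    import junit.framework.TestCase;

    // Hypothetical production class, assumed only for this example.
    class LoginManager {
        public boolean isLoginOk(String user, String password) {
            // A real implementation would check a user store; a single
            // hard-coded user keeps the sketch self-contained.
            return "admin".equals(user) && "secret".equals(password);
        }
    }

    // Tests written from the one-line descriptions in the spec appendix.
    public class LoginManagerTests extends TestCase {

        // Spec description: "IsLoginOk returns false for an unknown user."
        public void testIsLoginOkUnknownUserReturnsFalse() {
            LoginManager manager = new LoginManager();
            assertFalse(manager.isLoginOk("nobody", "secret"));
        }

        // Spec description: "IsLoginOk returns true for a known user with the correct password."
        public void testIsLoginOkKnownUserReturnsTrue() {
            LoginManager manager = new LoginManager();
            assertTrue(manager.isLoginOk("admin", "secret"));
        }
    }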
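And here is a rough sketch of the kind of "red flag" scanner described in the second bullet, written in Java for illustration (a .NET tool would use System.Reflection in the same spirit). It assumes the naming convention from the post - production class Foo, test class FooTests in the same package, test methods prefixed with "test" plus the production method name - and it checks a single class; a real tool would walk every public class in the compiled output.

    import java.lang.reflect.Method;
    import java.lang.reflect.Modifier;

    public class MissingTestScanner {

        // Prints a red flag for every public method of productionClass that has
        // no matching "test..." method in the corresponding "...Tests" class.
        public static void report(Class<?> productionClass) {
            Class<?> testClass;
            try {
                // Assumes the test class lives in the same package as the production class.
                testClass = Class.forName(productionClass.getName() + "Tests");
            } catch (ClassNotFoundException e) {
                System.out.println("RED FLAG: no test class found for " + productionClass.getName());
                return;
            }

            for (Method method : productionClass.getDeclaredMethods()) {
                if (!Modifier.isPublic(method.getModifiers())) {
                    continue; // only public methods are part of the contract we check
                }
                String name = method.getName();
                String expectedPrefix = "test" + Character.toUpperCase(name.charAt(0)) + name.substring(1);

                boolean covered = false;
                for (Method testMethod : testClass.getDeclaredMethods()) {
                    if (testMethod.getName().startsWith(expectedPrefix)) {
                        covered = true;
                        break;
                    }
                }
                if (!covered) {
                    System.out.println("RED FLAG: no test starting with " + expectedPrefix
                            + " for " + productionClass.getName() + "." + name);
                }
            }
        }

        public static void main(String[] args) throws Exception {
            // A real tool would enumerate every public class in the build output;
            // here we take a single fully qualified class name on the command line.
            report(Class.forName(args[0]));
        }
    }

Run against the LoginManager sketch above, it would stay quiet, since testIsLoginOkUnknownUserReturnsFalse starts with the expected "testIsLoginOk" prefix.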
I'm finishing this with the feeling that I'm leaving some things out. I'll update as needed.
