I was consulting on a project today, helping the team figure out how to test a method inside their business logic. Here's the thing - it was one of the least testable methods I've seen. It took me about 4-5 hours to refactor the code to make it testable, and in the end it still looked pretty horrible.
We ran the tests and saw them all pass, but it didn't feel good. This was legacy code in all its glory: hard to test, hard to follow, and with dependencies sprinkled everywhere.
I ended up making a couple of methods on the class static, making them take lots of parameters, and sending in lots of stubs (we used Rhino.Mocks). It worked, but we all felt kind of dirty in the end.
The thing that kept coming up in my thoughts (and words) was how easily TypeMock could have solved the problems in this scenario. It would have taken less than 15 minutes to figure out how to test that thing with TypeMock. And yes, I wholeheartedly agree that using a powerful tool such as TypeMock can help you 'forget' to do proper design, but in the end it's about using the right tool for the right job.
For messy legacy situations, TypeMock beats any other mocking framework I know hands down, because it lets you simply mess with the internals of a system without changing its code base. That's a powerful thing for legacy scenarios, and as such it is a perfect fit. I'll be going back there next week and plan on showing them some of this tool's wonders.
Still, here are some thoughts about what they can do next time they write code, to make sure it's more testable:
- Don't inherit from classes you don't control. Encapsulate and wrap them instead. They inherited from a lot of ESRI stuff, and it was impossible to instantiate many of the classes involved because their constructors required problematic dependencies. Had they wrapped those dependencies instead of deriving from them, it would have been much easier to use "Extract & Override" on the code to make it testable.
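To make the wrapping idea concrete, here is a minimal sketch in Java (the same shape works in C#). `ThirdPartyMapLayer` is a made-up stand-in for an ESRI-style class you can't construct in a test; the names `MapLayer`, `MapLayerWrapper`, and `LayerReport` are all illustrative, not from the project.

```java
// Hypothetical stand-in for a third-party class (like the ESRI ones)
// whose constructor requires a dependency you can't create in a test.
class ThirdPartyMapLayer {
    ThirdPartyMapLayer(Object licenseServer) {
        if (licenseServer == null)
            throw new IllegalStateException("needs a real license server");
    }
    String layerName() { return "roads"; }
}

// Instead of inheriting from the third-party class, hide it behind
// an interface you own.
interface MapLayer {
    String layerName();
}

class MapLayerWrapper implements MapLayer {
    private final ThirdPartyMapLayer inner;
    MapLayerWrapper(ThirdPartyMapLayer inner) { this.inner = inner; }
    public String layerName() { return inner.layerName(); }
}

// Business logic depends only on the interface...
class LayerReport {
    String describe(MapLayer layer) { return "Layer: " + layer.layerName(); }
}

public class WrapDemo {
    public static void main(String[] args) {
        // ...so a test can pass a trivial fake, with no third-party setup.
        MapLayer fake = () -> "fake-layer";
        System.out.println(new LayerReport().describe(fake)); // prints Layer: fake-layer
    }
}
```

The point is that the test never has to satisfy the third-party constructor at all; only `MapLayerWrapper` ever touches the real class.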
- Make methods virtual by default. The Java way. That makes it easy for tools such as Rhino.Mocks to create stubs and mocks from your real classes by overriding those virtual methods, and it makes the code much easier to test because you can override any method that does something problematic, like talking to a web service.
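This is also what makes "Extract & Override" work: pull the problematic call into its own overridable method, then override it in a test-only subclass. A minimal Java sketch (where every instance method is already virtual); `PriceChecker` and `fetchPrice` are invented names for illustration:

```java
// Hypothetical business logic whose web-service call has been
// extracted into its own method so a subclass can override it.
class PriceChecker {
    boolean isOnSale(String sku) {
        return fetchPrice(sku) < 10.0;
    }
    // The problematic dependency, isolated behind one virtual method.
    double fetchPrice(String sku) {
        throw new UnsupportedOperationException("would call a web service");
    }
}

public class ExtractOverrideDemo {
    public static void main(String[] args) {
        // A test-only subclass overrides the extracted method
        // with a canned value; no web service is ever touched.
        PriceChecker testable = new PriceChecker() {
            @Override double fetchPrice(String sku) { return 7.5; }
        };
        System.out.println(testable.isOnSale("ABC")); // prints true
    }
}
```

In C# you'd have to mark `fetchPrice` as `virtual` explicitly for this (or for Rhino.Mocks) to work, which is exactly why "virtual by default" matters.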
- Don't use 'sealed' unless you really have to, and even then avoid it unless it's a security issue. Sealed means you can't inherit from that class, and that leads to ugly stuff like what I did today.
- Make Singletons Testable.
- Add a setter for the singleton instance, or make the factory that creates it configurable so you can supply your own instance.
- Make sure the singleton always returns an interface rather than a concrete class. It's easier to stub or mock an interface than a class you'd need to inherit from, and it creates fewer complications down the road.
- Use the internal keyword and the [assembly: InternalsVisibleTo()] attribute to hide those setters from production code while keeping them visible to test code.
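Putting the singleton points together, here is a minimal sketch in Java; the `Clock`/`ClockProvider` names are invented for illustration. The singleton exposes an interface, not the concrete class, and offers a setter that tests can use to swap in a fake:

```java
// The singleton's public face is an interface, not a concrete class.
interface Clock {
    long now();
}

class SystemClock implements Clock {
    public long now() { return System.currentTimeMillis(); }
}

class ClockProvider {
    private static Clock instance = new SystemClock();
    static Clock get() { return instance; }
    // Test-visible setter. In C# you'd mark this 'internal' and expose it
    // to the test assembly with [assembly: InternalsVisibleTo("MyTests")].
    static void set(Clock replacement) { instance = replacement; }
}

public class SingletonDemo {
    public static void main(String[] args) {
        // A test installs a fake clock without touching production code.
        ClockProvider.set(() -> 42L);
        System.out.println(ClockProvider.get().now()); // prints 42
    }
}
```

Because callers only ever see `Clock`, the fake needs no inheritance tricks; any interface implementation (hand-written or from a mocking framework) will do.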
- If possible, create an interface per class. You never know when you're gonna need it.
- If possible, write the code in a test-driven manner. That's the best way to ensure code coverage, to have real tests for the code, and to have the code testable by default.
- Read this article about design for testability
Some general testing tips:
- In Rhino.Mocks, MockRepository.Stub<IMyInterface>() can create an object whose properties can be set and read freely, with no extra line of code. More on this here.
- TypeMock is a lifesaver when it comes to Legacy code.
- Writing interaction tests (using mocks and stubs) will help you (force you, really) to know the code you're testing intimately. Sometimes a little too intimately, actually.
- If you're planning to test the web UI, have a look at Selenium, Fit, and WatiN.
- If you're planning to test database related code, look here.
- Some jobs are better left to the QA department, such as visual testing (that something is drawn correctly, for example), load testing, and sometimes data-driven inputs for tests.