Fog Creek Software
Discussion Board




Encouraging Testable Code

After being a developer for several years, I recently started a new job as a tester on a .NET project that is just getting started.  Since I'm in on this project from the very beginning, I think I've got a wonderful opportunity to encourage more test-oriented development than I would otherwise get if I were brought in at the end.  In an interesting twist of circumstance, I actually have more real-world .NET experience than the developers (and they're not yet object-oriented guys) so I may be able to use the mentor role as additional influence.

Although this seems like an ideal setup for a tester, I don't want to blow my chances by either making unconvincing cases for good development practices, or picking the wrong battles.

I'm considering focusing my educational efforts on design-by-contract using asserts, and on exception handling, since I think those will be the most helpful for testing.  But if they skimp on object-oriented design, that probably won't help as much as it could.
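A minimal sketch of the kind of contract checks meant here (Python used for illustration, since no project code is shown; the function and its invariants are invented):

```python
def withdraw(balance, amount):
    """Withdraw `amount` from `balance`, returning the new balance."""
    # Preconditions: the caller must supply sane inputs.
    assert amount > 0, "precondition: amount must be positive"
    assert amount <= balance, "precondition: cannot overdraw"

    new_balance = balance - amount

    # Postcondition: the result must be consistent with the inputs.
    assert new_balance >= 0, "postcondition: balance went negative"
    return new_balance
```

The asserts document the contract in the code itself, and they turn a subtle downstream corruption into an immediate, localized failure that a tester can report precisely.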

I'll most likely have to influence programming style from afar, since I don't think I'll have much opportunity to show by example as a tester, at least not at the start.  Any thoughts on what I can realistically accomplish, suggestions for increasing the odds of success, and ideas about what I should push for during the design phase would be greatly appreciated.

ODN
Wednesday, April 09, 2003

Why not jump aboard the development team, since you've got more experience and it's greenfield development?  Then hire somebody who is purely test-focused, and be a team leader across both areas.

Realist
Wednesday, April 09, 2003

The project manager wouldn't like that.  He wants my input on the design phase, but that's it.  Besides, I'm already mentally committed to being the best tester I can be.  I really want to do this right.  I figure it'll make me a better developer.

One other thing I'm definitely going to do is encourage unit testing by the developers as they're writing the classes.  Sure, I'll be using the testing framework too, but wearing a different hat.  They'll be validating that it works, while I'll be actively trying to break it... two very different categories of unit tests.
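Those two hats can share one framework while writing very different tests. A sketch of the split (Python's unittest standing in for a .NET framework; the parse_age function is hypothetical):

```python
import unittest

def parse_age(text):
    """Hypothetical function under test: parse a non-negative age."""
    value = int(text)
    if value < 0:
        raise ValueError("age cannot be negative")
    return value

class DeveloperTests(unittest.TestCase):
    """The developer validates that it works."""
    def test_valid_age(self):
        self.assertEqual(parse_age("42"), 42)

class TesterTests(unittest.TestCase):
    """The tester actively tries to break it."""
    def test_negative_age_rejected(self):
        with self.assertRaises(ValueError):
            parse_age("-1")

    def test_garbage_rejected(self):
        with self.assertRaises(ValueError):
            parse_age("forty-two")
```

Same harness, opposite mindsets: one suite confirms the intended path, the other probes the inputs nobody intended.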

ODN
Wednesday, April 09, 2003

The second paragraph of your last comment is the key.

Achieve that and you'll have achieved the virtuous circle. 

Simon Lucy
Wednesday, April 09, 2003


If you can get them to write unit & integration tests using a testing framework like cppUnit, you're on your way.

As a "tester", then, you just extend the integration tests a bit and do system/gui testing.

It sounds like you might be in pretty decent shape.  If the coders are writing true automated unit tests, that's a great first step.  Good luck!

Matt H.
Wednesday, April 09, 2003

If you can get access to tools that will run the project's unit tests and tell you what percent of the application code is covered by them, that might be useful.

"Write a bunch of unit tests" isn't a concrete goal. "Write enough unit tests to cover 50% of the code" (up from 25% when you start) is a goal against which the team can track progress and feel like they've accomplished something when they're done.

Beth Linker
Wednesday, April 09, 2003

I wouldn't spend energy on test coverage. I would say that as long as there is proof of broken code, the developer should be (self-)encouraged to add a new unit test to locate the break, then fix the code.

Those tests would be added to the existing unit tests, thus increasing code coverage. When you, the tester, can't break any more code by adding more functional tests, there's no need for more tests and the code coverage is large enough.
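That workflow, reproduce the break with a test before fixing it, might look like this (Python sketch; the function and the bug are invented for illustration):

```python
# Suppose the tester reports: "average() crashes on an empty list."
def average(numbers):
    # The fix: handle the empty case the crash report exposed.
    if not numbers:
        return 0.0
    return sum(numbers) / len(numbers)

# The regression test is written first, fails against the old code,
# and is kept forever so the same break cannot silently return.
def test_average_of_empty_list_regression():
    assert average([]) == 0.0

def test_average_still_works_for_normal_input():
    assert average([2, 4, 6]) == 4.0
```

Each bug report leaves a permanent test behind, so coverage grows exactly where the code has actually proven fragile.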

Thomas Eyde
Wednesday, April 09, 2003

Actually, having a code coverage target is a very good thing. Just because QA clicks on buttons in the right order to execute a safe code path, doesn't mean the user will do the same. Users find ways to do things you don't expect. If it's worth coding extra paths to handle special conditions, then it's worth testing them. Like anything else, setting a target is a good way to gauge progress.

It's also worth noting that there are two ways to increase code coverage: writing more unit tests, and deleting dead code. Dead code rots over time, and makes the codebase as a whole more difficult to maintain. Removing it is a perfectly reasonable way to improve coverage.

Jeremy
Wednesday, April 09, 2003

Developers are busy and lazy. They often won't want to write their own unit tests. When I worked in QA, we tried to "close the loop" and give some pre-checkin "smoke tests" to the developers. If you give them smoke tests that don't require much time or thought on their part, honest developers won't mind running your automated smoke tests before checking in new code.

This will save both you and them lots of time. They won't be giving you steaming piles of broken code to test. You won't be bugging them all the time to fix the broken tests.

I use the term smoke tests to distinguish these basic, automated tests from the more rigorous tests that unit tests should be. The smoke tests should touch some big pieces of code and see if anything is majorly broken and "smoking".
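A pre-checkin smoke test along those lines might look like this (Python sketch; the App class is a stand-in for whatever the application exposes):

```python
class App:
    """Stand-in for the application under test."""
    def start(self):
        self.running = True

    def load_sample_document(self):
        return {"title": "sample"}

    def shutdown(self):
        self.running = False

def smoke_test():
    """Touch the big pieces quickly and shallowly.

    The only question is "does anything obviously smoke?", not
    "is every edge case correct?" -- that's what unit tests are for.
    """
    app = App()
    app.start()
    assert app.running, "SMOKE: app failed to start"
    doc = app.load_sample_document()
    assert doc.get("title"), "SMOKE: could not load a document"
    app.shutdown()
    assert not app.running, "SMOKE: app failed to shut down"
    return "OK"
```

Because it runs in seconds and demands no thought from the developer, there's little excuse not to run it before every checkin.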

runtime
Wednesday, April 09, 2003

I second Realist. Marvelous and important as testing is, there's something odd going on if the project manager explicitly stops you doing development when you have more .Net experience than the others. I'm presuming you are a good developer.


Wednesday, April 09, 2003

Yes, we are busy and lazy. But we also want to be effective. When I say unit tests, I mean programmer's tests used as a code design tool. They are very effective. QA's tests would then be functional tests. They could use the same unit test framework, but they don't have to. See http://www.xprogramming.com/xpmag/acsIndex.htm for an adventure in C#. Nice reading and a good tutorial.

Thomas Eyde
Wednesday, April 09, 2003

I always write a test module into my application that can be executed via a command-line parameter to self-test the application. Basically, it runs a heap of pre-coded unit tests, firing a range of parameters at procedures and expecting certain results, and any coded unit tests get added to the application itself. I think it's a great idea, and I lost count of how many times it pointed to issues instantly.

I started doing this about 3 years ago, and I think I got the idea from Fowler's refactoring book.
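The self-test-switch pattern might be sketched like this (Python for illustration; the --selftest flag, the procedure, and the parameter/result table are all invented):

```python
def add(a, b):
    # Stand-in for a real procedure in the application.
    return a + b

def run_self_tests():
    """Fire a range of parameters at procedures, expecting results."""
    failures = []
    for args, expected in [((2, 3), 5), ((0, 0), 0), ((-1, 1), 0)]:
        got = add(*args)
        if got != expected:
            failures.append((args, expected, got))
    return failures

def main(argv):
    """Entry point; pass sys.argv[1:]."""
    if "--selftest" in argv:
        failures = run_self_tests()
        print("self-test:", "PASS" if not failures else failures)
        return 0 if not failures else 1
    print("normal application run")
    return 0
```

In the real application this would be wired to the command line with `sys.exit(main(sys.argv[1:]))` under a `__main__` guard, so anyone with the shipped binary can make it vouch for itself.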

Realist
Wednesday, April 09, 2003

Realist, same here.  Only I do it out of necessity.  Can you imagine performing integration with other developers' code without running your own unit tests and test apps first?  Life is hard enough...  it's just necessity.

Nat Ersoz
Thursday, April 10, 2003

Whoops, no spite intended in my last post.  After re-reading it, it has an edge that wasn't intended.  Sorry.

Nat Ersoz
Thursday, April 10, 2003

No worries Nat, can you remember where you got the idea from? I wouldn't mind reading some of that stuff again.

Realist
Thursday, April 10, 2003
