Fog Creek Software
Discussion Board

How to automate testing of non-class code?


I recently read the Pragmatic Programmer (excellent book!) and am starting to automate more of my testing.

I've seen posts on vbUnit and JUnit for UNIT TESTING. However, my understanding is that these are for testing COM interfaces (class interfaces). Therefore, they couldn't test "normal" procedures and functions.

Now, most (90%+) of my code is non-class. Just plain old functions and procedures.

Do most people automate testing of every procedure and function? If so, any suggestions on this?

I'm thinking I'd have a test harness (really just a form in my project) and a test function for every procedure or function. Ex: for Sub Foo() I'd have Test_Foo(), which tests Sub Foo(). Then the test harness would call all of these test functions.
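For illustration only (the thread is presumably about VB, but the idea is language-independent), the homemade harness described above might be sketched in Python like this; `foo` and `test_foo` are hypothetical stand-ins for the poster's procedures:

```python
# A minimal homemade test harness: one test_* function per plain
# function, plus a loop that finds and runs them all.

def foo(x):
    """Hypothetical plain (non-class) function under test."""
    return x * 2

def test_foo():
    assert foo(3) == 6, "foo(3) should be 6"

def run_all_tests():
    """Call every module-level function named test_*; return failure count."""
    failures = 0
    for name, fn in list(globals().items()):
        if name.startswith("test_") and callable(fn):
            try:
                fn()
                print(name, "PASS")
            except AssertionError as e:
                failures += 1
                print(name, "FAIL:", e)
    return failures

if __name__ == "__main__":
    run_all_tests()
```

This is essentially what the *Unit frameworks do for you, minus the conveniences they add: automatic test discovery, setup/teardown hooks, and reporting.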

This seems straightforward, but clearly vbUnit, JUnit, etc. do more than that; otherwise you wouldn't need the mysterious *Unit frameworks for testing.

1.  How do people write automated tests for procedures/functions?
2.  Any suggestions to make that testing easier?


Wednesday, November 5, 2003

> Therefore, they couldn't test  "normal" procedures and functions.

Have you tested this theory?

Thursday, November 6, 2003

Most unit testing frameworks can test whatever you want. If you can call it from code, you can test it, and the framework will help automate it for you. A typical unit test will exercise your public interface -- how others will use your class/procedure -- rather than every internal implementation detail.
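To make the point above concrete, here is a minimal sketch in Python (not the poster's language) of a *Unit-style framework exercising a plain, non-class function; `word_count` is a hypothetical example function, not anything from the thread:

```python
# The function under test is an ordinary function -- no class, no COM
# interface. The framework only needs something it can call from code.
import unittest

def word_count(text):
    """Plain, non-class function under test."""
    return len(text.split())

class WordCountTests(unittest.TestCase):
    # The TestCase class belongs to the framework's plumbing;
    # the code being tested stays procedural.
    def test_empty(self):
        self.assertEqual(word_count(""), 0)

    def test_sentence(self):
        self.assertEqual(word_count("unit tests need no classes"), 5)

if __name__ == "__main__":
    unittest.main(exit=False)
```

The tests target the public behavior of `word_count` (what callers see), not its internals, which is the usual advice for keeping tests from breaking on every refactor.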

Also, the Pragmatic Programmers have just put out a couple of additional books focused on particular topics[1], and one is on unit testing. (The other is on Version Control, a frequent topic in these parts.) You might want to check it out.


Chris Winters
Thursday, November 6, 2003

What I'm going to say is hardly startling, but it is commonly ignored.

Do not start by worrying first about test software or languages or equipment. Start by making sure you have an analytically sound and validated set of requirements for the system. Without this, all else is egregious bullshit.

From valid and unambiguous system-level requirements, decompose the system by sub-system, component, and class to ensure that requirements at each of these levels are verified **and** validated.

It's easier to write automated tests at the unit/class level because it's easier for the one person working on that class to flesh out the requirements for its properties and behaviors. The developer was probably handed the typical shitty set of requirements to begin with. "Knowing" they'd never get anyone to prepare a proper set, they make whatever assumptions and decisions they have to in order to fill in the blanks in the crap they were handed. Everything they decided while coding is still fresh for them, so it's easy for them to write tests.

At the component and system level, however, requirements are rarely any better, and because they're not the responsibility of a single person, too often nobody ever fleshes them out enough to permit meaningful and efficient automated testing. QC/QA people will try, because they, like the developer at the class level, have to fill in the blanks for the system's properties and behaviors before they can write any code to test aggregated behavior.

Automated testing tools do not handle the ambiguity present in most requirement sets. The human brain does, which is why automated testing is so often mis-applied. In fact, trying to write meaningful automated tests at any level forces everything to be disambiguated just because you have to be specific when you write the test code. This is one thing that the XP folks have correct -- write the tests first. Tests and requirements are the same thing, so when XP folks say to write tests first, what they're really saying is to ensure you get the requirements specified, then code to them. Duh. How much more logical can you get?
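As a small illustration of the disambiguation point (a hypothetical Python example, not from the thread): the prose requirement "round to the nearest integer" sounds precise until writing a test forces you to commit to what happens at .5.

```python
# Writing the test forces the ambiguity out of the requirement.
# Python's built-in round() uses banker's rounding (round-half-to-even),
# so round(2.5) == 2, not 3. If the requirement meant "halves away from
# zero," the test is where that finally gets said explicitly.
import math

def round_half_up(x):
    """Round halves away from zero, per the now-explicit requirement."""
    return math.floor(x + 0.5) if x >= 0 else math.ceil(x - 0.5)

# The tests pin down behavior the prose requirement left open:
assert round(2.5) == 2             # built-in: banker's rounding
assert round_half_up(2.5) == 3     # the requirement as we chose to specify it
assert round_half_up(-2.5) == -3
```

Either rounding rule might be the "right" one; the point is that the test cannot be written until somebody decides, which is exactly the tests-as-requirements argument.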

Of course, remember that just because a developer has made assumptions and decisions to 'fill in' the requirements well enough to implement something does not mean that what they implement is valid, even if it is verified. You can generally only determine validity at a lower level of abstraction by knowing what valid behavior at the higher level of abstraction is; otherwise all you've done is correctly solve the wrong set of problems. That normally requires applying some sort of logical model to the situation, as well as the time and resources to do the analysis. All too often, this whole part is left out.

This applies to tester-types writing automated test code for system-level tests -- with shitty requirements, they assume, decide, and implicitly 'complete' the requirements themselves, yet their tests may not be valid, even if they run and pass. (And they may make different assumptions and decisions than the developers did, which is one reason it's so important to keep the QA/QC folks dialed into what's going on during the project, especially early on, when many of those assumptions and decisions get made.) They need to be testing against what the system behavior is **supposed** to be. When the team never decides in common exactly what that means, there's no way to tell for sure whether you're wrong -- or right.

So, to automate tests of non-class code, first make sure you know exactly what it's supposed to do. Once you have truly good requirements at the higher levels, then you can worry about how to write the automated tests and what to write them with. If you can't get the requirements problem fixed in your shop above the class level, you're probably better off sticking with manual, largely ad-hoc testing at the sub-system and system level, lousy and painful as that normally is.

Thursday, November 6, 2003

Oops, sorry -- I reacted too quickly. I just went back and re-read the initial post more closely. My previous comment presumed you were talking about automating tests above the class level, not about lower-level code that simply isn't written in classes.

So I just committed an example of a verified but invalid implementation: I answered the wrong question correctly. Damn. Happens all too easily -- "ready, fire, aim."


Thursday, November 6, 2003


I have not tested this theory. The unit testing programs seem quite complex, and I have BEEN TOLD, by people who have used them, that they test only COM/class interfaces (the documentation states the same).

Therefore, it would take a lot of time to confirm that this is not an efficient thing to do. (Since classes are such a small part of my code, a high-overhead tester would be an inefficient use of my time.)

If someone comes along and says there IS an automated test for ANY old function/sub-procedure, then I'll investigate it. So far, none has been mentioned.

Saturday, November 8, 2003

> would be an inefficient use of my time

Well, you're in the perfect spot, then: surrounded by people who are willing to spend *their* time for some money. Put up some cash, and I bet you'll get all the details you need on how to do it. Or is "Entrepreneur" a misnomer?

Saturday, November 8, 2003

I'm confused.  I'm asking for people's experiences.  I'm not asking them to spend any time (other than answering a question). I've answered many questions on this forum.

Wednesday, November 26, 2003
