Fog Creek Software
Discussion Board




Do many programmers find writing TEST PLANS hard?



I've had some difficulties with a couple of programmers who don't test their software before sending it to me. (I've developed 18 programs and sell them along with some other third-party programs.)

I'm wondering if writing test plans is something that a lot of programmers either find very difficult or perhaps don't know how to do. Or whether they just find it boring, so they avoid it. Either is understandable.

I've never had any difficulty with that particular task because I've had a lot of experience with troubleshooting in the Navy and as an electrical engineer. I just sort of "know" what needs to be tested. However, it is a challenge to write it all down in a cook-book fashion, where every step is explained. ("Clear out registry settings for the program, located at XYZ", "rename source code directory to catch hard coded references", etc.).
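A couple of those steps could even be scripted instead of performed by hand. Here's a minimal sketch in Python, assuming a hypothetical Windows program whose settings live under an invented registry key and whose source sits in an invented directory (both paths are placeholders, not any real product):

import shutil
import winreg  # Windows-only standard library module

APP_KEY = r"Software\ExampleCo\ExampleApp"  # hypothetical key path

def clear_program_settings():
    """Delete the program's per-user registry key so it starts fresh."""
    try:
        # Note: DeleteKey only succeeds if the key has no subkeys.
        winreg.DeleteKey(winreg.HKEY_CURRENT_USER, APP_KEY)
    except FileNotFoundError:
        pass  # already clean

def hide_source_tree(src=r"C:\src\exampleapp"):
    """Rename the source directory to catch hard-coded path references."""
    shutil.move(src, src + ".hidden")

if __name__ == "__main__":
    clear_program_settings()
    hide_source_tree()
    print("Environment reset; now run the program and watch for failures.")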

My suspicion is that testing is very neglected:
No one wants to do it.
Management doesn't want to admit that it's important/valuable, so testers aren't treated well or compensated well.

(I do tech support, so I KNOW first-hand how valuable testing is: 1 minute to fix a problem during debugging, 10 minutes during testing, 100 minutes after the customer sees it.)


Thoughts?

Mr. Analogy
Tuesday, May 18, 2004

Testing after development is the domain of system
tests and most developers won't be involved in
writing these tests.

Testing needs to be done while developing or
it doesn't work. See http://c2.com/cgi/wiki?TestDrivenDevelopment.
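For anyone who hasn't seen the style, here is a minimal sketch of the
test-first loop using Python's standard unittest module (the same xUnit
family as JUnit). The word_count function and its spec are invented for
illustration; the idea is that the test is written first, fails, and
then drives the code:

import unittest

def word_count(text):
    """Count whitespace-separated words; written only to pass the tests."""
    return len(text.split())

class WordCountTest(unittest.TestCase):
    def test_empty_string_has_no_words(self):
        self.assertEqual(word_count(""), 0)

    def test_counts_whitespace_separated_words(self):
        self.assertEqual(word_count("unit tests run first"), 4)

if __name__ == "__main__":
    unittest.main()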

son of parnas
Tuesday, May 18, 2004

"Testing needs to be done while developing or
it doesn't work"

Son..

Are you saying that testing needs to be done during development, but currently it is not?

(If so, I highly agree).

The pace is too fast near the end of the schedule. Tests should be written earlier and executed during development (unit testing), and then broad tests (that test a bunch of things at once) run near the end, when (ostensibly) there won't be many errors, so you don't have to worry about a test indicating one of 100 things that could be wrong.

Mr. Analogy
Tuesday, May 18, 2004

All of our "grunt work" tests like:

1. Set server side data set like xxx.
2. Exercise client like this...

These are all automated now.  It has taken 2 (very capable) software engineers 18 months to complete this work, but it is a thing of beauty.
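For the curious, the shape of such a test is something like this minimal Python sketch; the in-process server, the seeded data, and the client call are all invented for illustration:

import json
import threading
import unittest
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

SERVER_DATA = {}  # stand-in for the "server side data set"

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve whatever the test seeded, as JSON.
        body = json.dumps(SERVER_DATA).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep test output quiet

class ClientServerTest(unittest.TestCase):
    def setUp(self):
        # Step 1: set the server-side data set.
        SERVER_DATA.clear()
        SERVER_DATA["account"] = {"id": 42, "balance": 100}
        self.server = HTTPServer(("127.0.0.1", 0), Handler)  # port 0 = any free port
        threading.Thread(target=self.server.serve_forever, daemon=True).start()

    def tearDown(self):
        self.server.shutdown()

    def test_client_sees_seeded_data(self):
        # Step 2: exercise the client like this...
        port = self.server.server_address[1]
        with urllib.request.urlopen(f"http://127.0.0.1:{port}/") as resp:
            data = json.loads(resp.read())
        self.assertEqual(data["account"]["id"], 42)

if __name__ == "__main__":
    unittest.main()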

And yes, developers don't want to be bothered. Developers will document an interface and leave it up to the testers to create a test plan for the interface. I see nothing wrong with that; in fact it seems to make sense to me.

hoser
Tuesday, May 18, 2004

It is hard if you make it hard. It gives me great relief to be able to write test cases for the most critical behaviors of the various programs I have had to write in the past, and I don't find it too difficult to come up with real test cases. It really depends on the domain, though. Most companies nowadays can't afford to perfect software, so there are a lot of gaping holes and a lack of polish in the produced software, which admittedly will create some confusion as to what needs to be tested. By talking to your clients and understanding intimately how they intend to use it, you should be able to build a most-likely-case usage model.

Li-fan Chen
Tuesday, May 18, 2004

>Are you saying that testing needs to be done during development, but currently it is not?

The only way people will test is during active development.
Just like the only way people will document is during
active development. Do things completely when you
are doing them. Anything else isn't consistent
with observed human nature :-)

son of parnas
Tuesday, May 18, 2004

The programmer who writes a test plan against his own software runs a pretty high risk of finding flaws. So why should he?

But a programmer who takes pride in his work or has some incentive to produce high-quality software is (imho) more likely to thoroughly test the software than the programmer who is just getting paid by the hour.

Derek
Tuesday, May 18, 2004

Let's clarify the difference between writing the test plan, writing the test code to the plan, and performing QA (running the tests).

Writing a test plan isn't terribly difficult for 80% of the testcases.  But it is kind of boring, and it is *really* hard to feel motivated to write test plans if you know from prior experience that they are going to be ignored (or merely given lip service).  It can be interesting (read: challenging) if you're writing a test plan for really complex software, or for the 20% of testcases that require some real ingenuity and creativity.  A "bug reward" can also be a motivator -- if you have several people coming up with testcases, offer a reward for whoever uncovers the most bugs with their testcases, whoever writes the most testcases, whoever finds the largest number of showstopper bugs, etc.  (Obviously they have to be valid tests.)

As for not wanting to write test code, it's easy to get attached to the code and want to 'protect' it from those nasty tests that will expose its weaknesses (read: your mistakes) to the world. (Or at least to the team and your management.) It's the "Heisenbug principle": the bugs don't exist until the test is run. If the test is never run, the code must be perfect. Plus, it's usually much more fun and interesting to create something than to try to poke holes in it and see how to break it.

I was once in a group where we flipped the logic: if your code wasn't tested, it wasn't considered trustworthy.  It actually encouraged you to write test code to prove its robustness, because that meant positive recognition from your peers.

Take that FWIW.

Should be working
Tuesday, May 18, 2004

Man, is everybody here but Parnasson living in the stone age and wearing animal skins tailored with flint knives?

http://junit.sourceforge.net/doc/testinfected/testing.htm

http://c2.com/cgi/wiki?TestDrivenDevelopment
http://c2.com/cgi/wiki?TestDrivenDevelopment
http://c2.com/cgi/wiki?TestDrivenDevelopment
http://c2.com/cgi/wiki?TestDrivenDevelopment

Keep up folks, this stuff isn't exactly new.
No wonder we're sending all the work to India.

Tony Randall
Tuesday, May 18, 2004

Tony lives!

Jack Klugman
Tuesday, May 18, 2004

"it is *really* hard to feel motivated to write test plans if you know from prior experience that they are going to be ignored (or merely given lip service)"

You are quite right.


EVERYONE HATES "BUSY WORK"
This sort of bureaucratic rot is endemic in corporations.

How often have you worked on SOMETHING that you KNEW was never really going to help a person (even indirectly):

Products that clearly were just to satisfy someone's ego.
Products that were misguided, but hey, your consulting firm is making money on the project, so they keep quiet.


ALL ROWING TOGETHER

IMAGINE what the economy would be like if most people were working on things that would actually help people (again, even indirectly). Everyone would be more motivated and, instead of yet another web editor, we'd get a Knowledge Base program that was %$# usable.

Imagine if all that freeware stuff out there actually helped the common citizen instead of techies.

(Not meaning to start a flamewar on freeware/open source, but it does seem to be aimed at techies solving techie problems. Just look at all of the CVS software out there. Hey, I'm glad it's there. I use Tortoise and love it. But there's also a lot of freeware that just misses being useful.)

All that effort to please a few thousand people when a world of billions could benefit.

Mr. Analogy
Tuesday, May 18, 2004

It's a lot harder to write software that a common person will want to use or even be able to find. Probably one reason most free stuff is technical...

Phil Larson
Tuesday, May 18, 2004

I really like testing, since it's an opportunity to play defense, as in football: finding holes in the offense.

In fact I'd like to test other peoples' code, given a team that's good, and where it won't lead to psychological problems for the team.

So yeah, I think testing is all in how you look at it.

Tayssir John Gabbour
Tuesday, May 18, 2004

Test plans are great if it is reasonably simple to create automatic regression tests.
Automatic tests are OK with stuff that has well-defined input/output.

(Otherwise you will have to run the test by hand, over and over again, and a developer usually has other things to do, like fixing bugs, for example.)
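When the input/output really is well defined, the regression test can be as simple as a table of recorded cases. A minimal Python sketch, with a hypothetical normalize() function standing in for the unit under test:

import unittest

def normalize(record):
    """Hypothetical unit under test: well-defined input -> output."""
    return {"name": record["name"].strip().lower(),
            "age": int(record["age"])}

GOLDEN_CASES = [
    # (input, expected output) pairs; in practice these could live in a
    # checked-in data file so QA can extend them without touching code.
    ({"name": "  Ada ", "age": "36"}, {"name": "ada", "age": 36}),
    ({"name": "GRACE", "age": "45"}, {"name": "grace", "age": 45}),
]

class RegressionTest(unittest.TestCase):
    def test_golden_cases(self):
        for given, expected in GOLDEN_CASES:
            self.assertEqual(normalize(given), expected)

if __name__ == "__main__":
    unittest.main()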

With GUI stuff it is difficult to have a test plan unless there are automatic tools for regression testing, and writing these tests is usually beyond the scope of a developer.
That's what you have QA for.

The other option is that the GUI developer writes a test plan and passes it over to the QA department.
Now, as it happens, the GUI will change during that time, and you can cycle through again.

I guess writing test plans for GUIs is a bit of a problem.

Michael Moser
Wednesday, May 19, 2004

Ummm. I'm not sure I'm on the same page. If I write code to do X, then certainly I write it to do X, and I'll usually test that it does X, as experience has stripped away my feeling of infallibility.

In my experience I'm told to do x, y, z (and not x, y, z, a, b, c, 1, 2, 3), yet when I'm done someone will test for a, b, c, 1, 2, 3. I usually rejoin with "Why didn't you tell me you wanted that?" and they'll usually say "Did I have to say that?"

On my company wiki I've started a "dev culture" where I store the assumptions so an analyst (using the word loosely) doesn't have to repeat common requirements.

However, there are many apps where neither the original documentation (as bad as it usually is) nor the common culture accounts for things which can go bad. You as the programmer can see that if you don't do X, you will be held accountable, even though it is not in the documentation. I tend to get a bit peeved at that, as it usually indicates people not thinking through the application before pushing it to a programmer. And often when you bring up X, it requires a level of concentration that people may not be able or willing to bring to the subject.

I would prefer the analysts to tell me how they are going to test the application. You code to the test. You test the code, then release it. Ummm, let me think how many times I've been told how an app is going to be tested (other than verbally)... once, and it was wrong.

me
Wednesday, May 19, 2004

"I tend to get a bit peeved at that as it usually indicates people not thinking through the application before pushing it to a programmer. "

Very understandable frustration.  Unfortunately many users have a hard time visualizing the program and mentally "playing with it".

I'm reminded of the scene from "Searching for Bobby Fischer" where the chess instructor wants him to visualize several moves ahead, but the kid can't. So the chess master violently brushes all the pieces off the board and says "now, imagine the board".  I know, it's not a solution, but it illustrates the difficulty.

However, that's the skill that's paramount in a good analyst. 

I'm having the same difficulty right now with designing the objects/patterns for an existing program. It's really hard to get my mind to "hover" at just the right technical level of detail: complex enough to be realistic, but not bogged down in details.

Mr. Analogy
Wednesday, May 19, 2004
