Fog Creek Software
Discussion Board




QA in an organization with no QA dept

Any ideas?  We are a software organization with no QA department, and we aren't getting one anytime soon.  (Don't ask)...

Are there any best practices for this kind of organization?  We develop .NET n-tier software which is accessed via the web.

Building test cases before building the code, à la XP, just doesn't seem logical to me.  I'm really not interested in theories people come up with in random books - what are developers on extremely limited timeframes ACTUALLY doing day to day to improve their software quality?  It's not that I'm against books, but many books assume you have the ability to design an organization from scratch.  Unrealistic IMO, especially given the current economy.

Another related question:

How do you judge the quality of the software you ship?  What techniques do you use?  I'm not talking about guesses....

Software Developer
Tuesday, November 12, 2002

Hey, that sounds like us!

What we did was get our support group to QA for us.  Between phone calls, they test the latest stuff.

This, of course, fails badly.  Using the law of 'less pain' (a person will take the path that causes the least amount of pain), we could have foreseen that internal betas would leak out to customers if those betas fixed customers' problems - never mind whether they had been QA'd properly.  We even started putting (BETA DO NOT SHIP) labels on the splash screens, and customers still received them!

What's next?  I say get a real QA guy in.  Make sure that he/she has the backbone and the mandate to be tough with releases.  No releasing before it's QA'd, period.

Gregor

Gregor Brandt
Tuesday, November 12, 2002

The only way I know of to judge quality of anything is to have some kind of spec and some kind of tests, and make sure the tests cover everything that's in the spec.
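If you tag spec items with IDs (say, SPEC-12) and mention those IDs in test names or comments - a convention you'd have to pick yourself, nothing standard - a tiny script can flag the spec items no test touches.  A rough Python sketch:

    import re
    import sys
    from pathlib import Path

    # Assumed convention: the spec carries IDs like SPEC-12 and every test
    # mentions the IDs it covers.  The file layout and ID format are both
    # invented for illustration.
    SPEC_ID = re.compile(r"SPEC-\d+")

    spec_ids = set(SPEC_ID.findall(Path("spec.txt").read_text()))

    covered = set()
    for test_file in Path("tests").rglob("*.py"):
        covered |= set(SPEC_ID.findall(test_file.read_text()))

    missing = sorted(spec_ids - covered)
    if missing:
        print("Spec items with no test:", ", ".join(missing))
        sys.exit(1)
    print(f"All {len(spec_ids)} spec items have at least one test.")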

Personally I am happy to make sacrifices to get *good* specs and *good* tests, or even just better ones than I would have had otherwise.  I think it's interesting, and perhaps telling, that on both those subjects there are only evangelists and people who haven't tried it. :)

In an environment where it's not the time to try to force in *good*ness, it's still worth it to get something on paper, even if the "spec" is a narrative description from one of the main programmers, annotated with all the clarifying questions people have had to ask since.  As for the test part, using support techs' time is probably more doable in that kind of environment than using programmers' time.  You could bring in a cheap student intern or something, but then you have to consider whether anyone would listen to them.  If you don't respect your testing department, it's a waste of time to have one.

Mikayla
Tuesday, November 12, 2002

Nontechnical management has just alienated our last tester, so we're in that situation too.  Generally you want to decrease the number of errors developers make before they check code in, plus do whatever piddly testing you can.

Try out unit tests before you deem them illogical.  They keep defects under control.  Do it for any new features or classes you write.  We have two subprojects here, one of which uses unit tests.  Guess which one is better documented and easier to repair, just from having simple test cases that double as docs.
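To make the tests-as-docs point concrete, here's a minimal sketch in Python's built-in unittest (a .NET shop would use something like NUnit instead); the PriceCalculator class is invented just for illustration:

    import unittest

    # A tiny invented class, just to show tests doubling as documentation.
    class PriceCalculator:
        def __init__(self, tax_rate):
            self.tax_rate = tax_rate

        def total(self, subtotal):
            if subtotal < 0:
                raise ValueError("subtotal cannot be negative")
            return round(subtotal * (1 + self.tax_rate), 2)

    class PriceCalculatorTests(unittest.TestCase):
        # Each test name states a fact about the class - that's the docs part.

        def test_total_applies_the_tax_rate(self):
            self.assertEqual(PriceCalculator(0.10).total(100.00), 110.00)

        def test_zero_subtotal_is_allowed(self):
            self.assertEqual(PriceCalculator(0.10).total(0), 0)

        def test_negative_subtotal_is_rejected(self):
            with self.assertRaises(ValueError):
                PriceCalculator(0.10).total(-1)

    if __name__ == "__main__":
        unittest.main()

Reading the test names top to bottom tells a new developer exactly what the class promises - that's the documentation effect.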

At least do code reviews.  Look at other people's check-ins.  CS people complain about how IT people just dump 'n test.  Well, you can't test, so you'd better start thinking like a CS person.

Get someone to at least test the main user paths.  Otherwise your customers will be testers for raw code.  And they will h8 u.
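The cheapest automated version of that is a smoke script that hits each main path and checks for a marker string.  A Python sketch - the base URL, paths, and markers are placeholders for your app's real ones:

    import sys
    import urllib.request

    BASE = "http://localhost:8080"      # placeholder
    SMOKE_PATHS = {
        "/login":    "Sign in",         # sign-in form should render
        "/products": "Product list",    # catalogue should load
        "/cart":     "Your cart",       # cart page should load
    }

    failures = 0
    for path, marker in SMOKE_PATHS.items():
        try:
            with urllib.request.urlopen(BASE + path, timeout=10) as resp:
                body = resp.read().decode("utf-8", errors="replace")
                if marker not in body:
                    print(f"FAIL {path}: expected marker {marker!r} missing")
                    failures += 1
                else:
                    print(f"ok   {path}")
        except OSError as exc:
            print(f"FAIL {path}: {exc}")
            failures += 1

    sys.exit(1 if failures else 0)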

Still, I believe we're dead in the water without testers.  It's a fact that there are bugs that can't be found a priori.  "Even perfect code has bugs."  Turn one of your programmers into a tester who writes automated tests and support tools.  I'm trying to do that now.

Tj
Tuesday, November 12, 2002

BTW, what I wrote mainly applies to shrinkwrapped software that runs on multiple platforms, which my product does.  Obviously my advice will need modification for your environment - the more different it is, the more modification it will need.

Tj
Tuesday, November 12, 2002

Thanks guys.  I don't feel so alone now ;)

So surely there are other organizations out there without QA departments.  What do you do to ensure product quality?

Software Developer
Tuesday, November 12, 2002

We did have QA, but a lot of the time some really great testers are in Sales, because they often don't sweat the technical details... either it works or it doesn't.  And a lot of the time they need to look at the product to create demo scripts anyway.  The only problem is you might get serious feature creep if you're not careful.

Daniel Shchyokin
Tuesday, November 12, 2002

I have pretty much that situation.  In actual fact my company does claim to test, but I've seen no evidence of it.

The simple solution is to code belt-and-braces style.  While developing, turn all debug options on and crank compiler warning levels up to maximum.  Turn on bounds checking and any other available checking.  Add asserts in all the obvious no-go areas - although in a release build, the code should be able to recover if those parts are reached.  Run the code through sanity/beauty checkers like lint.
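My real code is C for an embedded target, but the assert-yet-recover idea looks roughly like this Python sketch (running with python -O strips the asserts, which stands in for the debug/release distinction; the device table is invented):

    FALLBACK_BAUD = 9600  # invented fallback, purely for illustration

    def baud_rate_for(device, table):
        # Debug build: reaching here with an unknown device is a programming
        # error, so fail loudly right where it happens.
        assert device in table, f"unknown device: {device!r}"
        # Release build (-O): the assert is compiled out and the code
        # recovers instead of crashing in the field.
        return table.get(device, FALLBACK_BAUD)

    print(baud_rate_for("modem", {"modem": 57600}))  # 57600 either way
    # baud_rate_for("printer", {"modem": 57600}) trips the assert under
    # plain `python`, but returns 9600 under `python -O`.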

Since I am writing for an embedded environment, I created my own little 'emulator': I wrote a dummy library for the curses-like functions that the gun used and plugged it into real curses libraries, so I now have an executable I can debug properly on both Windows and Linux.  I use a tool called Valgrind on Linux to check for uninitialized variables.

Then there's my own testing: test every path you can think of to make sure that the code does not break.

Did I forget to mention the most important thing?  Keep it simple - write simple, easy-to-read code.

A combination of writing simple code and running the program with every single debug tool you can find enabled generally helps find the bugs earlier rather than later.  By implicitly adding time for this kind of testing to your schedule estimates, you will find the time to do all this stuff, and by finding the bugs early you'll have more time for development rather than debugging.

Joe
Tuesday, November 12, 2002

Depends on your goals; I presume your sole focus is getting the work out the door (if your company were really interested in improving its process, it would have a QA group in place).

Four small words: Au To Ma Tion.

I normally use Perl and regular expressions, grepping the source code for possible problems - expressions known to be suspect.  Each time a new one appears (just run diff on today's report and yesterday's), verify it by inspection, then bless the new report.

If you are running with warnings maxed and keeping everything lint-clean, the above may lose much of its oomph, but it's proved useful to me even in those circumstances - the tools are general, but the automation scripts are tuned to the errors made by your developers.
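In Python rather than my usual Perl, the loop looks roughly like this sketch - the suspect patterns, source layout, and report file names are purely illustrative:

    import re
    import sys
    from pathlib import Path

    # Illustrative patterns - tune these to the mistakes your developers make.
    SUSPECT = [
        re.compile(r"catch\s*\(\s*Exception\s*\)"),  # swallow-everything catch
        re.compile(r"\bTODO\b|\bFIXME\b"),
    ]

    hits = []
    for f in sorted(Path("src").rglob("*.cs")):      # assumed source layout
        for n, line in enumerate(f.read_text(errors="replace").splitlines(), 1):
            if any(p.search(line) for p in SUSPECT):
                hits.append(f"{f}:{n}: {line.strip()}")

    Path("suspect.today").write_text("\n".join(hits) + "\n")

    blessed = Path("suspect.blessed")
    old = set(blessed.read_text().splitlines()) if blessed.exists() else set()
    new = [h for h in hits if h not in old]

    if new:
        print("New suspect lines - inspect, then bless with:"
              " cp suspect.today suspect.blessed")
        print("\n".join(new))
        sys.exit(1)
    print("No new suspect lines.")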

The second battery of tests to set up is regressions.  The same general pattern works here - send a query to the system, get the response, and dump it.  Compare the dump to the previous version, again using regexes to mask away dynamic bits as necessary.  Verify that the expected changes appear, and bless.  You should also validate the dumps - there are many tools available that will check your HTML, or XML, or whatever, to ensure that it is well formed.
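And the regression pattern, same caveats - the query URL, the masking regexes, and the golden-file names are all assumptions:

    import re
    import sys
    import urllib.request
    from pathlib import Path

    URL = "http://localhost:8080/report?customer=42"  # placeholder query
    GOLDEN = Path("report.golden.html")

    # Mask dynamic bits so they don't cause spurious diffs - illustrative.
    MASKS = [
        (re.compile(r"\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}"), "<TIMESTAMP>"),
        (re.compile(r"session=[0-9a-f]+"), "session=<ID>"),
    ]

    with urllib.request.urlopen(URL, timeout=10) as resp:
        body = resp.read().decode("utf-8", errors="replace")
    for pattern, replacement in MASKS:
        body = pattern.sub(replacement, body)

    if not GOLDEN.exists():
        GOLDEN.write_text(body)  # first run: bless the initial dump
        print("Blessed initial dump.")
    elif body != GOLDEN.read_text():
        Path("report.new.html").write_text(body)
        print("Output changed - inspect report.new.html; if the change is")
        print(f"expected, bless it: cp report.new.html {GOLDEN}")
        sys.exit(1)
    else:
        print("Regression passed.")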

Danil
Tuesday, November 12, 2002

I'm a professional QA tester and would love to help.  Email me at MercREV69@aol.com and let me know more specifics about what you're looking for.  I work for an organization which FINALLY put together a QA department, but unfortunately we are WAY understaffed - I think we have 10 or more developers per tester.  I'm trying to get my programmers to implement more testing procedures on their end (unit tests).  Would love to chat!  Thanks.

Jennifer Keys
Tuesday, November 12, 2002
