Fog Creek Software
Discussion Board

Automated testing...

A small question... What would be a good way to automate testing for these types of apps:

1. GUI testing
2. When the input and output are not standard (e.g., voice input and speech output)
3. Device drivers

Also, any good pointers on developing test cases? I can literally think of hundreds of things that could be tested, but it's obviously not cost-effective to test a hundred combinations, most of which will probably work, especially when using standard controls.

Siddharta Govindaraj
Wednesday, December 18, 2002

The JUnit page - http://www.junit.org/ - has a number of excellent resources on these questions, particularly on what to test. (Of course, it has lots of resources on how to test as well.) Also, you might want to look at mock objects - http://www.mockobjects.com/ - to deal with your funky I/O issues.
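To make the mock object idea concrete, here is a minimal sketch in Python (the greet() function and the say() interface are invented for illustration); the mock stands in for the real text-to-speech engine, so the test needs no audio hardware at all:

    import unittest
    from unittest import mock

    # Hypothetical application code: a greeter that talks through a
    # speech-output dependency instead of writing to stdout.
    def greet(name, speech_out):
        speech_out.say("Hello, %s" % name)

    class GreetTest(unittest.TestCase):
        def test_greet_speaks_the_name(self):
            speaker = mock.Mock()          # stand-in for the speech engine
            greet("world", speaker)
            speaker.say.assert_called_once_with("Hello, world")

    if __name__ == "__main__":
        unittest.main()

The same trick works on the input side: a mock microphone can feed the recognizer canned utterances, so the logic downstream of the device gets tested without any hardware in the loop.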

Chris Winters
Wednesday, December 18, 2002

Regarding your first question, much of that can't be completely tested automatically.  However, you can partially automate it, by at least digitizing the test cases.  See:

http://c2.com/cgi/wiki?PartiallyAutomatedTesting

With GUI apps, it's recommended that the GUI be separated from the logic as much as possible.  If you have an event-driven GUI app, split off the event responses into separate functions, which can be tested automatically.  This will minimize the amount of non-automatic testing that has to be done.
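As a sketch of that split, in Python with invented names: the event handler does nothing but read widgets and delegate, so the logic function underneath is what the automated tests exercise.

    # Pure application logic - no GUI toolkit involved, so it can be
    # unit-tested directly.
    def compute_total(quantity, unit_price):
        return quantity * unit_price

    # Thin GUI shell: the event handler only reads widgets and delegates.
    # Only this part still needs manual (or GUI-automation) testing.
    def on_ok_clicked(quantity_field, price_field, total_label):
        total = compute_total(int(quantity_field.get()),
                              float(price_field.get()))
        total_label.set("%.2f" % total)

    # Automated test for the logic half.
    def test_compute_total():
        assert compute_total(3, 1.5) == 4.5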

Regarding test cases:

Often, one test case can encapsulate several test possibilities.  For example, let's say you have a text box that should only allow an integer of up to six digits.  You can test whether it accepts letters, and whether it enforces the text length, by asking the tester to type "abcdefgh".
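If the validation rule is factored out of the GUI (as above), the same trick carries over to automated tests; a sketch, with an invented validator:

    def is_valid_entry(text):
        # Accept only integers of up to six digits.
        return text.isdigit() and len(text) <= 6

    def test_entry_validation():
        # One input, "abcdefgh", probes two rules at once:
        # non-digit characters and the length limit.
        assert not is_valid_entry("abcdefgh")
        assert is_valid_entry("123456")       # boundary: exactly six digits
        assert not is_valid_entry("1234567")  # one digit too many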

You also may need to prioritize your test cases.  Sometimes, it's not worth running a lot of the obvious tests, especially ones that exercise the behavior of standard OS controls and/or functions.

You might also want to consider having separate sets of test cases, some of which you run more frequently than others.  So, you can exercise everything under the sun every so often, but you can also run smaller sets of tests more often to check for certain kinds of problems.  How you categorize them depends on the application and the user base.
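One way to organize those tiers, sketched with Python's unittest (the test bodies here are placeholders): keep the quick checks and the exhaustive ones in separate suites and choose from the command line.

    import sys
    import unittest

    class SmokeTests(unittest.TestCase):
        # Small, fast checks run on every build.
        def test_app_starts(self):
            self.assertTrue(True)   # placeholder for a real startup check

    class ExhaustiveTests(unittest.TestCase):
        # The "everything under the sun" tier, run nightly or weekly.
        def test_many_combinations(self):
            for q in range(1000):
                self.assertEqual(int(str(q)), q)

    def suite(full=False):
        loader = unittest.TestLoader()
        s = unittest.TestSuite()
        s.addTests(loader.loadTestsFromTestCase(SmokeTests))
        if full:
            s.addTests(loader.loadTestsFromTestCase(ExhaustiveTests))
        return s

    if __name__ == "__main__":
        unittest.TextTestRunner().run(suite("--full" in sys.argv))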

If you're automating your test cases, none of this will be a problem, because computers don't mind performing hundreds of boring test cases. :-)

See also:

http://c2.com/cgi/wiki?TestingPatterns (lots of testing ideas)

Brent P. Newhall
Wednesday, December 18, 2002

3) Device drivers:

This can be automated, and in my experience gets done by the development engineer, not QA.  That might not be ideal to a testing purist, but I think it reflects reality.

1. A device driver is often handling a device based on ITU, ISO, IETF, ... standards.  Whatever the governing body, get the specification.  From there you'll have to create a test suite for exercising the environment in which your device lives.  Often there are third-party test environments, streams, and scenarios that you can buy.

2. Test application.  You need to write some sort of test application that loads the driver and makes system calls into it: open, close, read, write, map, unmap, ioctl.  All of the ioctls that you support need to be tested.  Usually, I have a group of command-line and text-menu-based applications for driver testing (a bare-bones sketch follows this list).  This is only semi-automated.  This is nothing new to device driver writers, as it's usually how debugging is attacked to begin with.

3. Test specification.  Keep a running document (I just use a text editor, nothing fancy) of all the test cases you've created as time goes on.  Keep it in the same directory as the source code.  Otherwise you'll lose track of them as you find new ways for your code to fail.  Things like degenerate cases (non-compliant signals and situations) need to be handled without severe failures.

4. Interop testing.  This is usually not automated, but it is necessary to get to market.  No matter how well you test against the specification, you want to test against your partners' (and/or potential partners') hardware.  Get them into your lab, or get into their lab, as soon as possible during an integration phase.  Better to figure out problems early rather than in the field.
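To make the test application in point 2 concrete, here is a bare-bones semi-automated harness in Python on Linux; the device node and the ioctl request code are invented placeholders, so substitute your driver's real node and ioctl numbers.

    import fcntl
    import os

    DEVICE = "/dev/mydriver0"    # hypothetical device node
    MYDRV_IOCTL_RESET = 0x4D01   # hypothetical request code

    def exercise_driver():
        # Open/close, read/write, and every supported ioctl should
        # each get at least one call like these.
        fd = os.open(DEVICE, os.O_RDWR)
        try:
            os.write(fd, b"test pattern")
            os.lseek(fd, 0, os.SEEK_SET)
            print("read back: %r" % os.read(fd, 12))
            fcntl.ioctl(fd, MYDRV_IOCTL_RESET, 0)
            print("reset ioctl OK")
        finally:
            os.close(fd)

    if __name__ == "__main__":
        exercise_driver()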

Since device drivers are driven by system calls, creating automated test suites and harnesses (applications) is straightforward, though often tedious.  It's often part of the development process, because it's much more difficult to debug a device driver once it's part of a larger body of code.

Nat Ersoz
Wednesday, December 18, 2002

Interesting. I'm going through all the links now. Thanks for all the info.

Siddharta Govindaraj
Wednesday, December 18, 2002
