Fog Creek Software
Discussion Board




Automated Tests for Data Extracts


Ok folks, here's the question:

I'm a huge fan of Extreme Programming, but I've had few chances to use it.  I'd like to write an automated test harness for my next script.  The code will look roughly like this:

#----------------------
use DBI;

# connect to the database (DSN, credentials, and output path elided)
my $dbh = DBI->connect($dsn, $user, $pass) or die $DBI::errstr;

my $sth = $dbh->prepare("SELECT CustID, Last_Name, First_Name FROM Customers");
$sth->execute;

# write each customer record to the output file
open my $out, '>', $output_file or die $!;
while (my @row = $sth->fetchrow_array) {
    print $out join("\t", @row), "\n";
}
close $out;
#---------------------

So, how can I automate this?  Do I have to create a "test" environment that has like 16 records in it, run it, then compare the results to a file I have on disk somewhere?

There's got to be a better way to automate the unit test.

Thoughts?

regards,

Matt H.
Wednesday, March 05, 2003

I'm not sure if this is what you need, but take a look at the various unit testing frameworks at:

http://www.xprogramming.com/software.htm

Go Linux Go!
Wednesday, March 05, 2003

I think you're ahead of yourself. :-)

According to XP principles (as I understand them), the developer begins with the customer's statement of desired functionality (the user story).  The user story is then broken down into small programmer tasks.  The tests are derived from the tasks -- and this is important -- before any code is written.

What is the script *supposed* to do?  Are there any restrictions on its operation, such as a maximum run time or the exact format of the output file?  Don't worry about how it will work; worry about what it will do.

Here's what I think you're trying to do:

You're writing a script that extracts all customers' ID, first name, and last name from the database, and writes that information to a file.

I would split this up for easier testing.  Write one function that writes a customer's record to a file.  Your test could run that function and ensure that its output is in the proper format.
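
For example, a rough sketch with Test::More (format_customer_line is just a name I'm making up for the formatting helper; adjust to whatever your script actually calls it):

use strict;
use warnings;
use Test::More tests => 2;

# the helper under test: format one customer record as a tab-separated line
sub format_customer_line {
    my ($cust_id, $last, $first) = @_;
    return join("\t", $cust_id, $last, $first) . "\n";
}

is( format_customer_line(42, 'Doe', 'Jane'),
    "42\tDoe\tJane\n",
    'record comes out as ID, last name, first name' );

like( format_customer_line(7, "O'Brien", 'Pat'),
      qr/^\d+\t[^\t]+\t[^\t]+\n$/,
      'output always has exactly three tab-separated fields' );

No database needed for that one, so it runs in a fraction of a second.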

Then you'll need to test the whole script.  I think the best route is to create a development database (a good idea in any case) that your automated test script populates with known data.  The test then runs the extract and checks the output file to ensure that it has the right number of lines and contains the appropriate data.
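
Something along these lines, maybe (sketch only -- the SQLite DSN and the run_extract() call are placeholders for however you connect and invoke your script, and the Customers table is assumed to already exist in the dev database):

use strict;
use warnings;
use DBI;
use Test::More tests => 2;

# throwaway development database; swap in whatever DSN you actually use
my $dbh = DBI->connect('dbi:SQLite:dbname=test_customers.db', '', '',
                       { RaiseError => 1 });

# start from a known state
$dbh->do('DELETE FROM Customers');
$dbh->do('INSERT INTO Customers (CustID, Last_Name, First_Name) VALUES (?, ?, ?)',
         undef, @$_)
    for ( [1, 'Doe', 'Jane'], [2, 'Smith', 'Bob'] );

# run_extract() stands in for the script under test; assume it writes
# the extract file and returns its path
my $outfile = run_extract($dbh);

open my $fh, '<', $outfile or die $!;
my @lines = <$fh>;

is( scalar @lines, 2, 'one line per customer in the database' );
like( $lines[0], qr/^1\tDoe\tJane$/, 'first customer extracted correctly' );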

Brent P. Newhall
Wednesday, March 05, 2003

>I think you're ahead of yourself. :-)

Obviously!  I'm supposed to "Test FIRST" blah blah blah.

In this case, I've chosen something trivially simple where the inputs, outputs, and data transformations are so clear that it was possible to write the code directly from the business requirements.  (In other words, someone looked at the User Story and said "Hey, this is like, 10 lines of code ...")

Again, how do I test it?

I like the idea of a separate function.  As for the dev environment ... I'll have to look into that.

Other advice?

Matt H.
Wednesday, March 05, 2003

I've never written unit tests for simple accessor functions -- get/set, etc.  They're pretty straightforward.  You end up testing the language implementation, and not your code, as it's just a couple of lines.  If it gets more complex than just a 'return foo;', however, I'll test it.

Now to what you were talking about: I think getting data from a database like that is an accessor function.  Granted, you have to get a db handle, prepare it, and execute it, but that's about it.  You'd be testing your select statement and the libraries that return the data to your script.  Unless you're doing something more complex than get/return, you shouldn't need to test it.  It's a waste of time.

Now, if it's particularly complex SQL, you might want to set up some tests to put in some dummy data, then run that function and make sure it returns the correct value.  Though even that is arguable, imho.

Though I just thought of something: maybe it would be a good idea for a test to just make sure that those columns in the database exist, and that they have the data types you expect.  That way, if anything changes in the database, your tests fail immediately and you know why.  Any thoughts?
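
Something like this, perhaps (sketch; assumes DBI and a $dbh you've already connected -- the trick is preparing a query that returns no rows just to get at the column metadata):

use Test::More tests => 3;

# returns no rows, but the statement handle still exposes the column names
my $sth = $dbh->prepare('SELECT * FROM Customers WHERE 1 = 0');
$sth->execute;

my %columns = map { $_ => 1 } @{ $sth->{NAME_lc} };

ok( $columns{custid},     'CustID column exists' );
ok( $columns{last_name},  'Last_Name column exists' );
ok( $columns{first_name}, 'First_Name column exists' );

$sth->{TYPE} should give you the column type codes as well, if your driver supports it, so you could check data types the same way.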

Andrew Hurst
Wednesday, March 05, 2003

Andrew,

You're heading in the right direction about the database changing.  Many people quickly test their code and say, "I don't need unit tests.  I just tested my code.  It works fine."

It may work fine today, but what about tomorrow, when Chuck the Chucklehead Coder checks in his latest changes?  This is where having a suite of unit tests comes in handy.  Just re-run all of your unit tests to make sure that no other changes have broken your code.

Go Linux Go!
Wednesday, March 05, 2003

Matt - one thing to keep in mind is that the diehards design their solutions with testability in mind.  Thus "how do I test this implementation?" isn't really the right question; instead ask "how do I test this feature?"

That said, what do you think is wrong with the test you described?

Danil
Wednesday, March 05, 2003

GLG,

  So are you saying that the unit tests for accessor-like functions (like our DB example) should just test that the columns, etc., exist?  Or that they should test the full functionality of every function?  I still think it's bad practice to test a function if it's just a straight get or set.

  I know my theory of just testing the columns would fail if Chuck came in and changed which columns that function retrieved.  What might be handy is to make sure the data structure returned has all of the right columns/entries, etc., without testing the data.  That should catch much of what Chuck could do to mess things up.

  Taken to the extreme, testing your db accessor functions means you end up with a test database where all of your test code knows all of the data in it.  With larger databases, this gets unwieldy and very prone to error.  Or you could have your tests insert this data into the database for you, but then you'd end up with more db accessor functions in your tests than in the functions you're trying to test.  Thus you'd be more prone to having a bug in your tests (!) than in your regular code.

  I'm rambling, I know.  But to sum up: I'm not advocating not testing your database code.  I am advocating not testing database code that does the equivalent of a simple get function.

i.e.
# don't test this:
sub getCategories {
  my @rows = db_select("SELECT CATEGORY_NAME FROM CATEGORIES");
  return \@rows;
}

# do test this, but don't test the data returned, test the
# data structure returned, num parameters, etc.
sub getSomethingComplex {
  my ($parm1, $parm2, $parm3, $parm4) = @_;

  my @rows = db_select("SOMETHING COMPLEX WITH MANY PARAMS");

  return \@rows;
}
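
For instance, a structure-only test might be no more than this (the arguments are dummies; we only care about the shape of what comes back, not the data in it):

use strict;
use warnings;
use Test::More tests => 2;

my $result = getSomethingComplex('a', 'b', 'c', 'd');

ok( defined $result,      'function returned something' );
is( ref $result, 'ARRAY', 'result is an array reference' );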

Andrew Hurst
Wednesday, March 05, 2003

I've never used unit tests on returned data, but I have some thoughts to share:

* Use a database dump with well-known data: drop the database, create an empty one, load the data from the dump, run the unit tests (queries), and validate the results.

* The database for unit tests should not be large (unlike a load test) but should contain carefully selected exceptional data, with all boundary conditions covered.  It is easier to track down an error if you know that "rent for customer Jack Welch exceeds the allowed threshold" or "employee Lucky One has a birthday on February 29th", and it is easier to build an accepted-answer table/file with 10 records than with 50K.

* Each time you discover new boundary or exceptional data, you need to add it to the test database (dump) and to the answer tables/files.

* You could insert the results of the select query into a separate table and compare that to the right-answer table (at least then you can compare at the database level -- no need to dump to a file).

I have no idea how one can handle this conveniently, or whether there is a framework or application available for it.
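
One low-tech way to wire those steps together (just a sketch -- it assumes MySQL command-line tools, a dump file called test_data.sql, and credentials coming from a .my.cnf, none of which are from the posts above):

use strict;
use warnings;

# rebuild the test database from a known dump before every run
my $db   = 'extract_test';     # placeholder database name
my $dump = 'test_data.sql';    # placeholder dump file

system("mysqladmin -f drop $db");                      # ok if it doesn't exist yet
system("mysqladmin create $db") == 0 or die "create failed\n";
system("mysql $db < $dump")     == 0 or die "load failed\n";

# then run the actual unit tests against the freshly loaded data
system("prove t/") == 0 or die "tests failed\n";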

nobody you know
Wednesday, March 05, 2003

Check with the QA department to see if they are using a functional test automation tool like Segue SilkTest.

Most functional test automation tools like SilkTest let you drive the GUI or a DLL and then use their ODBC capability to run queries against the backend database.  They have many features that can be wrapped in your own libraries to do database dumps and compare either in memory against expected results or by dumping to a file and comparing there.  I've done it both ways; which is best depends on the scenario.

That way, if your use case is to add a person, for example, you can capture the person's information in a data structure, drive the GUI with it, and then reconcile it against the database via select statements.

If you're developing in Java, you could do this in JUnit as well, by manipulating the objects directly and using JDBC to check the results.

Steven Tyler
Wednesday, March 05, 2003
