Fog Creek Software
Discussion Board

Show me hard data...

I've been waiting to hear other comments on the most recent book of the month recommendation, but haven't seen much other than the "wawa" bit.

The "hard data" bit stood out for me.  Data can fool you.  Big time.  If you have data, even good, hard data, at most you can demonstrate a correlation.  What's really lacking is an explanation: a chain of cause and effect that leads from what you are controlling (whether you do wawa or not) to the measured impact (people quitting or the stock booming).  The explanation should also allow you to deduce other necessary outcomes that you can use to test its validity.  This is what separates practitioners of voodoo from practitioners of science.

Constructing correlations seems to be built into the way we think, so it's natural.  We all do it.  It's a fallacy, though, to treat this as science or to trust it as you would real science.  Any sixth-grade science class that's covered the scientific method should laugh down ALL of this data, hard or otherwise, without a theory to explain it.

so there.

Manager's Bane
Wednesday, November 19, 2003

After reading the first chapter (I got my copy in the mail yesterday), it's not "hard facts."  As we've discussed before on this board ad nauseam, it's extremely difficult to do studies that can correctly test the effect of different tools and approaches in software.

Each section in the book (for each fact or fallacy) is structured like this:
Fact/Fallacy: ...
Explanation: ...
Controversy over it: ...
References: List of books and quotes that support the fact or fallacy.

No real "hard data" per se like you could find in an FDA study, but enough to give you a good idea that it's right or wrong.

Andrew Hurst
Wednesday, November 19, 2003

Is it worth reading?  In his review, Joel mentions that the book is "...the best summary of what the software profession should have agreed upon by now."  He then gives four illustrative bullets, three of which I disagree with.

interested
Wednesday, November 19, 2003

"...three of which I disagree with."

Well don't leave us in suspense, old chap.  Do tell.

.
Wednesday, November 19, 2003

As I generally respect what Joel has to say, even when I disagree with him, I put an order in for the book yesterday (http://chapters.indigo.ca has it for $32.89 CDN, minus 5% if you have their loyalty card, with free shipping, which is a pretty good deal when Amazon has it for $29.99 US [38.99 US]). While I was there I also ordered Steve McConnell's new book "Professional Software Development".

Because I went with the Canadian supplier, obviously Joel missed out on the referral commission for this, which I apologize for. Having said that, am I the only one who finds referral tags in links to Amazon (et al.) somewhat unsavoury when they come from respected figures? I'm not religious about this, but it's sort of like going to your friend's house and he pulls out the Amway...

Dennis Forbes
Wednesday, November 19, 2003

Manager's Bane,

You missed Joel's point. He didn't say we should get some hard data and then we would know the truth. He said we don't have hard data, only years of anecdotal wisdom, because that's all we have about anything, most of the time.
Scientific research is a difficult art, and determining causality requires years of controlled experiments and/or complicated statistics. Even then the conclusions are usually at least partly wrong and almost always incomplete.

In deciding what method to use in developing software it might help to collect lots of data and try to separate out variables and get some vague idea of what works and what doesn't.

The Real PC
Wednesday, November 19, 2003

Real PC,

Thanks for the reply.  I'm not sure I was arguing with Joel.  Actually, I think I was just saying that the problem is even worse than what was described.  Even if we have "hard data" we're not really in a much better position than if we rely on anecdotal evidence.  It's just a more sophisticated form of voodoo if it's not coupled with a theory or model of how the system works.

Cause and effect is sometimes (not always!) hard to establish, as you observe.  Does that mean we shouldn't try to understand, though?  And there's nothing wrong with incomplete or limited theories, provided you are aware of the context in which they are valid.  A lot of the time and expense involved in scientific research is spent trying to make local theories more universal.

There are good methods for trying to make improvements based on data gathering.  Manufacturers very effectively use tools like Design of Experiments, Statistical Process Control, and all the crap that comes with applying Six Sigma to your processes.  Even Deming, one of the fathers of all these things, noted that a necessary condition for using these tools correctly is what he called "profound knowledge" of the system.  Without it you are doomed to marginal results, at best.

Manager's Bane
Wednesday, November 19, 2003

Interesting that you mention the bullet points; it seems to me that reuse-in-the-small required free software and the internet.  Reuse-in-the-large is achieved by everyone having a copy of Windows or some popular framework.

I agree with the other two, especially since "adding people to a late project" is open to interpretation.  I'll take it to mean the reflex of adding warm bodies.

These last books are useful for clubbing flamewarriors over the head (Book of the month CLUB).  But they probably don't say many new things to this crowd.  Something racier would be great next month.

Tayssir John Gabbour
Wednesday, November 19, 2003

The book doesn't claim to be saying anything new.  In the intro he says most of the stuff is rather old.  The other title for the book was going to be "55 Frequently Forgotten Facts and Fallacies about Soft. Eng." if I remember correctly.  It was the "frequently forgotten" that he was going for.  I find the book entertaining, and it will be a reference for when I'm talking to customers, that's for sure.  Especially the parts about changing requirements and changing the estimates.

Andrew Hurst
Wednesday, November 19, 2003

[Cause and effect is sometimes (not always!) hard to establish, as you observe.  Does that mean we shouldn't try to understand, though?  And there's nothing wrong with incomplete or limited theories, provided you are aware of the context in which they are valid.]

That is true. I just meant to say that whether you rely on anecdotes or hard data, your conclusions will be imperfect. Controlled experiments are supposedly the best evidence, but very often they do not test exactly what they were meant to test. Correlational data is very hard to interpret. Of course this is why people devote their lives to research. Expecting science to give clear simple answers is a mistake.

The Real PC
Wednesday, November 19, 2003

Robert Glass is a GOD.  Over 20 years ago I read a collection of tiny paperbacks he authored with names like "Software Projects Which Failed", "More Software Projects Which Failed". They taught me more about the people and practices of software than anything I've read since, including the oft-mentioned "Peopleware", etc.

I think all of these early books are out of print. I ordered this new book as soon as I saw it mentioned on JOS, and I don't care what it cost, anything by Glass is worth the money.

HeWhoMustBeConfused
Wednesday, November 19, 2003

I took the free shipping and ordered from Bookpool, so it will be another week before I can comment :-)  Looking forward to it, though... even if I already know most of the things in the book, it is always worth a little money and time to save much more over the course of my career.

Scot Doyle
Wednesday, November 19, 2003

If you asked somebody the question "Do you think it will hurt if I poke you in the eye with a stick?" that person would probably say "Yes".

There's no hard data to justify the response though.

Realist
Thursday, November 20, 2003
