Fog Creek Software
Discussion Board




why n why

Why would a software manufacturer sacrifice quality in a rush to get a new product to market?

dunk
Tuesday, July 29, 2003

Because sometimes the math says that delivering late will cost more than delivering sub-optimal quality on time.

In a software project the main factors are:
- Time
- Money
- Functionality
- Quality

You cannot optimize on all of them, and because of that software contracts are usually bound to one or two of them. Software developers will favor a functionality/quality optimization. Business managers will favor time and money, things they can measure (forgive the generalisations).
It is the good ol' tug of war mentioned so often on this forum.

Do not complain about it, deal with it. Be aware of it and negotiate a deal that satisfies *all* needs to a satisfactory level.

Geert-Jan Thomas
Tuesday, July 29, 2003

Because many clients would rather have something that works NOW, even if it has bugs, than something perfect after a few months.

For example, many people criticize Windows 1.x - 3.x for being unstable, etc.

But the product was available years before Linux with a graphics interface.

And people wanted a GUI, even if it crashed. They didn't want to wait 10 years before having a GUI.

So, Microsoft succeeded.

John K.
Tuesday, July 29, 2003

If the product is revolutionary, or unique in some way, by all means rush to get it to market.

Once you have it on the market, start work on a very well designed version 2.

Rhys Keepence
Tuesday, July 29, 2003

John, history check:

X-Windows on Unixes, Mac OS on Mac (and even on the Lisa!), Intuition on the Amiga, GEM on Atari ST and PCs were all more stable and more usable than Windows at the time. (Xerox STAR predates them all, but it was before the age of "personal computing" as we know it today).

It's not availability (or lack of it) that won Microsoft that war. It probably has much more to do with them being the supplier of DOS and having better access than anyone else to hardware vendors (not to mention [ab]using that access).

He-who-would-not-be-named
Tuesday, July 29, 2003

Because if he waits long enough, he is sure to be doomed like Netscape (Does anyone remember it???)

'Better buggy than late'

The One You Loved (TOYL)
Tuesday, July 29, 2003

> If the product is revolutionary, or unique in some way, by
> all means rush to get it to market.
>
> Once you have it on the market, start work on a very
> well designed version 2.

No. That should read

If the product is revolutionary, or unique in some way, by all means rush to get it to market AND LABEL IT VERSION 2

Once you have it on the market, start work on a very well designed version 3.

Ignore my ignorance
Tuesday, July 29, 2003

All the firms that waited for their software to be perfect before shipping went bankrupt waiting. Survivors ship bugs. The goal is not to get something perfect, the goal is to get products out the door that will be bought by the market at a price point that brings you net positive revenue.

Just me (Sir to you)
Tuesday, July 29, 2003

> All the firms that waited for their software to be perfect before shipping went bankrupt waiting <

Joel: Are you going bankrupt soon, or was your software loaded with bugs?

Ignore my ignorance
Tuesday, July 29, 2003

Customers can frequently put up with quality issues in a product, but they obviously cannot put up with the product not being in the marketplace.

pb
Tuesday, July 29, 2003

I think there must be a threshold of usability/quality below which it is not worth releasing. The world is not black & white, you know...

I have seen (and used) some software that was clearly rushed out the door and was very crappy. I simply disregarded it and never bothered with version 2 or 3.

But, yes, if there are just a few bugs that may affect, say, 10% of your customers, but would take months to fix, I think it is better to release anyway and ship the fix as a patch or in a later version.

It is always about weighing the pros and cons...

whatever
Tuesday, July 29, 2003

Also, time to market can often give a significant competitive advantage.  If you know that another company is working on a similar product, it is usually very much to your advantage to ship first, even if your product is of lower quality.  If this is software people have been waiting for, they will often buy the first available product that meets their basic needs, even if another which promises to be better is coming in the near future.  This is a perfectly reasonable decision for a consumer to make, both because it lets them get their work done sooner and because they cannot be sure of the quality of the future product.

So there are times when shipping even a very buggy product soon is much better than shipping a high-quality product later.  This is sort of unfortunate, but that's the way it is.  Of course, the higher-quality product will almost certainly have higher customer satisfaction and a longer shelf life.  But those things only matter if you survive the initial release.

Mike McNertney
Tuesday, July 29, 2003

I think the problem is that by the "rush to market instead of quality" stage, it's too late to be doing any decision-making on the issue. Generally the "rush to market" happens because the project is behind schedule and there are contracts at stake or the company needs money. The PROBLEM is that they tried to put too much in 1.0.

Again, a pitch for the "iterated development" concept behind XP - instead of trying to deliver Word 97 in the first package, start small, deliver, build, deliver, add features, deliver...

Not only does this allow you to focus on quality, but by keeping the development effort focused and low-key, you reduce burn rate.

Philo

Philo
Tuesday, July 29, 2003

> For example, many people criticize Windows 1.x - 3.x for
> being unstable, etc.
> But the product was available years before Linux
> with a graphics interface.

Linux 0.1 was announced 3 months before Windows 3.0 was released, much less 1.0 and 2.0.

I fail to see how this is a relevant comparison. 

Alex
Tuesday, July 29, 2003

> John, history check:

> X-Windows on Unixes, Mac OS on Mac (and even
> on the Lisa!), Intuition on the Amiga,

In those times, Unix did not run on personal computers, but on machines more expensive than God.

Mac OS also ran on very expensive machines.

So, the market Windows attacked was the market for a GUI on PCs.

A GUI which runs on a machine more expensive than God does not count as competition for Windows.


> GEM on Atari ST and PCs were all more stable and
> more usable than Windows at the time.

Allow me to disagree. I have used GEM on the PC and it was painful!


> It's not availability (or lack of it) that won Microsoft
> that war. It probably has much more to do with them
> being the supplier of DOS and having better access
> than anyone else to hardware vendors (not to
> mention [ab]using that access)

There is also the fact that MS wrote lots of good software for Windows, including Excel and Word.

What stopped DR from doing the same thing for GEM?


> Linux 0.1 was announced 3 months before
> Windows 3.0 was released, much less 1.0
> and 2.0.

Windows was a GUI for the PC.

Linux was a Unix-like OS which, at that time, ran (reasonably fast) only in text mode.

So, comparing Windows 3 with Linux 0.1 is a comparison between a banana and a hippopotamus.

My point was different, however:

At the time when Windows 3 appeared, people wanted a good GUI for the PC, and Microsoft gave it to them.

I believe that at that time it was NOT possible to give them a GUI which ran reasonably fast (that is, not painfully slow) and which was also stable.

You had to pick only one: stable or fast.

And I believe "fast" is what most people would have picked at that time.

(Please forgive my poor English. I'm VERY tired and English is not my native language.)

John K.
Tuesday, July 29, 2003
