Fog Creek Software
Discussion Board

Requirements vs Goals

"The failure of IT organizations in the U.S. to effectively service the needs of the business community has reduced the perceived "value" of IT. Because IT services are not valued, the natural response is to look for the cheapest provider of services (Third World) making IT services a commodity.

The root of the problem (as I see it) is IT's reactive approach to service delivery. Frequently, IT responds to whatever is requested by the business or blindly follows technical trends without a thorough evaluation of long-term impacts. In other words, IT is not proactively adding enough value to the decision-making process but simply reacting to requirements or trends. Business executives share in the problem by demanding immediate response to their needs without sufficient time for analysis or justification. "

Anyone besides myself remember when one of the goals of IT groups was to determine requirements instead of sitting back waiting for them to be handed to them by the customer on a silver platter?

I'm thinking that "developer's bill of rights" that stresses the right to clear, concise etc. requirements is hurting us more than it is helping.


Joe AA
Thursday, July 17, 2003

If I was a consumer of IT, I would be used to expecting projects that were:  Late, Over budget, and Not what I wanted.  Then I would expect the IT people to blame ME for the failures.  I consider this group to be relatively business-aware, but still people vociferously defend lousy development practices.  Check any thread about schedules, usability, estimates.

Johnny Simmson
Thursday, July 17, 2003

I sort of agree.  I once worked on a big project where the management asked for a "very simple" app to perform a job.  And, so, I went to work, and came up with a very simple app to do the job.

And you know what?  It was a failure.  It was too hard to use, because the very simple interface didn't have enough bells and whistles to make the app usable.

I wish now that I'd gone in knowing what the goal was ("make an application to do X that our clients will love to use and pay us lots of money for the privilege") rather than just blindly following the stated requirements.

Much as we all like to whine about management not giving us good specs, I think the truth of the matter is that management doesn't really know what it wants except in vague terms, and if we want to consider ourselves "professionals", it's our responsibility to figure out the right way to fulfill their dream.

To put it more trollishly, if you're just going to demand a high salary for being an introverted geek with no knowledge of the world outside your dual 21-inch LCD screens, then yes, you should be replaced by Indians and Russians who can do the same damn thing for much less.

Foolish Jordan
Thursday, July 17, 2003

Determining management's goal from the task given to the developer is not easy all the time. Some question-and-answer sessions and introspection are required.  Some sort of "extrapolation guideline for the developer" material would also be helpful.  Even the person who flips the burgers knows, after a while, what the ultimate goal of the burger is.  In the programming world that may not always be the case, as the requirements vary from case to case.

You can equate this to many other situations. If you are asked to get milk, for example, you would also buy butter if you think it's necessary for your home.  A wholehearted, all-embracing approach toward the business of the company can put your effort in the right direction and earn you respect beyond a developer's scale.

Thursday, July 17, 2003

Perhaps, but you may wish to consider that IT has a history of being in the "back room."  That CIOs were an invention of the 90s, not the 70s.  That IT's place in the boardroom was getting the projectors running, not being a partner.

My view on this is the opposite of yours.  IT appears to have little business value because:
- It was not understood by many until recently
- Its success is judged by the very people who determine the outcome.

The first one has changed a lot in the past 10 years, but the last one has only gotten worse.  In what other industry is an expansion of scope or requirements expected to occur without an increase in cost? Yet, in a typical IT department the discussion goes like this:
Business Unit: I need you to add FooBar access to the FOO project.  Can you do that?
IT: No problem.  It should take about two weeks.
Business Unit:  Two weeks?!! I don't have enough in my budget for two more weeks.  You need to figure out something else.

In many ways it is like blaming Porsche engineers for not producing a 911 for under $10,000.    The perception we fight daily is that requirements are zero cost items.  That "bits" are free, because a widget is not turned out at the end.

The issue is not requirements gathering, it is business management.  Business has become dependent on IT, and many businesses now force departments to pay for IT.  The "just put a field on the screen" has become "Three days of work to put one field on a screen?!  I can only budget one day.  You need to figure it out."

Sorry, I believed "the customer is always right" for a long time.  Sadly, in this case, IT has become the mortician.  Even when we do a good job, people don't like paying for it.

Thursday, July 17, 2003

Joe AA wrote, "Anyone besides myself remember when one of the goals of IT groups was to determine requirements instead of sitting back waiting for them to be handed to them by the customer on a silver platter?

I'm thinking that "developer's bill of rights" that stresses the right to clear, concise etc. requirements is hurting us more than it is helping."

Anyone here old enough to remember when corporate IT departments actually employed programmers, system analysts, DBAs, project managers, ...?

Corporate IT departments have changed dramatically in the last 15-20 years. The typical pattern of change has been as follows:

* Lay off or fire a number of IT employees and require those left to take on more responsibilities

* Outsource work to American-based consulting companies while at the same time laying off more of the technical staff

* Outsource work to offshore companies while at the same time laying off even more of the technical staff

* Every 3-4 years get rid of the consulting companies you are doing business with and hire cheaper ones.

At many corporations, the only IT staff members left who are actually employees of the corporation are managers. Many of them have little or no software development experience under their belts. These people are good at playing corporate political games and many of them are only interested in climbing the corporate ladder as fast as possible.

One Programmer's Opinion
Thursday, July 17, 2003

These sorts of discussions need to define their terms better.

People say the "IT industry" has failed, when what they really mean is they've been screwed by aggressive outsourcing companies. So, point 1, outsourcing companies are greedy and often not very good. Agreed.

They say deadlines are missed or quality is poor. But deadlines were set by managers without professional expertise in software. Projects are often written by the cheapest code-monkeys the above managers can hire. 

If surgical teams were led by MBAs, we would see the same problems in the "medical industry."

Friday, July 18, 2003

Definitely... a lot of things have changed over the last 30 or so years.  And I am old enough to remember when IT departments only employed programmers... before our job was fragmented into system programmers, system analysts, DBAs... architects... etc.  I'm not really fond of excessive specialization.

So despite all of the typical causes of IT failure - meaning ladder-climbing managers, low estimates, politics, new or old technology usage, lack of proper tools, yada, yada, yada... I am presented with something along the lines of the following questions/statements from my co-workers:

o  "They" have to tell us what to do.
o  Who has to make that decision?
o  The user didn't tell us to change that...

I'm not talking about great, significant decisions - things that would increase scope, take more money, or require more people.

I'm talking about simple things...

like a data field increasing in size... and the decision to change the size of the field on a screen.

like a design for a code conversion lookup table - does it need to maintain history by date or not, for code values that are set in concrete?

Things... that once upon a time used to be common sense decisions... now have to be endlessly discussed, blessed, signed off.

I don't know.  I think I see it as an unwillingness to assume responsibility for the product being created.

Maybe we've been beaten up too much to care any more.

Joe AA
Friday, July 18, 2003

I think the fundamental problem here is that it's almost impossible to get the *real* requirements when first talking to a customer.

Think of the number of times your requirements haven't matched what the customer finally wanted.  Then think of the number of times the requirements *have* matched the customer's desire.  From what I read, the former far outweighs the latter.

Doesn't this suggest that this methodology is broken?  Doesn't this suggest that IT should stop trying to identify every specific requirement at the beginning?

There are alternatives.

The Pedant, Brent P. Newhall
Friday, July 18, 2003
