Requirements Analysis -- multiple clients
I'm about to start work on a project to develop a new tool for managing complex data via the Internet. The clients are about 5 academic CS research groups throughout the country. Although the clients work in similar fields, they will almost certainly be expecting something different from the project.
Additionally, I know that each site misunderstands the aims of the project to some degree. The academics are leaders in their fields and not used to admitting that they are wrong -- especially to their "competitors" (the other clients).
I plan on interviewing key players at each site, and trying to extract information with which to write a specification. However, I am worried that there will be very little intersection in their requirements, and that each will expect something quite heroic. While I might be able to satisfy one set of demanding requirements, it's unlikely that I'll be able to satisfy everyone.
What are your suggestions for performing the requirements analysis and specification phases? Do you have any tips on how I can produce a specification that everyone is happy with? How can I insulate myself from the problems associated with a project where the clients have different and possibly conflicting requirements?
Am I ...
... screwed from day one?
Monday, April 19, 2004
LMAO!
Sorry, seriously?
I don't want to generalize too much, so if any of you are from a CS dept. THIS DOES NOT APPLY TO YOU.
To be honest, whatever you do it will not be to their liking since you will not be using (insert advocated experimental methodology|platform|language) of (insert name of group x where x=1..5). No effort of yours will overcome this issue.
Then again, who knows. If this was not about getting the grant money and there is a real interest in the results (e.g. your program will automate some tedious grant preparation work) ...
Just me (Sir to you)
Monday, April 19, 2004
Oh well,
before anyone else mentions it: Yes, I'm having a bad day. I'll try to lay off the Post button for the remainder of it.
Sorry Screwed, you were just some collateral damage.
Just me (Sir to you)
Monday, April 19, 2004
Tell each group to designate a representative that has the authority to make decisions for the group. Get the five representatives in one room and build the requirements together. Take copious notes and/or record the sessions.
Practically, this is the most efficient way to create your specs.
Cynically, this will be your backstop when group B says "why the hell did you do [x]?"
Philo
Philo
Monday, April 19, 2004
Philo,
That's a reasonable suggestion, but it only guarantees that you can PROVE they *should* be happy with the results, not that they *will* be happy with the results.
In my limited experience with this, the problem with talking to a "representative" from a large group (corporation, university, etc.) is that they can NOT represent all the "factions" that will be using the solutions. It's often too political.
My jaded opinion.
Mr. Analogy
Monday, April 19, 2004
Mr. Analogy - to some degree you're right. However, if you have five groups that are at odds with each other, you're either going to spend a year flying back and forth from one to another or you have to get them in the same room. Since getting the *entire* group into a room just Does Not Work, then you get representatives.
The problem he's facing is that he goes to A, they say "we need nanometer resolution on microbe measurements." Then he goes to group B, where they say "we must have microbe measurements in millimeter resolution." So he says "Group A said they need it in nanometer resolution." Group B then says "What the hell are they talking about? Why would they need nanometer resolution? That's crazy talk."
Now what does our fearless leader do? This kind of situation is going to happen a dozen times a day; by the end of the fifth group it will be impossible.
BTW, I still advocate heavy use of iterative development - get bits into their hands as early as possible so the *whole* group can give you feedback. But as far as requirements go, IMHO this is the best, most workable solution.
Philo
Philo
Monday, April 19, 2004
OK, what about I:
1. Visit each site, present my take on the project to each group. Then, when I get it wrong, I'll note what they focussed on -- I'll take that to be that group's "deal breaker" issue. These will form my idea of what needs to happen for the tool to be accepted.
2. I'll ask each group to nominate a representative, and ask them to talk to their colleagues and put together a 1-page set of requirements for the group.
3. Get all representatives in a room, and go through the set of requirements. We can then prioritise them, identifying a core set of common requirements. At the end of the meeting, I'll get each person to sign off on the core set of requirements.
4. When I develop the tool, I'll build in a plugin architecture that will allow me, or each group, to develop plugins for the tool to satisfy non-core requirements.
I like the idea of iterative refinement -- I might go for an XP-ish approach where I release on a regular schedule and let feedback form the specification for the next phase.
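The plugin idea in step 4 could be as simple as a registry the core tool owns, with each group supplying callables for its own non-core needs. A minimal sketch (all names and the "resolution" example are hypothetical, not from the actual project):

```python
# Minimal plugin-registry sketch: the core tool exposes a registry,
# and each client group registers plugins for its non-core requirements.

class PluginRegistry:
    """Maps a plugin name to a callable supplied by a client group."""

    def __init__(self):
        self._plugins = {}

    def register(self, name):
        """Decorator: register a plugin function under the given name."""
        def wrapper(func):
            self._plugins[name] = func
            return func
        return wrapper

    def run(self, name, data):
        """Run a named plugin; unknown names fall back to a pass-through."""
        return self._plugins.get(name, lambda d: d)(data)


registry = PluginRegistry()

# A group-specific plugin for a hypothetical non-core requirement:
# one group only wants records at nanometre resolution.
@registry.register("group_a_filter")
def filter_nanometre(records):
    return [r for r in records if r.get("resolution") == "nm"]


data = [{"resolution": "nm"}, {"resolution": "mm"}]
print(registry.run("group_a_filter", data))  # only the "nm" record survives
```

The point of the registry is that arguments between groups stay out of the core: each group's "deal breaker" lives in its own plugin, and the core only has to agree on the data it hands them.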
... screwed from day one?
Monday, April 19, 2004
The iterative approach is probably what will save you.
Talk to them all. Put everything they tell you together and decide what's the minimum system you can provide that will give them something they can use (not necessarily everyone's wish list, just something useful).
Make it clear to them that all the other features they've asked for are planned for a subsequent release. Add features and make more releases until the money runs out.
At least that way they'll have something that works and some of them might even be reasonably satisfied. If you try to make everyone happy with release 1.0 you'll blow the budget and never deliver anything.
Tom H
Monday, April 19, 2004
When dealing with people in an academic environment:
Remember that the people at the 'top' can't force anything on anyone. You could implement all of their 'must have' wishes, and the people actually working in their groups could ignore what you do.
If at all possible, phase the project, and find people at each site who would use your tool if it were available. That way you can start small, and see if anyone uses the product. Academics are used to putting what is essentially a small amount of effort into a larger context to explain why it is valuable. Most of them will be used to the idea that you present a large vision of the problem you are solving (without specifics), but get to the end result in small steps.
When finding people who will use your product within each group, in my experience (I deal with people in biology and related disciplines) you have to distinguish between what they consider 'nice' and what appears essential for them to carry out their work. Most of the work is carried out by graduate students and postdocs, who have their own agendas and personal schedules. They will only use something essential, and only if it is timely. That is, on a several-year project, there might be a window of perhaps 6 months when a particular tool is of interest. Someone might tell you that they need 'x' now, and when you bring them 'x' in six months, they've moved on to another phase of their work and have no further interest.
A good way to find out where your work fits in is to read a few papers from the lab (remembering that the work is generally several years old) and a few grant applications (you can get the abstracts off the Internet for some granting agencies.)
Remember the saying that a university is a collection of anarchists united by a common parking lot.
Good luck. I've been writing software for academics since the late 1960s, and it has always been fun.
Dan Brown
Monday, April 19, 2004
Another tactic to use is to try to get these folks focused on the problems they are trying to solve and the characteristics of a sound solution. If they can agree on this, you'll get a lot less variability in the details of the solution you'll eventually implement.
another two cents
Monday, April 19, 2004
You need to publish peer-reviewed articles discussing the problems that need to be solved. Since you've got a deadline, keep discussion down to two years.
When that's settled down, submit grant applications to multiple user bodies and wait another 12 months. Then appoint review panels and finally advertise for some post-docs.
Monday, April 19, 2004
Put together a core. (Yea discuss it but keep it clean and coherent.)
Let the groups customise things in ways that don't make your job too difficult.
Offload the points of argument to each group and say fix it yourself.
Harvest their really good ideas and put them in next version of the core.
(My tuppence.)
Mike Gale
Monday, April 19, 2004
Ditto on Tom H's response.
Frogmoses
Thursday, April 22, 2004