Latency

Brian E Carpenter brian at hursley.ibm.com
Mon Feb 3 15:06:33 CET 2003


I don't think the problem is the idea of writing down requirements.
On the contrary, that is actually needed in any engineering
project. The problem is when we elevate requirements documents to
the point where they are reviewed as if they were protocol designs.
I'm at a complete loss to understand why the one you cite, the
multi6 requirements draft, wasn't published as an informational
RFC more than a year ago. I can only assume it is because of a hidden
desire to turn it into an algorithm (which is tricky, since some
of the requirements are mutually exclusive - a perfectly normal
situation in engineering).

So let's do requirements, and use the waterfall method for complex
problems that require it, but without turning either of them into
rigid systems that block progress.

   Brian

"Ayyasamy, Senthilkumar (UMKC-Student)" wrote:
> 
> > To put it less subtly, in some cases we have gone from "rough
> > consensus and working code" to "fine requirements and forbidden
> > code".
> 
> Very true for many working groups.
> 
> A related quote from Noel in multi6 WG:
> http://ops.ietf.org/lists/multi6/multi6.2003/msg00029.html
> "To paraphrase an old line, "those who can, do, those who can't, write
> requirements documents". On a more serious note, I've really come to have a
> low opinion of requirements documents for what I think are several good
> reasons.
> 
> For one, it seems that people spend inordinate amounts of time creating and
> arguing over them, and we don't have infinite free engineer-hours.
> 
> For another, requirements documents cannot take into account the varying
> costs of providing each of the requirements - which often trade off against
> each other in ways that depend on the exact details of the various possible
> engineering solutions. They also usually don't examine the benefits of each
> requirement - which is really necessary when you start to examine the
> cost/benefit ratio of meeting each requirement.
> 
> To give an example from the current context - when a multi-homed site loses
> one link, do we want to be able to keep existing connections that use that
> link open? Well, it sounds like a useful requirement - but how expensive (in
> complexity, etc.) is it, versus how much benefit will we really see from it (in
> a world in which most interactions use the Web, and most Web transactions use
> short TCP connections, and depend on cookies for long-term interactions)?
> Somehow I don't think a requirements document is going to answer that,
> especially since the answer depends on how much doing it costs - which you
> don't know until you've done the design.
> 
> So requirements documents always seem to turn into either:
> 
> - i) a Procrustean bed, in which all requirements have to be met no matter
> how poor the cost/benefit ratio - and which, moreover, deludes people into
> thinking that if all the requirements are met, then a good outcome is
> guaranteed, which is almost the exact opposite of the truth (recall my
> aphorism that the measure of a great architecture is that it meets
> requirements the designers didn't know about - and that doesn't consider
> other issues, like a design failing because it reaches kitchen-sink levels of
> inclusiveness and consequent ponderousness), or
> - ii) documents that in the end provide only a limited amount of guidance
> anyway.
> 
> Having said all that, it might be nice to have a short (3-4 page) document
> that generates some *brief* discussion (e.g. I still don't know what people
> think about the example above, whether it's necessary to keep connections
> open). However, the massive exercises we've seen too many of in the IETF
> recently are an almost complete waste of time, IMO."

-- 
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
Brian E Carpenter 
Distinguished Engineer, Internet Standards & Technology, IBM 
On assignment at the IBM Zurich Laboratory, Switzerland

