The need for smaller protocol specifications
Jari Arkko
jari.arkko at piuha.net
Thu Jun 12 11:42:56 CEST 2003
Charlie et al, thanks for raising this issue. I believe
large specifications are a significant problem in the IETF
currently, and good points have been made in the discussion.
I'd like to add a few thoughts about why we are ending
up with large specifications. The discussion has mainly
concentrated on the IESG's requirements, but in my
opinion there are other significant causes as well.
In fact, I wouldn't blame the IESG for this at all...
[Let me add that I have been personally involved in the
creation of several too-large specifications, so I know
what I'm talking about ;-) ]
o We (as in all of us) tend to be too optimistic about
the complexity of our projects at the beginning.
We are not very good at estimating how big a specification
will eventually need to be once it handles all error
cases, security, etc. Still, the document structure
generally stays fixed from the -00 document onward.
If you had known that what looked like an easy idea
for a 10 page -00 I-D was going to be 100 pages in
the end, you might have wanted to split the document
and protocol into subfunctions from the start.
o A perceived need for speed at the beginning. We can't
split the document because that would slow us down
and require multiple WG LCs, etc.
o Working groups generally try to appear as attractive
as possible to potential customers of their
specifications and other stakeholders. For instance,
there's a need to show that the WG is needed in the
first place, because all these wonderful functions
need to get done. Or an existing WG wants to ensure
that it doesn't "lose" against a competing approach.
o Similarly, if a "beauty contest" is run between two
approaches in the WG, both approaches end up having
a lot of features.
o The requirements processes that we nowadays tend to
run don't deal well with the prioritization of features.
If someone can invent a requirement, it generally
gets fulfilled in the solution. We don't apply
cost-benefit analysis enough.
o Our quality control functions ensure that specifications
are complete (covering all error cases, multi-version
support, different network scenarios, security, and
scalability).
So, it looks very much like all of us are trying to do
a perfect job. The trouble is, if you multiply all
requirements by all network scenarios and by completeness,
you get a large spec.
So what should we do about it? I think the right thing
is to take a serious look at whether all requirements
really need to be fulfilled, to use modular specification
approaches, and to use roadmaps / architecture documents
to show interested parties that you will eventually
support all components. And for whatever we do produce
in the smaller specifications, I think it's fine to
require it to be complete in the sense of not breaking
the Internet, handling all error situations, having
security (even key management - it's often very
application specific), and so on.
--Jari
More information about the Problem-statement
mailing list