It's the Users, Stupid

Why open, voluntary standards are better than government-mandated standards.

Abstract

The history of data communication shows a number of attempts to make communication between computers easier by having multiple systems converge on a single specification for communication. The attempts include proprietary "industry standards" (SNA, NetBIOS), government-supported and -mandated standards (ISO, ITU, GOSIP), and voluntary, open standards (IETF). This talk gives some views on what happened with the various attempts, draws some conclusions about why things came out this way, and tries to point out what it is reasonable to strive for in the future.

In the beginning, there was SNA

Well, not quite the beginning. But close enough. IBM owned and controlled it, IBM was always first to market with products implementing it, and IBM could make changes to the specification any time it wanted.

Users were forced to pay the price IBM wanted, or live with the inevitable problems that came from running "nonstandard" hardware and software on "clone" computers in their networks. And they had to live with whatever flexibility IBM thought was reasonable - quite a few networks were run under the principle that the (centralized) configurations were changed ONLY on Thursdays; if you got one character wrong, you had to wait until next week's timeslot before you could correct it.
The cost of running SNA was great - both in money and manpower.

But it provided value to the users.

In many ways, the spiritual inheritor of SNA is the Microsoft NT Domain system. Never documented enough for interoperability, hedged about with license fees guaranteed to make an accountant's head swim, and subtly or not-so-subtly encouraging the perception that everything should be done the Microsoft Way.

But providing value to the users.

(note - Microsoft Windows 2000 Active Directory is different. Not necessarily better - but different.)

The Sad Story of the GOSIPs

To those who have forgotten what this was, or are happy never to have learned:
GOSIP stands for Government OSI Profile, where OSI stands for Open Systems Interconnection.

OSI was a project chiefly driven by and through the International Organization for Standardization (ISO), often in cooperation with the International Telecommunications Union (ITU).

Its mandate was to specify a communications functionality that would be independent of the vagaries of a single vendor, and fulfil the requirements for communications in the computer age. In some ways, this was the response of the rest of the world to the dominance of the IBM SNA networking architecture in the computer marketplace.

Its core product and architecture was the famous seven-layer model, published in 1978.

The standardization process then started climbing up the layers - standardizing layer 2 (HDLC) in xxx, layer 3 (X.25) in xxx, layer 4 (OSI Transport) in xxx, and the Session layer in xxxx.

About this time, two interesting things happened:

The result of the confluence of those two factors was catastrophic.

The standardizers became convinced that they had won the war, and could do what they thought was "right" - which in many cases involved adding more bells and whistles, making compromises, and spending huge amounts of time on specifying test suites and conformance statements - and never, ever removing a feature just because it was useless.

The equipment vendors became hysterical about having "checkboxes" on their product marketing material saying that they supported OSI - and "implemented" it by finding some software that could be claimed to do the job, as cheaply as possible, rushing it into the field with minimal testing ("wink, nod - nobody sane will turn this on anyway") - while going right on pushing their proprietary solutions as "the thing you should use if you REALLY want to get things done".

This then led to a situation in the field where:

- the OSI software that shipped was of poor quality, and rarely interoperated
- the users who tried to turn it on got burned, and went back to the solutions that worked

Of course, this didn't help the adoption of OSI software at all.

The Internet Infection

At the same time, unbeknownst to the people mentioned above, a phenomenon was spreading.

This did not come with promises of eternal bliss down the road, nor did it come with requirements of buying certain types of systems - you could run "anything as long as it ran Unix", and quite a few other system types.

And the beauty of it was that it delivered functionality now. Email to the guy next door, file transfers across campus. If you got through the hurdles of connecting to the "Arpanet" (later transmogrified and extended into "the Internet"), you could exchange email with other users, and access the filestores they had put up for public access. Grotty and crufty and generally looking like a high school science project - but it worked. And it delivered value. Now.

The developers of the core Internet standards were not interested in striking it rich. And they were not interested in maintaining control. Their main concern was that they wanted a network they could use for other things - if it worked, they liked it; if it didn't, they didn't.
And they liked having more people to communicate with over the network.

It was not long before people started making a business out of things - Cisco Systems was established in 1984, the first commercial ISPs were established in 1990. But that initial impetus towards communication as a value in itself was enough to color the user experience of the whole revolution in communications that we now term the "Internet": by the time the Internet intruded upon public consciousness, the principle of openness was so well established that even when giant (for their time) proprietary networks like AOL or Compuserve connected to the Internet, it was the Internet norm of "everyone can talk" that prevailed, not the walled garden norms of the proprietary networks.

The Internet took its time about formalizing its way of making standards. It was only in 1992 that the current process for developing standards got documented (RFC 1310), formalizing and building on far earlier traditions. But from very early on, the Internet standards process was built on three principles that made it different from other efforts - just how different, it took some time to understand:

- anyone could participate - there were no membership requirements or fees
- the standards documents were freely available to all
- nobody certified conformance - no test suites, no conformance marks

The result of the first two is fairly obvious - participating was easy, so the people who felt that they could contribute did, making more smart people available for the standards effort. But the results of the third warrant some mention.

For one thing, the fact that nobody was handing out "conformance marks" for IETF standards meant that some really cheap software got onto the market. Some of it worked, some of it didn't - and when the users were mad, they could check the standards for themselves, point out the nonconformance, and either get it fixed or switch software. And if the standards were nonfunctional, they could show up at the IETF and yell at people.

Another effect was that people adopted Internet software because they found it useful. The Internet itself was perhaps the greatest lure - but there were also huge deployments in isolated environments where the inclusion of the base standards in the BSD-derived UNIX workstations meant that the threshold for getting from "nowhere" to "useful" was rather low.

So compared to the OSI situation, we got a completely opposite effect:

Is it any wonder the Internet won?

Lessons for the future

Standards, like any other effort that requires money to power it, have to be made so that they produce value for investment.

The value of a standard is not first and foremost to the companies that produce the products that implement the standard - it is to the users that take up the products and services enabled by the existence of the standard. Product vendors benefit indirectly - because the users gain more benefit from their products, the users will demand more of them.

The value of a mandate to use a particular standard is negligible. If the standard produces real short-term benefit, people will use it with or without a mandate. If it produces no short-term benefit, the effort to get it used will consume huge amounts of energy, effort and investment that could have been far better spent on making sure there are useful standards available.

Openness allows the feedback loop of the standards process to be shortened - the users can talk to the people making the standards - or can even BE the people making the standards. This produces more useful standards, which is in the long run a benefit to all.

Users and vendors: Open standards are good. It's YOUR responsibility to make them work right.

Governments: Don't ruin them by helping too much!

Onwards!