Update to clarify combining characters
jefsey at jefsey.com
Wed Apr 23 14:46:50 CEST 2014
At 09:33 23/04/2014, John C Klensin wrote:
>--On Tuesday, April 22, 2014 23:58 +0200 "J-F C. Morfin"
><jfc at morfin.org> wrote:
> > John,
> > it seems that all this is becoming political and economical
> > issues totally out of the IETF end to end and my fringe to
> > fringe scopes.
>While parts of it, to my surprise and disappointment, seem to be
>getting worse (with the ICANN "variant" effort perhaps heading
>the list), IDNs have always been driven by politics (especially
>if that term is used in a broad sense). They exist at the
>boundary between the DNS, which was not intended for end-user,
>natural-language, identifiers and a whole series of requirements
>that arise as soon as one wants to accommodate reasonable
My understanding of the situation is pragmatic. We had a sunrise
period until 1978, with mainly a lot of work (US, UK, Spain R&D) and
two visions: Norman Hardy's first (Tymnet) and Louis Pouzin's (Cyclades).
Then LaRoy Tymes and Joe Rinde took the path of multi-technology at
the fringe (smart interfaces), and Vint Cerf at the edge (ASCII
content). Tymnet was capability oriented; the Internet, Unix/IP
superuser oriented. Tymnet was byte oriented; the Internet, datagram
oriented. Tymnet deployed the international network and was acquired
by the industrial establishment, which clumsily imposed its global network.
This period was characterized by a stable structural element: USG
oversight, the FCC for Tymnet and the DoC for the Internet. This
gave a hierarchical flavor to the distributed underlying catenet plus
protocols (let us say hardware plus software) but prevented much
attention to brainware (except maybe by the NSA). With the NTIA
distancing itself (a demand to consider things as they individually are),
this period is ending. It has a huge legacy, with pros and cons we
cannot change. This means that we are, in mathematical terms, at a
self-organized criticality point. From an agoric point of view
(Norman Hardy's term for network/market multi-logic), if we
cannot change the past, we can inflect the network of the
potentialities it has created.
IMHO the key nodal factor in the internet architecture is IDNA2008,
because it shows how the "real internet" (the one which also
addresses RAM and other issues) addresses diversity, i.e. the
multi-something (in this case multilinguistics): (1) it works, (2) our
various architectural thinking came to a consensus over it, (3) it
completes the IEN 48, RFC 1122, RFC 1958 and RFC 3439 architectural
documents, (4) it can be tested against the pragmatic rules of
ICANN's ICP-3, and (5) it is within the reach of the non-commercial
contributors also called for help by RFC 3869, alongside Governments.
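To make the IDNA mechanics under discussion concrete, here is a minimal sketch of the Punycode step that turns a non-ASCII U-label into an ASCII-compatible A-label. Caveat: Python's standard library only covers the Punycode transform (and an IDNA2003 codec); full IDNA2008 processing needs a dedicated library, so this illustrates the encoding step only.

```python
# Minimal sketch of the Punycode step behind IDNA A-labels.
# Note: the stdlib 'idna' codec implements IDNA2003, not IDNA2008;
# this shows only how a non-ASCII U-label becomes an xn-- A-label.

def to_a_label(u_label: str) -> str:
    """Encode a single Unicode label as an xn-- A-label (Punycode)."""
    if u_label.isascii():
        return u_label  # ASCII labels pass through unchanged
    return "xn--" + u_label.encode("punycode").decode("ascii")

print(to_a_label("bücher"))   # classic example: xn--bcher-kva
print(to_a_label("example"))  # pure ASCII is left as-is
```

The A-label is what actually travels through the DNS; the U-label exists only at the fringe, in the user-facing presentation, which is exactly the layering point being argued here.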
> and tensions between global and localized identifiers (I think
> your fringe to fringe scope ideas fall into the latter category).
Yes. It is the location (RFC 1958) of the missing presentation layer
six. But it has to be on both the network side and the user side. I feel
that IDNA2008, the IETF post-IDNA2008 work, and the IAB responses to my
appeals have by now well documented the network side. One would need a
single consolidated network-side document, which the IUCG would have to
complete on the user side, and which we could amalgamate into a
smart-use, fringe-to-fringe, intelligram-oriented architecture and
interoperating system, based upon the existing robust and proven
end-to-end datagram service.
>IMO, given that we are constrained by the limitations of a DNS that
>was designed for a different set of requirements, IDNA represents
>about as good a balance and set of tricks for getting around those
>limitations as we are going to get (and probably as is
>possible). The tradeoffs could have been made differently, but, at
>best, only by trading one set of issues for another.
Let us come back to the consistency of Vint's IEN 48. Joe delivered,
as the bundled capacities I deployed, what Vint targeted as being
layered. My belief is that the architecture is here, now thirty years
but also billions of users later. Let us not touch it; let us try it.
>We know how to solve those problems (at least "pretty well") but the
>solutions require a different naming and name-resolution
>environment: either in the form of an "above DNS" naming layer or
>some flavor of DNSng that is designed around natural language,
Yes. But you keep thinking only about the DNS; I am thinking about
naming and architectonics. Your IDNA solution is a paradigm for
everything that is network-diversified: an end-to-end robust core
plus fringe-to-fringe qualified relational spaces.
>everyone who understood the technical limitations of the DNS and who
>had even an elementary understanding of the natural language issues
>has known all along that IDNs would not fully meet the end user expectations.
This is the blessing. The DNS is limited to what it is. It is end-to-end
robust. It is level with the thinking of the billions of machines it
relates with. Rock solid, even ICANN-proof!
>Given those DNS limitations, coming up with a technical solution
>that would meet the requirement to be able to accommodate a
>reasonable range of non-ASCII mnemonics within a DNS context, using
>Unicode, and without causing more problems than necessary was, and
>remains, within IETF scope. Very little more than that is.
In order to protect Milton's idea, I obtained http://dnsa.org and
started a wiki there on the naming issue. It is independent thinking
on the multiple capacities of 0-Z numbered alphadecimal labels. It
includes the DNS and many other things. As you know, with Gerard we
are working with the French Government on a geolinguistic system that
should help clarify the "cc-tags" and "lang-tags" within an open framework.
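One plausible reading of "0-Z numbered alphadecimal labels" is plain base-36 numbering, where the digits 0-9 and letters A-Z together enumerate a label space. The sketch below illustrates that reading only; it is a hypothetical example, not the actual dnsa.org scheme.

```python
# Hedged sketch: base-36 ("alphadecimal", digits 0-9 plus A-Z) numbering,
# one plausible reading of "0-Z numbered alphadecimal labels".
# Illustration only, not the actual dnsa.org scheme.

ALPHADECIMAL = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def to_alphadecimal(n: int) -> str:
    """Render a non-negative integer as a base-36 (0-Z) label."""
    if n == 0:
        return "0"
    digits = []
    while n:
        n, r = divmod(n, 36)
        digits.append(ALPHADECIMAL[r])
    return "".join(reversed(digits))

def from_alphadecimal(label: str) -> int:
    """Parse a 0-Z label back to an integer (int() accepts base 36 natively)."""
    return int(label, 36)

print(to_alphadecimal(2014))     # "1JY"
print(from_alphadecimal("1JY"))  # 2014
```

Such labels stay within the DNS's own LDH (letter-digit-hyphen) repertoire, which is why they coexist with, rather than replace, ordinary ASCII hostnames.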
>Yes. But nothing in what you say contradicts the assertion that
>Eric's examples do not demonstrate that Wabanaki works more
>poorly in IDNA than in Swedish (there are languages that do work
>more poorly), nor Eric's belief that, as I understand it,
>languages and scripts without a major and ongoing ICANN presence
>and willingness to invest considerable resource are better
>accommodated deeper in the DNS tree than at the root.
Correct. I consider two things:
- the equal treatment the internet protocol set gives to ASCII and to "0-Z";
- the intelligence of the Swedish, Wabanaki, French, etc. peoples to get
the simplest, most robust, and most efficient use out of it.
>That discussion was less about economics but still very much about
>policy and politics... long before ICANN.
The very technical question I have is: once the NTIA is unbolted, what
will be the need for its technical interface (ICANN)?