More detail: a sketchy idea for expressing zone policy
Shawn.Steele at microsoft.com
Wed Dec 9 19:17:23 CET 2009
>> Another problem is that it doesn't help when processing disconnected
>> strings. If I wanted to validate a URL in an address book, I'd have
>> to look it up.
> Hrm. Isn't that true today? If I give you a URL, don't you have to
> check in the DNS to see whether you get NXDOMAIN?
Sort of. Most editors let you type http://whatever without actually checking whether it goes anywhere. OWA just highlighted http://whatever when I typed it, for example. So there are many cases where software currently touches a domain name without actually hitting the DNS, and I'm not sure how those cases are affected by the per-zone idea. Similarly, the current architecture makes IE try to figure out what form of the name it should pass to the low-level name-resolution APIs. So IE is trying to apply the IDN rules itself, but it doesn't have direct access to the DNS; it would have to call some new API to get that information.
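To make that concrete, here's a minimal sketch (in Python, using its built-in IDNA codec; the host name is just a hypothetical example) of the kind of conversion an application has to do before handing a Unicode name to the resolution APIs, without ever touching the DNS itself:

```python
# Sketch: convert a Unicode host name to its ASCII A-label form
# before passing it to low-level name-resolution APIs. This is
# roughly the step the application has to perform on its own,
# with no DNS lookup involved.
host = "bücher.example"  # hypothetical IDN host name

# Python's built-in "idna" codec applies the ToASCII conversion
# label by label, producing the xn-- form the DNS actually sees.
alabel = host.encode("idna").decode("ascii")
print(alabel)  # xn--bcher-kva.example
```

The point is that this conversion happens entirely client-side; if the rules were per-zone, the application would need some additional way to discover which zone's policy applies before it could even pick the right form of the name.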
That doesn't necessarily block your idea, but these cases seem worth thinking about.
>> What happens when the zone changes the policy? (Or adds a policy
>> when it suddenly becomes aware of the option?)
>It seems to me that things which did archiving of links (like, say,
>spiders) would have to archive the policy document under which the
>archive was accessed, too. But since the policies have a begin and
>end timestamp, that should work.
Hmmmm. That seems like a likely source of errors. Also: does it matter whether I archived the site? If the rules changed, I couldn't get to the old site anyway.