Re: [ga-roots] On why the root is not open



Harald:
Your statement of what happened is basically correct. 
It was 6-10, not 5-10, and it was WG-C, not WG-B (unimportant errors). 

However, one must add this important note:

In choosing an arbitrary point between "a hundred" and "a million,"
the DNSO chose 7 - very much on the low side. We all know why:
the trademark (TM) lobby wants a restricted and highly regulated name space,
and a variety of incumbent registries don't want additional competition.

It is now clear that 7 was far too restrictive, as many of us in the WG
argued at the time. Too many good proposals didn't get accepted. 
(My proposal calling for 500 new TLDs in the first
round was supported by about 1/3 of the WG.)

Indeed, we won all the economic and technical arguments. 
We were just outmuscled politically. 

What cannot be emphasized too strongly is that ICANN does not
operate in a political and economic vacuum. If it artificially
restricts the market too much, there will be attempts to bypass
it. Thus, ICANN must not blame organizations like New.net
for "instability"; it should blame only itself.

>>> Harald Tveit Alvestrand <harald@alvestrand.no> 05/14/01 07:21AM >>>
At 11:32 12.05.2001 -0700, Simon Higgs wrote:
>Second misconception. There is no shortage of TLDs. There's an 
>artificially manufactured scarcity. Big difference. Root fracture is a 
>natural consequence of artificial scarcity. The Internet routes around 
>failure. Take away the artificial scarcity, put all the TLDs requested 
>since 1995 into the USG root, and the problem goes away.

Since the participants on this list seem intent on rehashing ALL the old 
arguments no matter what others want, I might as well put up the standard 
response to this one.

TECHNICALLY, the DNS can easily support several hundred names in the root 
zone - because we are doing that now.
TECHNICALLY, we have grave doubts that it can support a million names in 
the root zone with adequate stability for the root zone service. (The 
problem is the rate of change required to maintain stability.)

So, we the TECHNICAL people have passed a slip to the POLITICAL/OPERATIVE 
community saying "more than a hundred but less than a million, please".

As .com has proved, an open zone will have more than a million entries.
So there has to be a rule: some will get in, some will not.

The POLITICAL community, knowing full well that deleting a TLD from the 
root is probably going to be impossible, went through the WG-B process, and 
came up with the "five to ten" recommendation.

In the same process, there was no consensus for giving "those who thought 
of it first" any preferential treatment in handing out new TLDs. That was 
2000, not 1997.

Bottom line:

Some rule for who gets into the root must exist. And FCFS (first come, first served) was not accepted.
