
Re[6]: [wg-c] Re: IP/TM Concerns & New GTLDs



Tuesday, August 03, 1999, 11:47:43 AM, Roeland M.J. Meyer <rmeyer@mhsc.com> wrote:

> A nascent registry would have a very difficult time competing with NSI
> on any sort of equal footing. In fact, they wouldn't be. The numbers
> that I have published here are actually an estimate of what it would
> take to build a registry capable of competing with NSI. Yes, it could
> be done cheaper, but it would not be market viable, IMHO. Those who
> criticize the cost estimates are clearly doing so without this
> understanding. The objective isn't to build a minimal, low-buck,
> registry, rather it is to build a competitive gTLD registry. Anything
> less and NSI would flatten it, make road pizza out of 'em. Any investor,
> in such a project, would require this level of competitiveness or they
> would not waste their money.

I guess this depends on your definition of "market viable."  As in
the ISP industry, success can be measured in a number of different
ways.

The problem with your numbers is that they presuppose that a highly
commercial registry with several million registrations is the only
workable model, and I don't agree with that either.

I will use the third-level domain registries as an example.  When the
largest third-level domain registry to date, ml.org, closed, it had
well over 200,000 registered domains for more than 125,000 unique
users.  After the closure, three staff members set out to develop a
new registry, one that would not be plagued by the problems that
ml.org developed.  They learned their lessons from ml.org very well,
and in a short time have grown to over 35,000 registered domains for
just over 34,000 users.  They did this on a shoestring budget, working
out deals that gave them multiple servers in geographically and
topologically diverse locations.  They developed the registry software
in a matter of weeks, not months, at no cost beyond volunteer labor.
They have a robust system with good security.  They are responsive to
their users, and address their feedback and concerns.  They run a
public whois server providing information on their registered hosts.
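As an aside, such a whois service is a very simple line-based protocol
over TCP port 43 (RFC 954): the client sends a query terminated by
CRLF and reads the response until the server closes the connection.
A minimal client sketch in Python; the server hostname below is a
placeholder, not the registry's actual whois host:

```python
import socket

def build_query(domain):
    """A whois request is just the query string terminated by CRLF."""
    return domain.encode("ascii") + b"\r\n"

def whois_query(domain, server, port=43, timeout=10):
    """Query a whois server per RFC 954: send the domain plus CRLF,
    then read the reply until the server closes the connection."""
    with socket.create_connection((server, port), timeout=timeout) as sock:
        sock.sendall(build_query(domain))
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")

if __name__ == "__main__":
    # "whois.example.net" is a placeholder; substitute a real whois host.
    print(whois_query("example.com", "whois.example.net"))
```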

They are on track to top ml.org's registrations in less than one-third
of the time it took ml.org to get to that volume, and are doing it
with a level of stability that would have made ml.org envious in its
time.

A registry that has the potential to top 100,000 registrations, as a
free or low-cost model, is a success any way you cut it.

Why do we have to assume that all new registries must be on the level
of .com?   Why SHOULD all registries even WANT to be on that level?

What is the harm in having multiple models, models that will appeal to
different segments of the market?  The DHS.org model is one that will
not appeal to highly commercialized entities, and indeed may appeal to
a segment of the market that would not otherwise be interested in
registering domains.  NSI's .COM or IODesign's .web(tm) Registry will
appeal to different segments of the market.  People who have talked
about prospective registries clearly understand that they are coming
in at a disadvantage competing with .COM, but geez, at least it's a
first step.  Their registries WILL grow, and as they grow they will
compete more and more on merit.

But we have to take the first step somewhere.  I'm not opposed to the
idea of creating two or three new gTLDs, a limited number, to start
off with.  I think it's a great idea.  But we have to make sure that
we don't hold new registries in bondage, limiting their ability to
respond to vast changes in the market and thus hurting their chances
of success, by placing unnecessary and overly restrictive limits on
their operations.  Frankly, I think the CORE model has its advantages
as one model for TLDs, but I recognize the appeal and workability of
other models as well, and have yet to see any compelling reason why we
must assert some artificial limit on the type of model new registries
use.  Let there be 100 or more new gTLD registries at some point.  So
what if some of them never exceed more than a few thousand
registrations, as long as the registry remains solvent?  Set
REASONABLE requirements for new gTLDs, such as a bond that will enable
another organization to take over management of existing registrations
for a period of time, and an agreement by registries that, should they
become insolvent or unable or unwilling to operate the gTLD, there is
a set procedure for transferring the TLD to a new registry.  But make
these requirements REASONABLE.

The ONLY problem I have with the CORE model is that its supporters
want it to be the ONLY model.  And I see no justification for that.
None at all.

I would be happy to expand on what I think reasonable requirements
are.

> To Ken Stubbs:
> If you think that you could build a gTLD registry that could compete
> with NSI, for under $2MUS then good luck to you, but not a single
> investor would agree with you and they vote with $cash$. The marketing
> costs alone would run you into bankruptcy court, in less than 3 months
> of operation.

I guess that depends on how you market.  The great thing about the
internet is that there are sundry ways to market without great
expense, and word of mouth travels fast in this network.

--
William X. Walsh
General Manager, DSo Internet Services
Email: william@dso.net  Fax:(209) 671-7934


(IDNO MEMBER)
Support the Cyberspace Association, the 
constituency of Individual Domain Name Owners 
http://www.idno.org