Scrutinize ICANN's Thousand New-TLDs Limit

By Alex Tajirian

I am skeptical about how ICANN arrived at a technical limit of a thousand new TLDs per year. The ICANN study behind this number must be made public so that our industry's risk management experts can evaluate its findings.

Why am I skeptical? First, given that the limit is supposed to remain constant over the next few years instead of rising, it would appear that ICANN has overlooked the benefits of experience. Second, the now defunct Lehman Brothers and Bear Stearns claimed to have the best processes for analyzing and managing risk. To put it mildly, they didn't.

Effective risk measures can't be based only on historical data and simulations; they must incorporate judgment based on experience and expertise. Considered alone, forecasts based on historical data can provide a false sense of security. In any case, historical data on new TLDs is scarce, especially for such an enormous first-time endeavor. The risk from black swan events (sometimes called "tail events" or "extreme events"), a term popularized by Nassim Taleb in his book of the same name, must be managed rather than predicted, because attempts to predict such events are futile.

Thus, we need more transparency in ICANN's risk management readiness to launch new TLDs.

By Alex Tajirian, CEO at DomainMart



How about a million? Phillip Hallam-Baker  –  Oct 16, 2010 10:25 AM PST

Like the IPv4 address space, a thousand a year is probably the anti-Goldilocks number: too big and too small at the same time.

If IPv4 had had a 16-bit address space, we would have fixed the address problem in 1985. If IPv4 had had a 48-bit address space, we could have learned to live within the constraints. But they chose 32 bits, which is not even enough to cover the telephone numbering space.
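A quick back-of-the-envelope check of the comparison above (assuming "telephone number" here means a 10-digit national number; international E.164 numbers can run to 15 digits, making the gap even larger):

```python
# Compare IPv4-style address-space sizes with the telephone numbering space.
addr_16 = 2 ** 16    # 65,536 addresses: exhausted almost immediately
addr_32 = 2 ** 32    # ~4.3 billion: the space IPv4 actually got
addr_48 = 2 ** 48    # ~281 trillion: roomy enough to live within
phone_10 = 10 ** 10  # 10 billion possible 10-digit telephone numbers

print(f"2^32            = {addr_32:,}")
print(f"10-digit phones = {phone_10:,}")
print(f"IPv4 space smaller than phone space: {addr_32 < phone_10}")
```

So a 32-bit space falls short of even a single country's 10-digit numbering plan by more than a factor of two, which is the point of the comment.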

A thousand domains a year is probably large enough to kill off the protective registration market. Very few brands outside the top tier are going to register in every domain as a matter of policy, so automatic registrations will be significantly fewer than the hundreds of thousands that new TLDs can currently expect.

It is of course technically possible to run a domain with a million or a billion names. The issue is not technology, the issue is funding models. One of the rather peculiar aspects of ICANN's TLD registration scheme is that they charge upwards of $50K and pay none of that over to the people who actually run the root servers.

In the long run this system is unsustainable; it can only be a transition away from the original, broken concept of hierarchical delegation. The root will replace .com, but with the task of publishing the root managed by multiple registries bidding under competitive tender rather than by one company that managed to acquire a monopoly. The cost and complexity of registering in the root will gradually fall until it is little more than the cost of management.

It will take a decade or so before this happens, but once the decision was taken to allow any new domains, the end result became inevitable.

You mean something like... David Conrad  –  Oct 16, 2010 10:16 PM PST

Only one of the documents on which Phillip Hallam-Baker  –  Oct 17, 2010 3:40 AM PST

Only one of the documents on which public comment is being sought is actually linked there.

Of course, it is rather unlikely that there will be millions of applications when the non-refundable fee is $75K. But then again, there is no need for the review scheme in that case.