
ICANN Shows Safe Decisions Aren't Always the Right Decisions

When ICANN selected Deloitte and IBM to provide technical database administration services for a key part of its new gTLD program, it quickly became clear that the choice was based not on whose proposals scored highest, but on which partner presented the least perceived business risk, at a much higher cost than necessary.

I know this because I wrote a response to the original RFP, so I am quite familiar with what the original specification required. Since the announcement of the TMCH providers, it is striking how many public questions remain unanswered despite being part of the original requirements.

I used to work for IBM and I can remember how sales benefitted from a customer's belief that no one ever got fired for choosing IBM. Well, those days are long gone. Just look at the growth of the open source market. And in a complex multi-stakeholder world, it becomes even more important to implement solutions that are consistent with the values of the community-at-large.

As a matter of fairness, note that at the recent ICANN meeting in Prague, a TMCH panel discussion included the contractors hired by ICANN. You can listen to the recording of the meeting here and view a copy of the presentation here.

A couple of observations about the material presented underscore the notion that this contract award may not have been well thought out, but was instead perceived as low-risk.

First, the system will be available for testing in mid-July without any discussion with registrars or registries about the technical impact of the solution. A key criterion in the original RFP was to ensure the TMCH could operate with as little impact as possible on registrars and registries, which have significant lead times in software development. With no involvement of registrars and registries to date, and reliance instead on an independent third-party EPP consultant, it is hard to believe that the solution will in fact have minimal impact on them.

It did not have to be so.

In criticizing the cost to registries of what ought to be a simple look-up, AusRegistry International (ARI) and Neustar submitted early comments opposing the new TMCH. Under the plan, they noted, registries are expected to operate with a copy of the TMCH data created at a "point in time", which could exclude prior-rights data added after the copy is made but before a domain application is submitted. Data currency in the TMCH therefore depends on registrants being better informed about the TMCH itself than about a specific gTLD. That raises the question of how much marketing budget the TMCH will need to reach potential registrants with the same efficacy as new gTLDs.

ARI and Neustar aptly suggested exposing the existence of prior rights through the DNS. Whenever a "string" was added to the TMCH, a name at "string".tmch.something would resolve; through a simple DNS lookup, anyone could then determine whether prior rights exist for that "string", at no cost, 24/7/365, with the full reliability of the DNS.
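To make the ARI/Neustar idea concrete, here is a minimal sketch in Python. The zone name `tmch.example` and the function name are my own illustrative inventions, not part of any actual proposal; the point is only that the mere existence of a DNS record for a label signals that a prior-rights entry exists.

```python
import socket

def prior_rights_exist(label: str, zone: str = "tmch.example") -> bool:
    """Return True if <label>.<zone> resolves in the DNS, meaning a
    prior-rights entry exists for that label under this scheme."""
    try:
        # A single DNS query: cheap, cacheable, and globally available.
        socket.getaddrinfo(f"{label}.{zone}", None)
        return True
    except socket.gaierror:
        # NXDOMAIN (or resolution failure): no entry for this label.
        return False
```

Any registrar, registry, or registrant could run this check in real time with no API keys, no fees, and no dependence on a point-in-time data copy.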

The DNS approach also allows public dissemination of specific data (approved for public viewing) as well as access to password-protected information on a per-gTLD basis if necessary, although I am not convinced that detailed information needs to be made available to either the registrar or the registry.

Finally, ARI suggested identifying an applicant through a "sunrise" code. I think they are partially right, but they make an invalid assumption about needing multiple codes. The ideal solution would give each user of the TMCH (call them a Trademark Registrant, for lack of a better name) a single unique code incorporating the kinds of security measures ARI suggested. The TMCH data itself would then determine whether the submitted string was eligible for the specific TLD being requested. Validity across every new gTLD should be an inherent data facet of the TMCH and should not require unique token-based access.
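A sketch of the single-code model just described, with entirely hypothetical data: each Trademark Registrant holds one TMCH code, and the TMCH's own records, rather than a separate per-TLD token, decide whether a given string is eligible in a given gTLD.

```python
# Hypothetical TMCH records: one code per Trademark Registrant,
# mapping each validated string to the gTLDs where it is eligible.
TMCH_RECORDS = {
    "TMCH-0001": {"examplebrand": {"shop", "web"}},
}

def eligible(code: str, string: str, gtld: str) -> bool:
    """Per-gTLD eligibility is a data facet of the registrant's single
    TMCH record, not a separate token per TLD."""
    return gtld in TMCH_RECORDS.get(code, {}).get(string, set())
```

One code, one lookup: the registrant never juggles multiple sunrise codes, and the TMCH remains the single source of truth on eligibility.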

Customer service, whether at the registrar, registry or even TMCH level, need only be able to tell the customer, "Your application has been rejected for this string in this gTLD because you do not meet criteria ABC." And the only organization with the final say on whether a specific prior right is valid is the adjudicating body through which the original claim of prior right is submitted.

It is clear that the TMCH implementation would be better served by real-time access to data, at no cost to the registrar or registry. A web page uniquely established to deliver the public-facing data for a specific "string" provides transparency in the verification process, supporting a multi-stakeholder, multi-tiered provisioning model that more closely resembles the ICANN model itself.

In selecting IBM and Deloitte, ICANN made a safe decision. But as the TMCH details become more apparent, it is clear that they have not made the best decision for the community in terms of overall architecture, efficiency, competitiveness and cost to registrars and registries. With just a small change in approach, the TMCH could be engineered to reflect the ICANN model, with a technology base consistent with current practitioners and a suite of service providers that would offer the market freedom of choice.

The implementation of the Uniform Rapid Suspension program (URS) shows similar contracting methods, which favored selecting the "known player" over the player(s) that could better deliver the service within the defined requirements.

The Prague session on URS included panelists from WIPO and NAF, the presumptive providers of the URS service. However, well before this session was scheduled, ICANN announced that the key objective of keeping URS claim costs within a USD $300-500 range could not be met. Wait a minute! Offering the service at that price was a requirement from the outset. If NAF and/or WIPO could not provide the service within the proposed budget, why were they selected in the first place? Were there other proposals that met the requirement? Who knows?

Alas, I have to watch from the sidelines, both as a new gTLD applicant and as a registrar, as inferior solutions are implemented because doing so was likely the safe thing to do. I sit in frustration reading commentary from the community that suggests alternatives I know were included in original proposals submitted to ICANN, yet for whatever reason were not even considered among the top five submissions.

Even more frustrating is that the simplicity of these other, "not name brand" proposals would allow for full implementation well in advance of the first gTLD release, even if ICANN were to change its mind today and award the contract to a more creative solution that did indeed meet the implementation criteria.

But, hey, doing it right is not necessarily playing it safe.

By Richard Schreier, CEO of Pool.com



Comment: "Well said, Richard."
By Antony Van Couvering, Jul 10, 2012, 2:47 PM PDT
