
On Search Neutrality

John Levine

In recent months there's been a robust and apparently well-funded debate about the legal status of search engine results, in particular Google's search results. On Tuesday, Tim Wu, a well-known law professor at Columbia, weighed in with an op-ed in the New York Times, arguing that it's silly to claim that computer software has free speech rights. Back in April, the equally famous UCLA professor Eugene Volokh published a paper, funded by Google, that came to the opposite conclusion: that in some cases it does. (Personally, I think it does to the extent that the results reflect the intentions of the humans who wrote the code.)

The reason this is a hot topic, of course, is that some people whose web sites don't appear as high as they'd like in search results think it's a monopolistic plot against them, and that Google should be required to present search results in a neutral way. It might be a plot, but more likely it's not, and the cure would be far worse than the problem.
The whole argument about search neutrality is based on a false assumption: that there is such a thing as a neutral search result. Any mechanical definition you can invent, e.g., the page with the most incoming links, or the page with the most incoming links from other domains, will instantly be gamed by SEO spammers and the answers will be useless. Furthermore, a good search engine does a great deal of semantic analysis to get useful results. For example, if you search for key lime pie, Google recognizes that as an idiom, looks for it as a unit, and also realizes that it matches a lot of recipes, so it adds decorations to the search page appropriate for a recipe search. It's a strong enough idiom that many searches, e.g., for "can lime pie", will be redirected to key lime pie. If you happened to name your web site "can lime pie", too bad; your name will be autocorrected.
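To see how quickly a purely mechanical "neutral" rule falls over, here is a toy sketch in Python. It is entirely illustrative — the domains, pages, and numbers are made up, and this is nobody's real ranking algorithm — but it shows how ranking by count of distinct linking domains, the kind of rule a regulator might imagine, is defeated by a spammer who registers a handful of throwaway domains.

```python
# Toy sketch (not any real engine's algorithm): rank pages purely by the
# number of distinct domains linking to them, and watch it get gamed.
from collections import defaultdict

def rank_by_inlinks(links):
    """links: iterable of (source_domain, target_page) pairs.
    Returns pages sorted by number of distinct linking domains."""
    inlinks = defaultdict(set)
    for source_domain, target_page in links:
        inlinks[target_page].add(source_domain)
    return sorted(inlinks, key=lambda page: len(inlinks[page]), reverse=True)

# A legitimately popular recipe page, linked from three unrelated sites.
organic = [("foodblog.example", "goodrecipe.example/key-lime-pie"),
           ("paper.example",    "goodrecipe.example/key-lime-pie"),
           ("forum.example",    "goodrecipe.example/key-lime-pie")]

# The spammer mints five throwaway domains that all link to the spam page.
spam = [(f"spam{i}.example", "spam.example/buy-pies") for i in range(5)]

print(rank_by_inlinks(organic + spam))
# ['spam.example/buy-pies', 'goodrecipe.example/key-lime-pie']
```

Five cheap domains and the spam page is on top, which is why any fixed, published definition of "neutral" ranking stops being useful the moment it is announced.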

How the heck can you make that "neutral" without completely destroying the utility of a search engine? You can't. The only way to imagine that you can is to completely fail to understand what search engines do.

The only place I can see any possibility of a remedy is in universal search, where Google adds results from maps or plane schedules or the like. Some decades ago, as part of an antitrust settlement, IBM agreed to document and separate out some of the functions of its mainframe operating system, OS/360. That way, if people wanted to use a competing product for a function, the product could use the defined interface and people could install it and it'd work. In practice, hardly anyone ever did, but the interfaces were there if anyone wanted them. I'd think something like that might be workable for the results other than search, with maps being the prime example. But it's not the same as making the results "neutral".
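For a rough sense of what an IBM-style documented interface for the non-search panels might look like, here is a hypothetical sketch. The VerticalProvider contract, the AcmeMaps class, and the method names are invented for illustration; they do not reflect any actual Google or third-party API.

```python
# Hypothetical sketch of a published contract for the extra panels on a
# results page (maps, flights, etc.). A competing provider could implement
# the same interface and be slotted in, much like the documented OS/360
# interfaces in the IBM settlement. All names here are made up.
from abc import ABC, abstractmethod

class VerticalProvider(ABC):
    """Published contract for a non-search panel shown beside web results."""

    @abstractmethod
    def matches(self, query: str) -> bool:
        """Does this provider want to answer the query?"""

    @abstractmethod
    def render_panel(self, query: str) -> str:
        """Return markup for the panel shown beside the web results."""

class AcmeMaps(VerticalProvider):
    """A third-party maps provider plugged in through the same contract."""

    def matches(self, query: str) -> bool:
        return "near me" in query or "map" in query

    def render_panel(self, query: str) -> str:
        return f"<div class='map-panel'>Acme map results for {query!r}</div>"

def build_results_page(query: str, providers: list[VerticalProvider]) -> list[str]:
    # The ranked web results would go here; the panels come from whichever
    # registered providers claim the query.
    return [p.render_panel(query) for p in providers if p.matches(query)]

print(build_results_page("pizza near me", [AcmeMaps()]))
```

The point of the sketch is only that the panels, unlike the ranked web results themselves, sit behind a boundary you could plausibly define and open up; it says nothing about whether the core ranking could ever be made "neutral".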

By John Levine, Author, Consultant & Speaker.

Related topics: Net Neutrality, Policy & Regulation, Web

 
   


Comments

Eric Goldman  –  Jun 21, 2012 10:03 PM PDT

John, you might find my 2006 article on this topic interesting.  It supports your position.  http://ssrn.com/abstract=893892 Eric.

