
The One Reason Net Neutrality Can’t Be Implemented

Suppose for a moment that you are the victim of a wicked ISP that engages in disallowed “throttling” under a “neutral” regime for Internet access. You like to access streaming media from a particular “over the top” service provider. By coincidence, the performance of your favoured application drops at the same time your ISP launches a rival content service of its own.

You then complain to the regulator, who investigates. She finds that your ISP did indeed change their traffic management settings right at the point that the “throttling” began. A swathe of routes, including the one to your preferred “over the top” application, have been given a different packet scheduling and routing treatment.

It seems like an open-and-shut case of “throttling” resulting in a disallowed “neutrality violation”. Or is it?

Here’s why the regulator’s enforcement order will never survive the resulting court case and expert witness scrutiny.

The regulator is going to have to prove that the combination of all of the network algorithms and settings intentionally resulted in a specific performance degradation. This is important because in today’s packet networks performance is an emergent phenomenon. It is not engineered to known safety margins, and can (and does) shift continually with no intentional cause.

That means it could just be a coincidence that it changed at that moment. (Any good Bayesian will also tell you that we’re assuming a “travesty of justice” prior.)
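To make that concrete, here is a toy simulation (my own construction, not a model of any real network): a single FIFO queue with random packet arrivals. The configuration never changes between runs, yet the average delay differs every time, because performance emerges from chance interleavings rather than from any setting.

import random

def average_delay_ms(seed, n_packets=1000, service_ms=1.0, mean_gap_ms=1.1):
    # One FIFO queue: packets arrive at random intervals and are
    # served one at a time. Nothing in the "configuration" changes.
    random.seed(seed)
    clock = finish = total_sojourn = 0.0
    for _ in range(n_packets):
        clock += random.expovariate(1.0 / mean_gap_ms)  # random arrival gap
        finish = max(finish, clock) + service_ms        # FIFO service
        total_sojourn += finish - clock                 # waiting + service
    return total_sojourn / n_packets

# Identical settings, different luck on every run:
print([round(average_delay_ms(seed), 2) for seed in range(5)])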

What net neutrality advocates are implicitly saying is this: by inspecting the code and configuration (i.e. more code) of millions of interacting local processes in a network, you can tell what global performance is supposed to result. Furthermore, that a change in one of those settings deliberately produced a different and disallowed performance, and that you can show it is not mere coincidence.

In 1936, Alan Turing proved that you can’t even (in general) inspect a single computational process and tell whether it will ever stop. This is called the Halting Problem, and it is not an intuitive result. A naive observer without a background in computer science might assume it is trivially simple to inspect an arbitrary program and quickly tell whether it will ever terminate.
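The classic diagonalization argument is short enough to sketch in Python (an illustration of Turing’s proof, not production code): feed any claimed halting decider a program that asks the decider about itself, then does the opposite.

def make_troll(halts):
    # `halts` is any claimed decider: halts(f) -> True iff f() stops.
    def troll():
        if halts(troll):     # decider says "troll halts"...
            while True:      # ...so troll loops forever
                pass
        # decider says "troll loops forever", so troll halts immediately
    return troll

Whichever answer halts(troll) gives, troll does the opposite, so no such decider can exist.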

What the telco regulator implementing “neutrality” faces is a far worse case: the Performance Problem. Rather than a single process, we have many interacting ones. And instead of a simple binary yes/no about halting, we have a complex multi-dimensional space of network and application performance to reason about.

I hardly need to point out the inherently hopeless nature of this undertaking: enforcing “neutrality” rests on a monumental misunderstanding of what is required to succeed. Yet the regulatory system for broadband performance appears to have been infiltrated and overrun by naive observers who lack even an undergraduate-level understanding of distributed computing.

Good and smart people think they are engaged in a neutrality “debate”, but the subject is fundamentally and irrevocably divorced from technical reality. Basic ideas like non-determinism don’t even get a mention in the academic literature.

It’s painful to watch this regulatory ship of fools steam at full speed for the jagged rocks of practical enforcement.

It is true that the Halting Problem can be solved in limited cases. Identifying those cases is a real systems management issue in data centres, and a lot of research work has been done on it: if some process has been running for a long time, you don’t want it sitting there consuming electricity forever with no value being created.
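What operators actually deploy is a workaround, not a solution: they answer a bounded, decidable question instead. A minimal sketch (illustrative only, not any particular vendor’s tooling):

import multiprocessing

def finishes_within(job, timeout_s):
    # Decide the bounded question "does `job` finish within timeout_s?"
    # That question is decidable; true halting is not.
    worker = multiprocessing.Process(target=job)
    worker.start()
    worker.join(timeout_s)
    if worker.is_alive():
        worker.terminate()   # budget exhausted: treat as non-halting
        return False
    return True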

Likewise, the Performance Problem can be solved in limited cases. However, the regulator is not in a position to insist that enforcement actions are restricted to those narrow cases. It is unavoidably faced with the general case. And the general case is, in a very strict sense, impossible to solve.

The Halting Problem is a special case of the Performance Problem: deciding whether a network will ever deliver a given performance includes, as one instance, deciding whether an embedded program will ever halt. If you could solve the latter, then you could solve the former. You can’t solve the Halting Problem, so the Performance Problem is also unsolvable. QED.
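For the formally minded, the embedding can be sketched in a few lines of Python (a contrived illustration, not real scheduler code):

def scheduler_for(program):
    # A contrived packet scheduler: it treats every packet neutrally
    # unless `program` halts, at which point it starts dropping them.
    def schedule(packet):
        program()         # may never return...
        return "DROP"     # ...so this line is reached iff program halts
    return schedule

Deciding “will scheduler_for(p) ever drop a packet?” is exactly deciding “does p() halt?”, which Turing showed is impossible in general.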

This single reason from computer science is enough to tell us that “net neutrality” is a technical and regulatory dead end. The only option is to turn around and walk away. You can argue as much as you like about its moral merits, but mathematics has already decided it’s not happening in the real world.

So if not “neutrality”, then what else?

The only option is to focus on the end-to-end service quality. Local traffic management is an irrelevance and a complete distraction. Terms like “throttling” are technically meaningless. The lawgeneers who have written articles and books saying otherwise are unconsciously incompetent at computer science.

We computer scientists call this viable alternative “end-to-end” approach a “quality floor”. The good news is that we now have a practical means to measure it and hard science to model it.
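To give a flavour of what checking a quality floor might look like (a minimal sketch with assumed bounds and metric choices; the real measurement science is more sophisticated):

from dataclasses import dataclass

@dataclass
class QualityFloor:
    max_p99_delay_ms: float   # worst acceptable 99th-percentile delay
    max_loss_rate: float      # worst acceptable packet loss fraction

def meets_floor(delay_samples_ms, packets_lost, packets_sent, floor):
    # Judge the end-to-end outcome the user experienced, regardless of
    # what local traffic management produced it.
    delays = sorted(delay_samples_ms)
    p99 = delays[int(0.99 * (len(delays) - 1))]
    loss_rate = packets_lost / packets_sent
    return p99 <= floor.max_p99_delay_ms and loss_rate <= floor.max_loss_rate

The point of the approach is visible in the signature: only the delivered outcome is judged, and the ISP’s internal mechanisms never appear.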

Maybe we should consciously and competently try it?

By Martin Geddes, Founder, Martin Geddes Consulting Ltd

He provides consulting, training and innovation services to telcos, equipment vendors, cloud services providers and industry bodies. For the latest fresh thinking on telecommunications, sign up for the free Geddes newsletter.


Comments

It’ll be easier to prove than you claim – Todd Knarr, Sep 8, 2017 7:25 PM

It’ll be easier to prove than you claim. It’d only be that hard if the ISP were willing to degrade performance for a large swath of providers in addition to the service in question, but they won’t be, because that’d place them at a disadvantage against other ISPs. So in reality the regulator will be asking the ISP to explain to the court why little or none of the traffic across that route shows any performance degradation except traffic for this one service, even when that other traffic has the same profile as traffic from that service, and why traffic performance remains relatively stable over time except at this one point where it suddenly changes. Questions the ISP’s going to have a hard time answering without sounding like fools or liars.
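To illustrate the comparison (made-up numbers, hypothetical flows), the pattern the regulator would point at looks something like:

from statistics import mean

def relative_drop(before_mbps, after_mbps):
    # Fractional throughput loss across the config change.
    return (mean(before_mbps) - mean(after_mbps)) / mean(before_mbps)

# The one service collapses at the change while similar-profile
# control traffic on the same route carries on as before.
target_drop  = relative_drop([48, 50, 49], [21, 19, 20])  # ~0.59
control_drop = relative_drop([47, 51, 50], [49, 50, 48])  # ~0.01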

Nice try, but no banana – Martin Geddes, Sep 8, 2017 9:03 PM

Todd - that’s a beautiful theory, and many make the same intuitive assumption. However, it is not technically correct.

Even if a single application or end point is targeted, you cannot prove what the intended effect was (except in extremis, like dropping every packet). For instance, if an application is over-saturating some downstream resource, “throttling” its upstream resources will actually increase its performance.
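A toy queueing illustration of that last point (a textbook M/M/1 delay formula with numbers I have invented, not a model of any real network): pushing a bottleneck near saturation makes delay explode, so shaping a flow down can sharply improve its experience.

def mm1_mean_delay(arrival_rate, service_rate):
    # Mean time in an M/M/1 queue; it blows up as load nears capacity.
    if arrival_rate >= service_rate:
        return float("inf")   # overloaded: the queue grows without bound
    return 1.0 / (service_rate - arrival_rate)

print(mm1_mean_delay(9.9, 10.0))  # near saturation: 10.0 time units
print(mm1_mean_delay(8.0, 10.0))  # “throttled” to 80% load: 0.5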

More formally, you cannot recover the intentional semantics from the operational semantics when the system has emergent operational semantics. It is impossible to reverse your way through the “labyrinth of luck” of all the interacting random processes in such a stochastic system.

The effect of any traffic management rule is contextual to the moment in time at which it applies; at a different moment, it may have a different effect. You cannot reproduce the past conditions. So it is not possible to demonstrate the causal link that is widely assumed to exist.

>Suppose for a moment that you are… – Charles Christopher, Sep 9, 2017 4:03 PM

>Suppose for a moment that you are the victim of a wicked ISP that engages in
>disallowed “throttling” under a “neutral” regime for Internet access.

https://www.wired.com/2017/04/want-real-choice-broadband-make-three-things-happen/

The problem is not the “wicked ISP”; the problem is the lack of competition that would let you move to another service provider. This was witnessed in my home state when the legislature was successfully lobbied by USWest and Comcast. Access to decent bandwidth in my area only arrived a few years ago.

My neighborhood is very old, and the telecom central office is at a distance far greater than DSL is rated for. When I had a land line, during most springs the wetness got into the wires and I’d have no dial tone, let alone DSL access. Comcast blocked Vonage and many other activities I needed to perform as a Domain Registrar. I live in a densely populated city, not a rural area. I eventually had to obtain Comcast Business to get decent service at $150 per month.

I have spoken to CenturyLink (was USWest) techs when I see them in the area. I ask them what the status of their equipment is in the area. They confirm the wires in this area are next to useless, and that if I went back to them they WOULD NOT be able to find a working pair of wires to provide me service. They all say this is a perfect area to update, but CenturyLink is not interested in doing so.

Recently Comcast blocked my server transactions, even though my business class service allows servers. A tech came to my house, saw the problem, took it to his second level support, and everybody was scratching their heads with no clue as to why it was happening.

Let service providers be as evil as they wish IFF there is an option to switch to another provider. It’s the lack of competition and competitive forces that creates this problem.

“State governments that are serious about giving their citizens more options should start by making sure cities are allowed to build their own networks—something that 70 percent of the population supports, according to a recent poll by Pew, including about 67 percent of Republicans.”

While I generally disagree with government competing this way, I understand and support it as a need right now, as it seems to be the only way left to create some competition, and to have more than just two bad options for service.

Katrina is another great example of this at play. Cities, with the help of volunteers such as amateur radio operators, quickly set up a functional wifi network in the area. The local ISPs were enraged that an alternative was set up so quickly while their networks were out, and forced laws to be put in place REQUIRING the ad hoc network be removed once their networks were back up .... Examples of customers being fish in a barrel are all too plentiful; ISPs have intentionally created these problems to serve their own needs.

http://www.nbcnews.com/id/9591546/ns/technology_and_science-wireless/t/post-katrina-landscape-turned-wireless-lab/#.WbQOyhRlky4

“Yet even the NPS team, which was sent in by the military, had early run-ins with FEMA, which had taken over jurisdiction of the hospital parking lot where the team was working.
“We had to ask FEMA for permission to practically do anything, including use the outhouses,” Steckler said.”

https://slashdot.org/story/06/04/04/198253/new-orleans-tech-chief-vows-wifi-net-here-to-stay

“After Hurricane Katrina last year, New Orleans set up a city-wide wireless network to encourage businesses to return and assist in recovery. The New Orleans technology chief recently said that he intends to make the network permanent, in spite of state law and the disapproval of telecoms.”

Now back to my own state:

http://www.heraldextra.com/news/opinion/editorial/set-utopia-free/article_c6191cae-c6f1-547a-aafe-2faf6710a8a6.html

“When UTOPIA was first proposed, I was all for getting a fiber optic connection to every home and business in the at-that-time 17 cities. In my opinion, the original business model was sound; install fiber to each home/business and offer data, voice, and television services at the retail level.

Of course, the entrenched incumbent businesses, namely US West (it became Qwest and now CenturyLink), Comcast, and AT&T, who would face real competition, sent their lobbyists to the state legislature and after some intense lobbying, got the legislature to eviscerate the UTOPIA business plan by passing a law that prohibited community-based consortiums such as UTOPIA from offering services at the retail level.”
