
Tinkering Without Tampering: Wrestling With Convergence and Communications Policy (Transcript)

Our world finds itself at a critical juncture. Trillions of dollars and the future of human communications, including fundamental access to it, are at stake. For telecom operators and media outlets there is no smooth migratory path from where we are today to that future. There is a clear consumer shift underway that runs in the opposite direction to that of telecom and media incumbents; emergent social practice is increasingly clashing with the very structure and desires of incumbent players.

A battle is unfolding across three related planes: between industries that were previously clearly demarcated (telecom, cellular, Internet and media); between the distributed, peer-to-peer ecosystems enabled by the Internet and centralized, command-and-control ways of organizing to deliver services and content; and between opportunistic infrastructure and tolled infrastructure.

It was for these reasons that one of the six keynote speakers invited to the Spring 2009 Emerging Communications Conference (eComm) in San Francisco was Richard Whitt, Google’s Washington Telecom and Media Counsel. His keynote was entitled “Tinkering without Tampering: Wrestling with Convergence and Communications Policy.”

I’d like to remind readers that the next Emerging Communications Conference & Awards (eComm) will take place this Fall in Amsterdam. I can promise that the event will be well worth attending, and many of the larger reasons why will become public over the coming months. In the meantime, Super Early Bird registration is still open and ends on 21st July. To receive a further 20% off, use the promotion code ‘CircleID’ during registration.

* * *

Richard Whitt, Washington Telecom and Media Counsel, Google, on stage at the 2009 Emerging Communications Conference (eComm) in San Francisco — Photo © James Duncan Davidson

Transcript of “Tinkering without Tampering: Wrestling with Convergence and Communications Policy”
March 5, 2009, by Richard Whitt, Washington Telecom and Media Counsel, Google — 2009 Emerging Communications Conference (eComm) in San Francisco

I see Google can coexist with the broadcasters; that’s pretty impressive. I know it’s been a long three days. I wanted to do something a little different, a little more conceptual, pulling together some technology and economic thinking to try to guide our next set of policymakers, particularly in Washington, as they grapple with broadband related issues.

Maura Corbett and Sascha Meinrath, and Michael Calabrese, and a bunch of other people here have talked about some of the policy issues in D.C. This is my attempt to take a little bit of a step back and ask some more fundamental questions about whether and why we actually want policymakers involved in technology areas, particularly the broadband market.

First off, I want to talk a little bit about this notion of convergence that everybody has been discussing seemingly forever, except, strangely enough, in Washington. Take the notion of the “virtuous hourglass”: there are still an awful lot of people on Capitol Hill and at the FCC who don’t quite understand what that means. I do believe convergence is a misleading term. It came up on the panel earlier today: the idea that everything in the telecom space is moving down to a single service or single concept; of course, that’s not true. The point is that we are more or less moving to a single platform, IP, at least for the foreseeable future; on top we have all the different applications and devices; and on the bottom, different kinds of network platforms. Thus, the virtuous hourglass.

We are also converging on some of the key elements of Internet architecture, including modularity, smart edges, interconnection of networks and ubiquitous IP. One of the end results is that services and application software move from the platforms at the core of the network out to the edges. Again, while a lot of this is probably old hat to many of you, kind of a truism, it’s not so much the case in Washington.

One of the challenges is getting policymakers to really understand how networks function and how these realities take shape. At the same time, we’ve got emergence: convergence leading to emergence. If you go back to complexity science, the net, like many large complex systems, is a complex adaptive system, which means the whole is greater than the sum of its parts. It has all these amazing emergent properties, like feedback mechanisms.

The Internet, in the economists’ terms, is also a general platform technology, or GPT, which means it acts as a ubiquitous bearer for all kinds of growth and innovation happening on top of it. In particular, there is this concept of spillovers: GPTs generate what the economists call “positive externalities,” benefits that are not captured by the platform owners.

This really becomes one of the key issues in that old debate around network neutrality. These effects include all kinds of economic activity by innovators and entrepreneurs, and they also include all kinds of non-economic effects. Yochai Benkler has written an entire book on peer production and social production, where the user stands in the shoes of the traditional producer of services, outside the traditional producer/consumer relationship. Professor Susan Crawford has talked about what she calls the “social layer of the Internet,” which is all the human interaction in communications—diversity, freedom of expression, democracy, and all the values that don’t typically show up in neoclassical formulas.

These spillovers actually create the value we see in the Internet, and in turn, in broadband networks. The implication—again, this is not entirely new to you all, but many people in D.C. have not fully grappled with it—is that the battles are now shaping up between the networks and their users. Network neutrality can be seen as the latest of these battles, but it’s part of a long string that has played out over decades and is now reaching a culmination because of these technological and market changes.

Blair Levin, who is an analyst at Stifel Nicolaus, calls it “the value chain tug-of-war.” The networks, of course, held much of the value in the past, and now, because services are no longer inextricably bound to the network, that value is moving to the edges. Naturally, the network owners are not particularly happy about that.

Why are we involved in this stuff in D.C.? That question is put to me with some regularity. Our corporate mission statement is to “organize the world’s information and make it universally accessible and useful.” The organizing and usefulness aspects of that are more or less the responsibility of our software engineers and our various vendors, partners, product teams, etc., but the middle concept, universal accessibility, is actually the one thing that is not under our control. In fact, it’s largely outside our control, because unlike our ability to create a new algorithm or tweak a software application, we have very little say over whether and how our users can reach us; that’s the network layer that we’re talking about here.

We do believe very much in the ecosystem of the Internet. When network neutrality first came out as an issue three or four years ago, there was actually an internal debate at Google about what the position of the company would be. Management actually talked through a lot of the pros and cons, and at the end of that discussion they said, “We were a company born of the Internet. We were raised there. We found our success there. There is this concept of innovation without permission, as Vint Cerf has described it, and we really believe in it. The competition that arises from the net makes us better as a company. It makes us sharper, quicker, able to move with more agility, so we actually believe in those elements of the net and we want to see them preserved and maintained going forward.” That’s been a large part of my role at Google for the past several years.

For policymakers in D.C., and particularly for communications policy, one of the challenges is looking at the market and the technology without using the old telecom ways of thinking. The Emerging Communications Conference is a great platform for all the new thinking around communications, which frankly is just not being heard as much in D.C.

They’re still using all the same old tools, the same old concepts; they still have very much this urge to tamper in the marketplace. The first instinct is to regulate something. There is an issue that is very much an old one but still with us, and that’s the role of network infrastructure in society. What are the ways we should be looking at this as a legal construct; what are the ways we should be thinking about this as a society?

One of the answers I suggest is that we need to see the market and the government with fresh eyes, not as standalone or antagonistic entities but as linked, co-evolving agents in a larger ecosystem. There is a whole school in economics called New Institutional Economics, which talks about the institutions that make up the marketplace and how they give market agents the ability to get together and buy and sell and barter and trade, and to do all the things of commerce.

In the financial meltdown, unfortunately, we’ve seen institutions seriously break down on both the government side and the market side. I think the problem is that we sometimes look at the government as if it’s always evil, always there to cause problems. In fact, we need government in some cases to make sure that the markets run smoothly.

At the same time, we don’t want the government to be there just for any old purpose. The policymaker first and foremost, in acting as an adaptive agent in that marketplace, needs to be sensitive to its own cognitive constraints and to the dynamism and unpredictability of the marketplace itself.

The first principle should really be for the policymaker to take great caution; to tinker and not to tamper. That’s one of the formulas I’ll get back to in a second. It also means recognizing that policymakers are ill-equipped to deal with the numerous issues stemming from convergence and the value shifts in these complex markets.

Here is an example of one conceptual tool that I think is better equipped to deal with converged networks. The Communications Act of 1934, and the way the FCC is set up today, are based on the so-called silos approach: you have telephone companies and cable companies; you have TV and radio broadcasters and satellite companies, etc. In fact, that is increasingly no longer the case. It’s no longer accurate to describe the industries in that way.

Instead, we should be looking more towards the old OSI layered stack, or some variant of it. It actually mirrors the market economy we have today, and it allows policymakers to focus on the right issues at the right level. I wrote a paper about this a few years ago, and I still think it remains a viable way for the FCC to reorganize itself and its way of thinking about networks.

I want to be a little provocative and talk for a few minutes about network neutrality and broadband. I hope to kind of bring some new perspectives on it for you guys today. I think there are some misnomers about net neutrality, as it’s been called.

First, I think of it not so much as “net” neutrality; I call it “network” neutrality. The network we’re talking about is the last mile of broadband on-ramps to and from the Internet. There is this misnomer, of course, that the net is somehow completely neutral in architecture and that we should be mirroring the neutrality of the Internet.

We know that’s not true; the end-to-end principle still abides as a fundamental characteristic of the Internet, but there are many exceptions to it. We all know that there are many non-neutral structural and business activities taking place every day on the net, which is fine. The point is that the net is a robustly competitive place, and those kinds of non-neutral activities are acceptable in that context. With broadband, because it’s in relatively scarce supply, the concerns around neutrality are heightened.

It’s also about the outcome and not the path. Every time you hear someone say “net neutrality,” the next word is usually “regulation.” Again, we’ve heard that several times today. My observation is that you can have a network neutrality environment without regulation to get you there.

I would submit that we actually have net neutrality right now. We don’t have a law, we don’t have legislation in place passed by Congress and signed by the President, and we don’t have regulations adopted by the FCC that say, “Thou shalt have a net-neutral world.” But largely because of the bully pulpit of the FCC over the past five or six years, and some principles that were adopted (which I believe are unenforceable; we’ll find out from the D.C. Circuit shortly, in the Comcast vs. BitTorrent case), we still have a world where, by and large, the broadband providers have been hesitant to go forth and do non-neutral types of things.

It really is about the outcome. It’s the environment we want, not necessarily the path. There are many ways to get there. Regulation may be one of them, frankly, but so are self-regulating organizations, standards bodies, and the like.

Also, there is this notion of the openness norm. We have relied on it to this point, but it’s unclear whether market forces will allow it to stay in place. There are a number of folks who are confident that the notion of openness is now so deeply embedded within the user community, within consumers, that we can’t go back. We can’t turn back the clock. We’re not going to be in a situation where somehow those norms break down and broadband providers start doing various things on the network that we’re not happy about. That may be true, but the incentives and the ability to discriminate are very much there. I think the real challenge going forward is how you discipline market behavior where you think there are problems of concentration, in a minimally intrusive and narrowly tailored way that still allows for the flexibility and adaptability of the broadband providers.

Broadband, again, is surrounded by many misnomers, particularly now that we are at the height of the season in Washington with the broadband stimulus portions of the stimulus package. Broadband, in and of itself, is simply transportation and communication put together: it’s transporting bits, communication between people, creating interactivity, the always-on aspect of it. We value broadband for what it enables, not for what it is.

For one thing, it’s not the Internet, as I’ve mentioned. It’s the on-ramps to the net, so to regulate some aspect of broadband provider behavior is not necessarily to regulate the Internet. There could be some overlap there, but it’s not necessarily the case.

It’s also not a content delivery system. It can be, but that’s only one of many things it could do. It could provide Internet access, but that’s only one of the things it does. The social value we see in broadband seems to be around online connectivity. That seems to be the element that really sticks with us, and yet, at the same time, we talk about broadband in a way that focuses much more on the infrastructure and much less on the Internet access part, the online connectivity part.

It’s also not your vegetables or a box of widgets. The economics of broadband combine very high fixed upfront costs with its role as a general platform technology, which is a unique combination in economics; there aren’t a whole lot of other industries you can point to with similar characteristics. That actually dictates the way we think about broadband.

Here is a suggestion on what I call “three dimensions of broadband as an optimal Internet platform.” Folks have traditionally looked at one or two of these. I think it actually makes more sense to talk about all three of them together.

  1. First, of course, you need to have the infrastructure itself, and that’s the IP transmission and broadband component on the bottom there. That is kind of obvious; that’s where the national broadband policy, which will be put in place by the FCC, and the broadband stimulus plan come in: you want more and bigger pipes to more people.
  2. There is also this idea of sufficiency of net carriage. You could have a broadband platform that is uniquely tuned for the Internet, but if 90% to 95% of the capacity goes to traditional video (cable video, proprietary content, that kind of thing, which is actually the case today with some cable systems), that’s not optimal for the Internet or for connectivity, if that’s really what is driving our policy interests.
  3. The third part of it is what I call “integrity of net access,” which is what others would call the “open Internet” or “network neutrality.” I think you need to have all three of these dimensions working in some way together, in order to have the right kind of mix for Internet access.

I’m not saying you need to regulate to get there; I’m saying these are the things that I think policymakers should be thinking about as they look at the various options.

Real quickly, here is one way of translating some of these economic and technology considerations into more concrete policy goals and objectives. One suggested policy goal, borrowing from Susan Crawford and her idea of “cognitive diversity,” is simply to have more good ideas. That’s what we actually want to come out of all of this work with the broadband networks and with the Internet: we want to generate more good ideas, and the end user is ultimately the one who decides what those ideas are. They fuel innovation and economic growth, and they are also just things we talk about; there doesn’t have to be any kind of economic benefit to them whatsoever. The market provides the fodder for these ideas. The policy objective I suggest as the mechanism to get us there is to look at broadband as an optimal Internet platform, with the three dimensions I described.

What holds back the broadband companies from providing optimal Internet access over their networks? What are the things that they look to that might create less of an incentive rather than more of an incentive to get to that place? Here are four examples I can think of:

  1. One is ruinous competition. Rob Atkinson has written about this in D.C. in some detail; the idea that we may want to have three, four, five, or ten “pipes” coming to the home but in fact the economics may not support it. Because you’re talking about these really expensive, high fixed upfront cost networks, if you start dividing a limited pie based on the number of facilities there, you can get to a place where competition actually becomes harmful. Again, there have been some studies on this. People don’t know exactly what the right numbers are; could that be two networks, three networks, four networks? Somewhere along the way there, you get to a place where ruinous competition can set in. That should be something policymakers are aware of. This notion that we’re all waiting for competition to arrive—it may never arrive, just because of the sheer economics of it.
  2. Positive externalities, as I mentioned before, this wedge between the public benefit we get from the net and the private costs of building broadband networks.
  3. The incentives to prioritize traffic, in terms of the idea of two-sided markets, the desire of broadband providers to get additional revenue to support their networks.
  4. Finally, existing mindsets—there remains today, at least in the policy debates in Washington, a divide between the “bell heads” and the “net heads”; that is still the case, even if the bell heads are cable companies. The value chain tug of war is very much alive.

I wanted to show this last slide here and throw out some thoughts about the legal conundrum we’re in. The FCC has done, in my view, a poor job of figuring out what the right regulatory regime is in this situation. I suggest we go back and look at the common law of common carriage.

Common carriage, which is the basis of the Communications Act, goes back hundreds and hundreds of years, to Britain and, even before that, to the Roman Empire. There were three reasons why government got involved in the first place in imposing any kind of oversight over common carriage.

  1. The first was market concentration. That is kind of the obvious one. It is the one most people point to today. The evidence, of course, is mixed. I would like to think there is room for more competition, particularly from spectrum-based offerings, but I think we’re still not entirely sure if that is going to happen.
  2. The other two strands are much more interesting. Public callings was the idea that public infrastructure is uniquely important to society: whether it’s roads, bridges, railroads, or anything else that transports things, or in this case communications, it is of unique value and of unique interest to government policymakers because it enables so much on top of it. There is also the idea that you are using public resources. When you’re talking about wireless networks, of course, it’s spectrum; when you’re talking about the wireline side, it’s rights of way, access to conduit, etc. Together, that creates a public interest in broadband, or in broadband infrastructure. Again, this is not to say what that interest should look like in terms of a regulatory outcome, but just to say we have a policy view that says, “We need to look at this; it’s important to us.”
  3. The third is the idea of bailment, which is voluntarily holding yourself out as providing something. You had a duty of care if you called yourself an innkeeper; once you assumed that role, the duty of care was assigned to you. One could say the same thing in the case of a broadband provider: once you voluntarily agree to provide Internet access to your customers, you have to provide it, perhaps in a certain way and in a certain manner.

Finally, competition law is necessary but not sufficient. It doesn’t account for many of these positive spillovers, or for this notion that infrastructure has unique value to policymakers.

By Lee S Dryburgh, Founder of eComm Media, Inc
