
Exploring the Roots of Wireless Spectrum Controversy (eComm Panel)

Earlier this month, I had the opportunity to attend The Emerging Communications (eComm) 2009 conference in San Francisco, which was packed with three days of fascinating conversations about the future of communications. I thoroughly enjoyed talking with the various speakers and attendees, which gave me a deeper appreciation of, and perspective on, the technical, commercial and political issues at hand, and what is likely to come in the next few years. And speaking of politics, Lee Dryburgh (CircleID), who founded eComm in early 2008, has generously allowed us to share with you a fascinating panel discussion which took place on day 3 of the conference, called “Spectrum 2.0 - What’s really happening?”

The panel included:

Richard (Rick) Whitt, Google’s Washington Telecom and Media Counsel; (also see a related interview by Lee Dryburgh)

Peter Ecclesine, Cisco’s Wireless Technology Analyst;

Richard Bennett (CircleID), Network Architect, Standards Engineer & Inventor of Internet-Oriented LAN Protocols;

Darrin M Mylet, Vice President Wireless Services, Cantor Fitzgerald;

Maura Corbett, Partner, Qorvis.

So without further ado, here is the transcript of the entire discussion, starting with the moderator, Brough Turner (CircleID), founder of Ashtonbrooke Corporation. (Photo credit: James Duncan Davidson)

eComm Panel Discussion: “Spectrum 2.0 - What’s really happening?” — From Left to Right: Maura Corbett, Richard (Rick) Whitt, Peter Ecclesine, Darrin Mylet, and Richard Bennett.
Photo © James Duncan Davidson

Brough Turner (moderator): Let me just start by saying why we have a Spectrum 2.0 panel. With very few exceptions, like perhaps visible light, all radio spectrum is regulated in basically all countries. It’s also very valuable, so you have politicians involved, business interests involved, and a lot of noise going on. Behind that, we have radio technology, which is evolving at a rate that arguably is faster than Moore’s Law. In any event, performance is increasing at an incredible rate, much faster than political processes. There is a certain amount of back-and-forth fighting going on.

The second thing that is worth noting before we kick it off is that there is an underlying set of claims about open spectrum. The most extreme view is that you don’t need any regulation whatsoever, because everything can be solved in the receiver.

The argument that I find best explains that is to think about me here in this room, in the visible light, which is a piece of the electromagnetic spectrum. I have a very good receiver, my eyes, but more particularly, the visual cortex in my brain. The result is that I can look at Lee and see that he is busy typing on his computer, and I can pick out all sorts of things there: I can see his left eye and his right eye, his nose, and his mustache. That’s incredible beam-forming resolution, far beyond anything that the Wi-Fi that’s running through this room does.

Besides that, I’m doing that without even transmitting. I’m doing that in the presence of white noise, literally the white light up there. There is a lot of stuff that potentially could happen in radio technology. On the other hand, the cognitive radio stuff that people talk about may be ten, twenty, fifty, or a hundred years in the future.

It’s an interesting space with a ton of tussle. What we’re going to do, today, is to ask each of the panelists to introduce themselves. Lee has it in the schedule as two and a half minutes, each. I asked them if they could target ninety seconds, each. I think we’ll absolutely limit it to two and a half minutes. I’m willing to go in any order. Why don’t we start with ladies first. Maura Corbett.

Maura: Hi again, I’m Maura Corbett. I’m a partner with Qorvis Communications in DC. I didn’t introduce myself before, so I’ll do it now. For going on fifteen or twenty years, I have specialized in the very difficult public policy battles, like the ones Brough just mentioned, that often surround new technologies and issues of convergence.

We do a lot of work in building coalitions with technology companies, public interest groups, academic institutions, etc., to try to translate the fascinating stuff that all of you do for the policy makers in Washington who could affect your business models and your ability to succeed in the marketplace. Like it or not, a lot of this stuff comes under some form of rules, be it the technical rules that the Office of Engineering and Technology at the FCC writes, or often stupid rules that policy makers make on the Hill because they don’t know any better.

More often than not, we find ourselves in the role of translator, which has proved especially challenging with spectrum because it is not as well understood as the traditional wireline networks, simply because those have been around a lot longer and the fights around them have gone on much longer. It’s particularly challenging and, hopefully, our best work is still ahead of us.

I also serve as Executive Director of the Wireless Innovation Alliance, which I alluded to before. We were successful in getting a final report and order out of the FCC to allow white spaces technologies to move forward. As I said before, we’re very hopeful that that is the floor and not the ceiling, that it is the beginning of the liberalization of the public spectrum in our country.

Brough: Thank you. Next up is Rick Whitt, from Google.

Rick: Hi, I am Rick Whitt, with Google. I am the Washington Telecom and Media Counsel for the company. I will be talking later this afternoon about some other ideas I have around DC policy. I thought that in about seventy-eight seconds I would give you the thumbnail on Google’s spectrum policy.

We’ve been active in Washington for the past couple of years, on a number of fronts. Our main objective is first to create new wireless-based broadband platforms, where that is possible. Second, it’s to open up the existing platforms. Third, it is to make more efficient use of the spectrum.

How have we done this? At the FCC, we have a couple of policy initiatives that we were part of. One was the 700 MHz auction, which some of you may be familiar with, the C-block and the openness conditions there, which we were successful in triggering in a very interesting auction with our friends at Verizon.

Then there are the TV white spaces. As Maura mentioned, last November the FCC adopted some pretty solid rules and certainly an excellent framework for unlicensed use of the TV white spaces, for both fixed and mobile purposes.

In addition to that, we do work in the marketplace. We don’t just rely on policy. In fact, we largely see policy as a way for the government to nudge the market in certain directions. If you had talked about openness two years ago, for example, most of the incumbent carriers would have just laughed at you, because they thought it was some kind of bizarre, wacky, West Coast idea of treating the network as a commons. Now, they’re all fighting amongst each other to say that one is more open than the other.

There is the Android initiative that we have been part of, working with our friends at T-Mobile on the G1 phone, and the Clearwire initiative, where we have some money in and they have an open platform. On both the policy and market sides, we’re just trying to push the wireless space to look more like the Internet, in essence.

The challenges ahead on white spaces: there are some implementation issues we are working on, including the database, some of the protocols and standards around it, as well as the adjacent channel power limits. We can talk more about that later on. We are also watching the C-block to make sure our friends at Verizon actually do abide by the openness parameters.

I think the next big challenge is where we get the next 500 MHz of spectrum. It’s not going to come from what’s been allocated by the FCC so far. Many of us are now looking at the US government. There are certain initiatives underway. In fact, we were just talking about this before the panel; there are some real opportunities to try to create market incentives to allow US government agencies to unlock their spectrum, essentially “use it or lose it,” and some sort of market efficiency mechanisms that are being discussed. All in all, these are fascinating times for policy in DC.

Peter: My name is Peter Ecclesine. I’ve been working in wireless at Cisco since 1996, and I’ve been in 802.11 since then. I was the editor of .11j, which did Japan, with thirty-four different sets of rules. I was the creator and composer of .11y, which was the US 3650 band, with cognitive radios and location in the beacons. And this week I’ve had the Wi-Fi Alliance file a petition for reconsideration on the TV white space rules, to have a DNS-kind of server that will push out certificates, allow radios to operate on channels for a time and change those on the fly, and also to allow client devices to operate without sensing.

There is no reason little devices like this should have to waste their energy sensing wireless microphones that are illegal. [Laughter] Even if you heard them, you couldn’t tell whether they were legal or not. I’m pushing on where we go next, and I believe there is easily 500 MHz from 4.4 to 4.9, and then we’ll do the C band from 3.7 to 4.2. There are plenty of half-gigahertz chunks here and there. The essential thing is to get that database out of the filing cabinets in Gettysburg, Pennsylvania, get it online, and have a certificated way to communicate with that database so we can share with the government or anyone else.
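To make the database-driven model Peter describes a bit more concrete, here is a minimal, purely hypothetical sketch of a client device asking an online spectrum database which channels it may use at its location, with grants that expire and must be renewed. Every name and value in it (query_available_channels, ChannelGrant, the channel numbers and power figures) is illustrative only; no real service, protocol, or FCC rule is implied.

```python
# Hypothetical sketch of a database-driven white-space radio: the device asks an
# online database (rather than sensing) which channels it may use here and now.
from dataclasses import dataclass

@dataclass
class ChannelGrant:
    channel: int          # TV channel number the device may use
    max_power_dbm: float  # permitted transmit power at this location (illustrative)
    valid_hours: int      # grant expires; the device must re-query ("good for one day")

def query_available_channels(latitude: float, longitude: float,
                             device_cert: str) -> list[ChannelGrant]:
    """Ask the (hypothetical) spectrum database which channels are free here.

    In a real system this would be an authenticated network call; the database,
    not the device, decides what is available, so client devices need no sensing.
    """
    # Placeholder answer: in practice it would depend on registered incumbents
    # (TV stations, licensed wireless microphones) near this location.
    return [ChannelGrant(channel=21, max_power_dbm=20.0, valid_hours=24),
            ChannelGrant(channel=36, max_power_dbm=16.0, valid_hours=24)]

grants = query_available_channels(37.78, -122.42, device_cert="example-cert")
for g in grants:
    print(f"Channel {g.channel}: up to {g.max_power_dbm} dBm for {g.valid_hours} h")
```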

Darrin: I’m Darrin Mylet. I recently worked on developing secondary markets for spectrum, trading, and secondary market policy. There has not been a lot of progress there, relative to the number of transactions, because most of the time people don’t want to sell or trade what they have. We have made a lot of progress, though, in developing systems that manage spectrum dynamically, and databases for managing these multidimensional assets.

The real challenge in Spectrum 2.0, in my opinion, is that you have the wireless haves and the wireless have-nots. There are the wireless systems and architectures you’re accustomed to working with already, and then you have new entrants coming into the marketplace and building systems.

I’m most bullish, actually, on the white space, because I think that proceeding, that policy, and that technology are actually going to allow for the build-out of new, competitive markets. I think it’s going to happen one city at a time, which I think is a great way to adopt new technology and to potentially go after these Goliaths that really dominate the delivery of IP services today.

I do think these new networks have to be different. They have to have a different architecture, but access to spectrum is a key component of that foundation. We will have to watch carefully as we move forward, and transparency about who has what, where, and how it is used, and getting all that information out there, is going to make this industry thrive even more and create new things we probably haven’t even dreamed about. I do believe it’s going to take some time to make this a reality.

Richard: Last but not least, I’m Richard Bennett. I’m a network engineer and standards architect, and sometime political activist. I’m one of the people who blew up Bob Metcalfe’s Ethernet in 1984 and replaced it with the systems that we call Ethernet today, which actually have an active hub in the middle of the network that can manage the bandwidth.

In 1990, I worked on the early precursor to the Wi-Fi system, which essentially took the architecture of our twisted-pair Ethernet, got rid of the telephone cable, and put an antenna in its place. That’s what we have today in the way of Wi-Fi.

In my day, RF was one option for a physical transmission, and pervasive infrared was the other. If you ever wondered why the security in Wi-Fi is so weak, it’s because for infrared networks, the security solution was to close the door. [Laughs]

I also had the misfortune to work on ultra wideband, both as a standard and as an abortive technology. Maura had the same experience in the early 2000s. So on the one hand there is my experience with Wi-Fi, which I think everyone more or less holds up as the raging success story for light licensing, and on the other hand, in UWB, we had something that is arguably an abject failure.

I think the lesson there is that open spectrum, for what it is worth, or light licensing, is in fact not the panacea that is necessarily going to get us all to the promised land of free networking while somehow still being able to sell our applications for millions of dollars. It’s another tool that is appropriate to use in certain technical circumstances and not in others.

Finally, I’m sort of known as an opponent of net neutrality, not because I don’t agree with it as an airy principle; it’s just that whenever you try to translate that airy principle into a set of enforceable rules, you end up making a lot of things that are really essential to the rational operation of networks impossible.

I think the challenge, moving forward with this lightly-licensed spectrum, is to figure out where the balance is between the degree of openness that we can permit on these networks and the reliability and performance that users expect from them. At the end of the day, what somebody wants, first, last, and always, from the network that carries their phone calls, whether it’s mobile or otherwise, is reliable dial tone, for the calls not to be dropped, and for the voice quality to be good. That is the fundamental foundation upon which this whole application infrastructure is built.

To wrap it up, Vint Cerf says the architecture of the Internet is like an hourglass. At the top, you have a lot of diversity in applications. At the bottom, you have a lot of diversity in transports, but in the middle you have a fairly constrained IP layer that really limits the amount of foolishness you can do above and below it. I think that structure repeats itself in wireless networking. We have to figure out where the minimal constraint needs to be to make the system work well.

Brough: Thank you. With that opening, I was going to ask a set of questions and eventually throw it open for you to ask some questions. We actually have thirty minutes left. Thanks to everybody for speeding through things.

I thought I would throw out one long-term question first, before we dive into media policy for next year, TV white spaces, and so forth. The long-term question is this: David Reed and others have argued from basic physics principles that interference doesn’t exist, along with a pile of other things that people argue about. There is a physical reality there, in terms of Maxwell’s equations and electromagnetic radiation. There is also the engineering reality of what is feasible today and what’s feasible in five, ten, or fifteen years.

Could I get one or more of you to take the long view and comment on where we are realistically going to be in ten or twenty years, and what you do to transition with this radio stuff, given that you have TV sets that were designed fifty years ago, and other things that have been allocated and are still stuck there? Does anybody want to start?

Rick: I’ll take a quick stab at it. David has talked about that in some of his blog postings. I like the notion that somebody else raised, which is that most of this country’s spectrum policy is based on the dumbest, cheapest TV sets you can buy, and that everything else kind of falls in line with that.

I think that, much as with the Internet, we are moving intelligence from the core to the edge; that is the transformation underway in the telecom space. Similarly, when we talk about wireless devices, and spectrum policy generally built around those devices, I think we’re moving to a world of cognitive radios and the ability to sense environments.

As you mentioned in your opening, we have a lot more knowledge now about how these technologies work than we would have had ten or twenty years ago, and we will have a lot more in the coming decade, I would think. I would imagine that, over time, we’re going to move to a similar paradigm shift, where more and more of that intelligence is at the edges where the device is, and less is necessary in the network. My hope, although I fear it may take a while, is that spectrum policy itself starts to follow the technology, rather than having policy dictated to us by technology that is fifty or sixty years old.

Peter: I think that DTV is a good example. They chose the wrong modulation, and it took them ten years to choose it. If they had to do it again, it would be OFDM. We know that ten years from now we’re going to get rid of that crappy modulation they chose in 1988 [Laughs], so there is going to be another transition and another set of rules, because they got it wrong. They got it wrong in ultra wideband. If they had given those people 10 dB more transmit power, we would all be using ultra wideband. When they get those rules wrong, they stay forever. You have to get them right the first time, and the trouble is that the engineers at the FCC are too conservative and there are too many people complaining about their crystal radios getting interfered with.

What there is, is a lack of an experimental policy. What this TV white space is, is an experimental license. Every day, you’re good for one day. Every day, you go back again and try again, and maybe they’ll let you choose some channels and maybe they won’t. Maybe they’ll say this device is gone; we’re not going to allow these sorts of microphones anymore in this band. [Laughter] There is a recall policy built into these radios. It’s the only way to go forward. It’s the only way to choose the right modulations and to make the trade-offs in the mass market, instead of casting these things in stone and putting them on our tombstones.

Richard: Ultimately, the thing that holds us back with tomorrow’s technology is yesterday’s technology. The thing we have to do is to figure out how to get obsolete systems off of the air.

There is a perfect example in the [0:18:20.0 unclear] here, in this room. I sniffed the air, and the thing that’s happening is your access point is sending protection frames because there are a lot of 11b systems, or at least one per channel.

That costs you 40% of your bandwidth. That problem exists in every kind of radio spectrum where licenses have been issued. There is obsolete equipment out there, and if we could get it off the air, we could replace it with stuff that makes much better use of the resource.

Darrin: I’m absolutely convinced that if we finally get the quantification and qualification of what’s actually being used, how it’s being used, and where it’s being used, and we start having this sort of management data, we will be able to make better policy. If we could increase utilization of the overall public- and private-sector spectrum by, say, 10%, the net worth of this room would probably go up by 1,000%. I think that should be the real goal in supporting these policies: seeing who has what and where, how it is being used, accountability, and then redistributing those assets. Spectrum can be the foundation to deliver new services.

Maura: I would just add that I think what we’re talking about here is changing the way we approach spectrum policymaking, and really any technology policymaking: for once, let the facts stand for themselves. What screwed everything up is the politics of it and the vested interests. I don’t want to sound like President Obama, but particularly in the area of technology, the engineering is the engineering. So make the decisions on that, and let the government, the FCC, the agencies, and the administration try to have the courage to let the facts stand for themselves. We might end up in a better place.

Richard: Reality-driven policy.

Maura: Yes, that’s not going to change. We’d like it to change; we can hope that it will change, but really, we need to figure out a way to navigate around it because it’s not going away.

Brough: So, that brings me to a related question. There has been talk, recently, about how we should just blow up the FCC, get rid of it. [Laughs] There is somebody here who is in favor of that [Laughs], or of reforming it in one way or another. It’s interesting that the stimulus money has gone not to the FCC, but to the NTIA. What is the path to actually change something, and is there any hope that we’re going to see that in the next four to ten years?

Peter: You have to build a little, test a little, and learn a little. That’s what makes software move so quickly in this world. Here is 700 THz, every microsecond, every millisecond. What was there before, what will be there since? It’s a new millisecond, a new microsecond, and it’s a new second. These laws go down; they’re down for ten years. They’re down for twenty years, they’re down for fifty years.

The laws are wrong in the use of this spectrum and what energy we put in the air, and what we do to recover it, and how we get the signals out of it. The only way forward is to have a very lightweight, experimental, build a little, learn a little, test a little kind of system. When you say, “How much did it cost to build this thing,” somebody probably spent $20 million or $30 million to build and test this thing, so they’re not going to do that for a toy piece of spectrum that NTIA has in the lab in Boulder. It’s not worth spending $20 million or $30 million to try and prove something. Maybe they could do it with FPGAs, like Adaptrum did. The reality is it’s the law that has to get lightweight, not spectrum and its use.

Rick: Since I earn most of my living by going to the FCC, I’d hate to see it go away. [Laughter] I think part of the FCC’s problem is institutional and part of it is personnel. We’ve had a situation, more recently unfortunately, where we’ve had leadership at the Commission who have not been particularly big fans of technology and don’t really understand it.

One of the things I really find fascinating is that we have Julius Genachowski, who is going to be joining as the Chairman. He is the first Chairman, in fact the first Commissioner I can think of in a long time, who has real business experience. Everybody else has basically held jobs taking paychecks, every two weeks, from the federal government for some other role on the Hill, or around town someplace. We’re finally getting somebody who is a real technologist, who understands the Internet. I think that alone is going to make an enormous change in how the Commission operates. I think he is going to rely much more on technical expertise to help him figure out what the right solutions are to the problems.

I also like Darrin’s suggestion, the notion that we need to have transparency in what’s happening with spectrum. There is a great article in Wired this month about the radical notion of knowing everything there is to know about all the financial transactions happening, all the time. End users could then access that information, figure out where the risk is, and make decisions accordingly. That is something that would completely transform the financial system in this country.

It’s similar with spectrum. Who has what spectrum, where they are using or not using it, how long those licenses last, when they are up for renewal: you could have a whole cottage industry born that focuses on finding ways to incent the correct uses of the spectrum, or to get incumbents out of the spectrum. You could create all kinds of mechanisms, a kind of use-it-or-lose-it policy for the folks who are using it today.

I think the agency is necessary for certain informational purposes, repositories of information for people to use. In terms of policy, it would be great if we could increasingly rely on the market. I think we have a little bit more time to go before we have true competition in some of those markets.

Richard: If Genachowski is a technologist, he’s also one with a Harvard law degree, like Kevin Martin and the other predecessors. I’m not sure how much of a break it’s going to be from the FCC’s tradition as a legally and politically bound agency.

In general, when we want to increase productivity and efficiency, we outsource. Perhaps what the FCC needs to do is more outsourcing. That has actually started with the standard that Peter wrote, 802.11y. From my understanding, and this is probably pretty naive, that was a joint partnership between the FCC and IEEE 802.11 to draft a set of rules for how to use a particular frequency to accomplish a particular thing. Historically, the FCC has dealt with regulating analog transmissions.

We live in a digital world now. The rules that, one could argue, we need to govern digital communications require an expertise that is not built into the FCC’s DNA. By deferring to an expert organization like IEEE 802, the FCC is recognizing that it doesn’t have expertise in that area and needs to compensate for it by getting the cooperation of somebody else that does.

There are probably good things and bad things about how that exercise has played out. Maybe there are other organizations that can provide that sort of function, too. Who knows; maybe what we need is a free market in standards. We’ve always had this joke that the great thing about standards is that there are so many to choose from. We don’t really know how to write regulations for digital communication because we haven’t been doing it long enough. We’ve got a few data points, but we need to develop more.

I think the larger question is not whether Larry Lessig can get attention for saying we need to blow up the FCC, because we’re going to have some sort of regulation. It’s how we put the regulations together: what sort of process builds those regulations, how we test them, how we revise them, and how that whole process works.

Peter: I would like to clear up one thing for you. The rules for the 3650 MHz band came out in 2005, in FCC 05-56; it was the end of the Powell administration. The first thing that happened is that the WiMAX Forum filed and said, “Auction it off in fifty cities.” The only trouble was that twenty-five of those cities were inside the fixed satellite exclusion zone, so you couldn’t quite follow that policy.

I went in, in August 2006, and showed them 802.16h, which was license-exempt WiMAX; showed them 802.11j, which was forty pages and done in under two years; and showed them 802.11y, which was three pages. They said, “How long will it take to finish it?” I said, “Two years.” They said, “Okay, we’ll wait.” [Laughs] In June 2007 they finally fixed the rules for the 3650 band (FCC 05-56 and FCC 07-99), but they had the faith, and had engineers who were watching the drafts and what was going on in them, to know whether they should wait a little bit to respond to those comments. It wasn’t an explicit kind of following. The same thing is true today. There are FCC engineers who are in the ballot pools of 16h and who are in the ballot pools of 22, and they’re kind of monitoring. It’s not an explicit arrangement.

Brough: Could I just ask; I’m puzzled. Is the reason that worked because there were not major vested interests trying to fight it off?

Peter: Exactly. It is 50 MHz, it was lousy spectrum, half the country was off the map, and you couldn’t do anything with it anyway. It was just a guard band to the C-band network. There was nothing at stake.

Brough: Right, so it’s kind of like the original, original 2.4 GHz that was garbage spectrum that nobody cared about.

Peter: The real point is that in Part 15, there are three rules: 15.5a, 15.5b, and 15.5c. 15.5a says that just because you’re registered, you have no rights. This microphone is registered; that gives it no rights. If you are in the TV bands database, that gives you no rights.

15.5b says you can cause no interference to any authorized service and you must accept interference from all authorized services. What that means is you have no circuits. You just have packets. It’s all best effort. You have no control over the authorized services and the energy they put in the air, as when someone who is authorized and you’re not drives up in their truck or van, like a news-gathering crew bringing a mic into this room.

15.5c says you can ignore everybody except the FCC or their designee, the FBI guy who calls on you: not the state, not the sheriff, not the mayor, not anybody else. They don’t matter; this is a federal matter.

If you start with those three rules, and those are the three rules of Part 15 spectrum, it means it’s all packets, all best effort. Anything else is a dream and you’re deluding yourself because your foundation is based on those rules.

Brough: Darrin, do you have something else?

Darrin: I’m pretty optimistic, actually, about the new FCC, based on some of the work I’ve done in other jurisdictions and other countries. I do think we need a bit more balance at the FCC, more technical and business people. I think they should be more proactive at conferences like this, finding out what the issues are, then taking that information back to the Commission, reporting on it, and making good, sound policy that is in the best interests of the public.

It’s going to be an interesting test, these next four years, but I’m optimistic it’s going to be pretty good.

Brough: That’s great. I love a little optimism. [Laughs] Let me ask one more question and then we’ll take some questions from the audience. The obvious thing on everybody’s mind today is TV white spaces. It has had the most publicity recently. What do you think are the real outcomes that we’ll see, in terms of commercial success and the business models associated with it, and the real time frames? Rick, do you want to start, since you are peddling that issue, or maybe Maura, because ladies first?

Rick: Sure, there are a number of implementation steps that have to be taken now. We do have the rules published. Now we’re going to go through the reconsideration petitions at the FCC. There is also a court challenge. We’ll go through those kinds of legal elements, but the real hard work has to be done in implementing the rules themselves, and the geolocation database is central to that. The Commission is supposed to be issuing a public notice very shortly that will lay out the ground rules and ask for proposals from those who want to be the database administrators.

Google is part of a group of high-tech companies and some database operators, like Neustar, Comsearch, and PCIA, trying to develop some of the ground rules, nomenclature, and concepts that we think would be useful for the Commission to essentially endorse, and then to allow us to move forward quickly to put the database in place.

There is another piece of this, which is the power limits. This is a real challenge. We went to the Commission, late in the day, trying to argue for more granular, variable power control limits, particularly in the adjacent channels. Right now, it’s 40 milliwatts, and the fact is that it takes about 80% to 90% of the potential white space channels off the table in most of the major metropolitan areas. When you do that, you’re really making this a non-starter in terms of commercial viability.

Fortunately, the Commission does have an open door, or a slightly ajar door, perhaps, in the order, which says people can come back and ask them to modify those parameters. We are working with the agency and we’ve talked to some experts. We’re trying to develop some ideas around what we call variable power control, which is based on where you are. It’s tied into a database that knows exactly where you are geographically, and therefore you can be much more precise about the kind of protection you’re giving to the DTV signals.

That’s going to be the single big stumbling block. Without changing that, it may be that much of the commercial viability is hampered. It may become much more of a niche technology that happens only in certain, more rural areas of the country.
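To illustrate the variable power control idea Rick describes (a device's permitted power looked up through a geolocation database rather than set by one flat adjacent-channel cap), here is a small, purely hypothetical sketch. The function, thresholds, and numbers are invented for illustration; they are not the FCC's actual rules or anyone's specific proposal.

```python
# Hypothetical sketch of "variable power control": the limit on each channel
# scales with how far the device is from the protected contour of the nearest
# co- or adjacent-channel DTV station, instead of one flat 40 mW cap everywhere.
# All thresholds below are illustrative, not real regulatory values.

FLAT_ADJACENT_CAP_MW = 40.0   # the flat adjacent-channel limit discussed above

def permitted_power_mw(distance_to_contour_km: float, adjacent: bool) -> float:
    """Return an illustrative per-channel power limit based on location."""
    if distance_to_contour_km <= 0:
        return 0.0   # inside the protected contour: no transmission allowed
    if adjacent:
        # Farther from the protected contour, more adjacent-channel power is tolerable.
        return min(100.0, FLAT_ADJACENT_CAP_MW * (1 + distance_to_contour_km / 10))
    return 100.0     # non-adjacent channel: full (illustrative) portable limit

print(permitted_power_mw(2.0, adjacent=True))    # near a station: close to the flat cap
print(permitted_power_mw(25.0, adjacent=True))   # far from the contour: a higher limit
```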

Maura: I would just add one thing. What’s important for us to do, to support that and to prevent an outcome like UWB, is to work very hard at making white spaces real. It’s very easy for something that is theoretical to be defeated and to go away.

Our challenge in white spaces was that we were on defense, because we were trying to explain something that hadn’t happened yet. People hadn’t envisioned it yet. Whereas the purported interference that the broadcasters and the sports leagues and others raised, that was about something real. Interfering with something that exists was real.

Our challenge, to support what’s being done at the Commission, is to explain what a white spaces world looks like to those who are making decisions, so that the decisions have real consequences.

Peter: The rules as written are a dead end. They got us through four years of talking, and now we get a new deck. As it stands, the only place that will get white space is Adak, Alaska. TV Channel 2 is on its 1,700-foot-high tower, rusting in the wind, and you can use all the other channels except for 3, 4, and 37.

In other places where there is not much money, you can also find white space where you can operate under those rules. But, as Brett Glass says, you can’t find the cheap equipment, because nobody is going to make personal/portable equipment that meets those RF masks. Those RF masks are for fixed devices transmitting at high power, and that means zero volume.

That means no new chips. If you want volume, you have to have the personal portables, so some masks have to be relaxed. The sensing requirements have to be relaxed. The database has to tell you the power that is allowed, channel by channel, on the open channels. Right now, it’s two-dimensional, flat-Earth geometry. Sooner or later, they’ll recognize mountains and valleys. Then we can account for terrain shadowing and actually have some real power that has some real use in the real band.

This is the stalking horse for moving to that cognitive radio, but the rules as they’re written are another abject failure, just like ultra wideband was on Valentine’s Day of 2002.

Rick: I disagree on one point, which is the framework the Commission established. I think the precedent that they created here is really important: unlicensed use, fixed and personal/portable devices, over the strong political objections of the broadcasters. They went ahead and did it. I agree with you that the parameters are not great. They need to be fixed, but at least the framework is now there. If it survives appeal and survives the reconsideration period, then it gets down to the nitty-gritty work with the engineers at the Commission. I do have hope that we can persuade them on at least some of these things, enough to make this a viable technology.

Richard: I think Rick had it right the first time. The power limits are so low that it’s not usable. Before Kevin Martin left town, he wanted to get white spaces off his desk so he could make it part of his legacy that he is the great champion of the consumer, with this Earth-shaking new precedent. They blew it. It’s a botched order.

The best thing that could happen for advocates of open spectrum would be for the courts to void that order and for the FCC to go back and start over. What we have to decide is whether we want to use those frequencies for yet another alternative for home networking, which is essentially all they’re good for at the current power levels.

On top of Wi-Fi and powerline networking and MoCA and the other stuff, we already have fragmentation in that marketplace. Or do we want to use it to provide a third pipe? If you want to use it to provide a third pipe, you need a set of rules that permits power levels high enough to pull that off. I think that’s a better use for that spectrum than what we have now.

It’s one of these symbolic, political actions that allows both sides to declare victory. The white space advocates can say, “Yeah, we won; the FCC voted 5-0 and we won the battle.” But when you look at the actual rules, it’s the broadcasters that won. Nobody is going to be able to build systems, and nobody has any incentive to build systems, to utilize that spectrum.

Darrin: Let’s hope that common sense prevails at the Commission and they get the power levels right and they get the technical rules right. Isn’t there enough spectrum inside the house already? Think about all the solutions that are out there; so much spectrum has already been allocated to unlicensed use that this spectrum should be used outdoors. And there is an opportunity out there for an alternative pipe; the Commission always talks about needing a third pipe. You would think that this thing would move forward and that the right minds will prevail.

Brough: Okay, I certainly hope so. At this point—oh, Dean has a question.

Dean: I have a couple of questions. First off, do you see any prospect of the US not having different spectrum policies from most of the rest of the planet? Because this does make it rather difficult to create mobile devices which support international bands. If you look at the moment, you have T-Mobile’s bizarre 3G allocation, and 700 MHz allocations which are completely different from everywhere else. What do you reckon the chances are of actually having some sort of international harmonization?

The other thing is, as well as the TV white space, is there any initiative around band sharing for federal or other government spectrum? Certainly, in the UK, there is an interest in band sharing with the civil aviation authority, or the coast guard. There is no point in having maritime radar allocated anywhere apart from the coast, for example.

Peter: We’ll go with the second one, first. In the United States, in the 3650 MHz band, there are three radio location repair stations. The band allocation was shared between NTIA and non-government uses. That’s why it didn’t have to be auctioned off. If it’s exclusive use and it isn’t government, it has to be auctioned off.

We’ve always had the conflict, when it came to mobile and portable, between the desire for the federal cash cow to feed the rest of the government [Laughter] and the need for people to find bands that had to be refarmed. You have to go and replace the towers and the equipment of all the people who were there, whether it was the satellite S-band at 1700 and 2200, or whatever else.

We’ve had four regions in the World Radio Congress, for that very reason. We never go across all the world, one way. We certainly don’t when there is money on the table like there is in mobile.

Darrin: There is actually a test bed going on right now within the federal agencies, in the 410 to 420 MHz band, to actually share spectrum. There are five or six companies that are going to start these test beds soon. There is additional spectrum that they’re identifying, outside of the UHF white space, where policy and testing are moving forward. This is all part of that opening up, hopefully, of getting more spectrum into the ecosystem.

Dean: In the UK, as background, they are working on administered incentive pricing, from the Treasury. This essentially involves the Treasury taxing other government departments for their implicit use of radio spectrum, which encourages them to release spectrum they don’t need.

Darrin: We’re following the UK there, because we have a similar process, called Circular A-11. We just haven’t implemented it yet, whereas the UK has. I have to give the UK a lot of credit. I think they are doing a lot of innovative things. They’re already measuring their spectrum utilization. They’re mapping their spectrum assets. We have a similar program; we just haven’t initiated it yet.

Maura: This is a question to Richard’s point about outsourcing to experts. Haven’t you guys done that more so than anyone else?

Dean: Possibly. Ofcom uses a lot of external agencies for a variety of things because it’s resource-constrained.

Maura: Without a lot of issue about it.

Dean: Doesn’t seem to have a problem.

Maura: Why are you laughing? [Laughs]

Brough: Partly because I’m always amazed at the extent to which US policy people don’t look at what happens in other parts of the world. I’m glad you are actually aware of it; it’s much to your credit.

Maura: This is happening, and quite well.

Brough: We have a very short time left; one more question from somebody? I would like to ask about refarming other blocks of spectrum. I know in a conversation with Peter there was a discussion of 3.4 to 4.2 GHz. What other pieces of spectrum might realistically become available, for secondary use or something similar, in the next four years?

Peter: In Canada, for example, the C-band satellites are at 3.7 to 4.2, so they have all those dishes pointed at the equator. That means they’re pointed south. In Canada, they use that spectrum and license east-west use of it, in the same bands in which the fixed satellite receivers are receiving. What we know is that people are going to VSATs. They’re going to smaller dishes, and essentially that 3.7 to 4.2 in the United States could be reused with little difficulty if we either (a) follow the Canadian precedent, or (b) get it into a database.

There is another big hunk from 4.4 to 4.9 that is historic AT&T and MCI microwave, used across the country. It’s all fiber now, triple-redundant fiber, and Japan gave it back; that was the Japan 4.9 recovery. There is another half a gigahertz there that could be tacked on, one way or the other. But neither of these is mobile spectrum. The frequencies are too high and the propagation is too poor. They are certainly the kinds of things you could use for going ten miles, or for carrying broadband around.

Those are the most immediate big blocks that come to mind.

Brough: To the other panelists, or to Rick, or other people advocating things: is there anybody besides Peter who is aware of this, who is going after this? Is that part of Google’s lobbying?

Rick: The challenge now, as I mentioned, is that there is nothing obvious out there. It has to be the less obvious stuff that could be made available on the commercial side, as well as the stuff the government has locked up but is not using. I think the search for efficiency has to look in both places. It comes back to Darrin’s point that we need transparency, to map what’s actually happening with all the spectrum assets, and then figure out a policy that unlocks as much of that value as we can.

Brough: We appear to have run out of time. We have nobody screaming at me about wanting an additional comment. I think, at this point — ah, Lee, we have another question, now. We’ll let Lee keep himself on schedule. I want to thank the panelists. This has been very good. [Applause]

(Conference transcript reproduced here with permission from eComm Media, Inc. Some corrections made on Mar 23, 2009.)

By Ali Farshchian, Founder & Editor


Comments

One correction Richard Bennett  –  Mar 24, 2009 7:20 PM

Thanks for posting this, Ali. I misspoke about the hourglass model: it was Steve Deering, not Vint Cerf, who pointed out the fact that IP is much less diverse than the functions above and below it, and that there's another hourglass in HTTP.

Re: One correction Ali Farshchian  –  Mar 24, 2009 8:16 PM

Richard, you're welcome and thanks for the clarification.

