It’s in Beta and Always Will Be
Scott Wallsten is the vice president for research and senior fellow at the Technology Policy Institute and a senior fellow at the Georgetown Center for Business and Public Policy. He served as economics director of the Federal Communications Commission’s (FCC) Broadband Task Force in 2010 that developed the National Broadband Plan after Congress required the FCC to draft a plan to “ensure every American has access to broadband capability.”
The National Broadband Plan (NBP) has received some criticism for not having met its goal of increasing broadband adoption. Is this a fair criticism? Has the NBP failed in meeting its goal?
While politicians don’t like to admit it, the government has limited ability to affect adoption as we near saturation. As a country we focus heavily on availability, but availability is not really an issue. Adoption is the real issue.
The small share of the population that could not access wired broadband could still get satellite Internet service, so the people with the strongest demand for Internet access already had it and were already counted in broadband adoption statistics. As a result, even new wired and wireless options, including newer and faster satellite broadband, cannot affect adoption as much as they would if the FCC’s definition of “unserved” really meant unserved.
The real adoption gap is income-based, and for the most part we still do not have universal service programs targeted at broadband for low-income people.
One of the major components of the NBP addresses the “spectrum crunch.” You have been a major advocate of encouraging secondary spectrum markets as a means to tackle this problem. Why do you think these markets still have not emerged to any significant degree? Do you still think it is the best model going forward?
It is a misconception that secondary markets have not emerged to any significant degree. As John Mayo and I documented in a paper a couple of years ago, thousands of voluntary secondary trades happen each year. Even though this is not the Nasdaq-style wheeling-dealing market that some imagined, it should be viewed as a major policy success: over several years the FCC steadily reduced the bureaucratic transaction costs of trades.
That said, the FCC has recently sent mixed signals regarding secondary markets and its view of spectrum scarcity. For example, by preventing Lightsquared from launching a network on its licensed spectrum, the FCC effectively removed about 15 billion MHz-pops (a measure of capacity: each megahertz of spectrum multiplied by the number of people it covers) from the roughly 36 billion MHz-pops of licensed-but-unused spectrum that could have been available for secondary trades even if Lightsquared’s network failed.
By preventing Lightsquared from launching a network, the FCC has let a large block of spectrum continue to sit empty rather than encouraging alternate models of spectrum use in licensed-but-unused blocks.
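The MHz-pops measure mentioned above is simply bandwidth times population covered. A minimal sketch of the arithmetic, using hypothetical license terms chosen to total 15 billion (the bandwidth and population below are illustrative assumptions, not Lightsquared’s actual holdings):

```python
def mhz_pops(bandwidth_mhz: float, population: float) -> float:
    """Spectrum capacity measure: megahertz of bandwidth times people covered."""
    return bandwidth_mhz * population

# Hypothetical nationwide license: 50 MHz covering 300 million people.
holding = mhz_pops(50, 300e6)   # 15 billion MHz-pops
total_unused = 36e9             # licensed-but-unused pool cited in the text
share = holding / total_unused
print(f"{holding / 1e9:.0f} billion MHz-pops = {share:.0%} of the unused pool")
# prints "15 billion MHz-pops = 42% of the unused pool"
```

The point of the metric is that a narrow band covering many people and a wide band covering few people can represent the same amount of usable capacity.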
Focusing on secondary markets and encouraging their efficient operation is crucial. Auctions, no matter how well done, only deal with the initial allocation of spectrum. As market conditions change, spectrum use must be able to change as well. The only way uses can change efficiently is if transaction costs for trades are low.
Spectrum sharing has also been cited as a potential solution to the spectrum crunch. Do you agree with this view or are there other solutions you feel have more potential?
Technological improvements help us use spectrum more efficiently, and spectrum sharing is one of those technologies. There is no technological silver bullet, though. The right way to think about a “spectrum crunch” is that rising demand for wireless services increases demand for spectrum, while technological improvements reduce the amount of spectrum needed to deliver those services. The net effect on spectrum demand depends on how those two forces balance out.
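The balance of those two forces can be sketched with back-of-the-envelope arithmetic. The growth rates below are purely illustrative assumptions, not forecasts:

```python
# Hypothetical annual rates (assumptions for illustration only):
traffic_growth = 0.40    # wireless data demand grows 40% per year
efficiency_gain = 0.25   # technology delivers 25% more bits per MHz per year

# Net growth in spectrum needed: demand pushes up, efficiency offsets it.
net_spectrum_growth = (1 + traffic_growth) / (1 + efficiency_gain) - 1
print(f"Net spectrum demand growth: {net_spectrum_growth:.1%} per year")
# prints "Net spectrum demand growth: 12.0% per year"
```

Under these made-up numbers, spectrum demand still grows, but far more slowly than traffic itself; if efficiency gains outpaced traffic growth, net spectrum demand would fall.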
One of the most difficult aspects to formulating Internet policy is that technology evolves so rapidly. How is the NBP equipped to adapt to such rapid change?
Blair Levin, the former executive director of the National Broadband Plan, frequently said that the NBP was “in beta and always will be.” His comment was exactly right.
Rapid technological change quickly renders most planning obsolete, which is why, for example, projecting technological winners and losers is a fool’s errand.
Instead, plans should focus on establishing conditions that promote innovation and investment while ensuring that policy goals that may be unrelated to economic efficiency, per se, can be met in cost-effective ways.
Feature Photo: cc/(Palagret)