The Spectrum Crunch That Wasn’t

Tiny transmitters, spectrum sharing, and new information-coding technologies promise to keep wireless data capacity increasing for years.
November 26, 2012

Take a look around at the next ball game or concert you attend. You’ll see thousands of fans snapping photos and videos and e-mailing them to friends. Those armies of smartphone owners—and their tablet-toting brethren—are contributing to a striking increase in wireless data usage: Cisco Systems estimates that mobile data traffic will grow by a factor of 18 by 2016, and Bell Labs predicts it will increase by a factor of 25. Intuitively, there’s a problem: all these photos and videos go over the airwaves. Yet just a few sections, or bands, in the spectrum of radio frequencies are available to the wireless carriers, which paid billions of dollars for them. Vastly more frequencies are reserved for other uses, from television and radio to aviation and military applications. Data traffic is growing so rapidly that carriers have imposed usage caps and raised prices. Surely, these two basic realities—exploding data use on the one hand, limited bands of spectrum on the other—must mean we will soon run out of airwaves for our gadgets, right?

Just two years ago the chairman of the U.S. Federal Communications Commission, Julius Genachowski, suggested as much. He said the U.S. wireless industry desperately needed to get its hands on underused parts of the spectrum controlled by government agencies or TV broadcasters. Otherwise, wireless companies would find that demand for their services would outstrip their ability to provide them. “If we do nothing in the face of the looming spectrum crunch, many consumers will face higher prices as the market is forced to respond to supply and demand,” he declared. Similarly, an AT&T executive, Jim Cicconi, said that “the need for more spectrum is an industry-wide issue and problem.”

But these claims were premature. For one thing, spectrum “crunches”—mobile phone usage that overwhelms the available wireless frequencies—would occur at highly specific locations and times. Sometimes, alternative strategies can completely solve these localized problems.

Look around that stadium, for instance, and you’ll probably find milk-carton-size boxes tucked away in the rafters. These are short-range Wi-Fi receivers, operating on unlicensed portions of the radio spectrum. Your phone can send data through them instead of on the long-range cell-phone frequencies. The Wi-Fi boxes mop up all the data you send, and route it out of the stadium over a wired Internet connection. So the data sent by you and nearly everyone else in the stadium doesn’t touch the precious spectrum that the wireless carriers claim is running out. That clever trick is just one example of the new strategies and technologies that can be brought to bear.

Things Reviewed

  • Report to the President: Realizing the Full Potential of Government-Held Spectrum to Spur Economic Growth

  • President’s Council of Advisors on Science and Technology

  • July 2012

The entire spectrum system is managed inefficiently. A recent advisory report to the White House made that clear enough, and it emphasized that sharing wireless frequencies more widely—rather than parceling each band out to a limited set of users—could increase wireless capacity by a factor of thousands. For example, many sections of the airwaves that are reserved for TV stations and federal agencies go unused. That’s partly because some regions have only three local TV channels and no one needs the remaining spectrum set aside for TV broadcasts. Or a military weapon system that gobbles spectrum in San Diego uses little or none in New York. “We don’t have a spectrum crunch so much as we have a spectrum policy crunch,” says David Tennenhouse, Microsoft’s vice president of technology policy and a former MIT professor and Intel executive. “The so-called ‘spectrum crunch’ really reflects artificial spectrum scarcity.”

To document this artificial scarcity more precisely, his company has launched a project, called the Microsoft Spectrum Observatory, to measure where and when bands of radio frequencies are actually being used, starting in Washington, D.C., Seattle, and Redmond, Washington. Tennenhouse hopes it is the first step in a far broader data-gathering effort that leads to smarter spectrum regulations. Pointing to the runaway success of Wi-Fi, which covers only short ranges and works on open, unlicensed frequencies, he adds, “The challenge now is to extend those proven successes to enable wider-area broadband access using other underutilized portions of the spectrum.”

Some early efforts at frequency sharing have begun. For example, some television channels that go unused in a given geographic area, referred to as “white spaces,” can now be used by other devices. And in December, the FCC recommended that researchers and companies be allowed access to frequencies that have been reserved for radar systems.

Many more airwaves could eventually be shared with the help of cognitive radios, which sense available frequencies and shift between them in milliseconds to avoid interference with other devices. Some of the first outdoor tests are under way at the University of Colorado. Groups elsewhere, including Virginia Tech, the University of California, Berkeley, and Rutgers, are also working on the technology. However, at least for now, rigid regulations don’t allow widespread use of flexible technologies like cognitive radio.
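The core behavior of a cognitive radio can be sketched in a few lines. The following Python toy (with made-up occupancy readings; a real radio would get them from an energy or feature detector and repeat the cycle every few milliseconds) shows the sense-then-hop decision:

```python
def pick_free_channel(occupied, current):
    """Given per-channel occupancy readings (True = another transmitter
    is active there), stay put if the current channel is still idle;
    otherwise hop to the first idle channel."""
    if not occupied[current]:
        return current  # no interference risk; keep transmitting here
    for channel, busy in enumerate(occupied):
        if not busy:
            return channel  # hop to the first idle band
    return None  # every band is in use: back off and sense again later

# Example: channel 0 just became busy, so the radio hops to channel 2.
print(pick_free_channel([True, True, False, True], current=0))  # 2
```

The hard engineering problems, of course, are in the sensing itself and in doing this fast enough to avoid interfering with licensed users, not in the selection logic.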

It’s not that the entire subject of a spectrum crunch is a red herring. Radio frequencies are a limited resource, and some bands aren’t well suited to long-distance communications. Wireless carriers can’t endlessly install new base stations, those towers atop office buildings or hillsides (sometimes disguised as trees), because eventually the signals would interfere with those from other stations. But shorter-range transmitters and receivers that use dedicated cellular frequencies—called small cells—can already fill gaps in coverage. The smallest of these, called femtocells, can be as cheap as $200 and give clear service in homes and offices while keeping the load off large base stations, much like those Wi-Fi gadgets in the stadium rafters. “Small cells are the hottest thing in the wireless industry right now,” says Jeff Reed, director of the wireless research lab at Virginia Tech.

John Donovan, an AT&T executive vice president, said this fall that while the company had bought additional spectrum rights and wanted still more, the immediate crisis had passed, and that half the new demand through 2015 would be handled by small cells. Such technologies have emerged far more strongly than anticipated. “If you looked a few years ago, you’d say we’d be out of spectrum by now,” says Vanu Bose, founder of Vanu, a wireless-communications company in Cambridge, Massachusetts. Bose, along with Reed, was a technical advisor on the White House report. “There are lots of ways to satisfy the demand,” he says. “Adding spectrum [for commercial services] is certainly one of them, and so are small cells, alternative offloading technologies, and innovations we haven’t even conceived of yet.”

Eventually, new technologies might free up airwaves by making wireless data transfers happen much more quickly. For example, MIT researchers have shown it’s possible to reduce the amount of back-and-forth communication required to deal with dropped packets of data. While the technique may be a few years from being widely implemented, lab demonstrations show that it could increase capacity tenfold. That means you could download your video 10 times faster than you do now, freeing the network that much sooner for someone else to use.
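The idea behind that MIT work is network coding: instead of retransmitting each lost packet, the sender transmits combinations of packets from which a receiver can reconstruct whatever it missed. A minimal Python illustration (using a single XOR "repair" packet, far simpler than the random linear codes used in practice) shows how one dropped packet is recovered without a round trip:

```python
def xor_packets(packets):
    """XOR equal-length packets together into one coded packet."""
    out = bytearray(len(packets[0]))
    for packet in packets:
        for i, byte in enumerate(packet):
            out[i] ^= byte
    return bytes(out)

# Sender: transmit a batch of three packets plus one XOR repair packet.
batch = [b"pkt-A", b"pkt-B", b"pkt-C"]
repair = xor_packets(batch)

# Receiver: packet B was dropped, but XORing the repair packet with the
# packets that did arrive reconstructs it -- no retransmission needed.
recovered = xor_packets([batch[0], batch[2], repair])
print(recovered)  # b'pkt-B'
```

Eliminating those back-and-forth retransmission requests is what frees the airwaves sooner for the next user.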

So can new technology stave off a spectrum shortage forever? Perhaps not, but Microsoft’s Tennenhouse says that decades of research advances are waiting to be applied to the problem: “Right now, we have a 15- to 20-year backlog of new technologies and architectures … which can take us a long way into the future.”

This story was updated on January 2, 2013.
