
"Crisis" in Availability of Wireless Spectrum Is a Myth

Stanford engineers argue that outmoded protocols and business rules, not technology, are why 95 percent of spectrum goes unused.

Imagine that you were only allowed to fly on one airline, or take a single route to work, or use one designated checkout counter at the grocery store. What would happen if that airline or road or checkout counter became especially popular, or you wanted to use them during a particularly busy time?

Efficient provisioning of resources requires that consumers be able to switch to cheaper or less-crowded alternatives when the need arises. A group of Stanford engineers has applied this bit of obviousness to the one area where, to date, it has been less than obvious: wireless spectrum.

A 2005 study by the NSF found that only 5.2 percent of the wireless spectrum between 30 MHz and 3000 MHz was in use at any one time. And yet a study from the same year of the spectrum devoted to cell phone signals in New York City found almost half of that spectrum in use.

The problem is that we’re all locked into the spectrum offered by a single cell phone carrier, and our phones can’t even access most of the WiFi hotspots that are in range, much less use them to make calls.

As Yap et al. outline in a provocative new paper entitled "Delivering Capacity for the Mobile Internet by Stitching Together Networks," this leads to all sorts of inefficiencies that could be solved by a network governed by standards that allow devices to be agnostic about which portion of the wireless spectrum they are currently using:

- Increased capacity through more efficient statistical sharing. Cellular network operators tend to heavily over-provision their network in order to handle times of peak load and congestion. Most of the time, the network is lightly loaded. If instead they were able to hand off traffic to each other, or from cellular to WiFi networks, then their traffic load would be smoother, and their network more efficient. For example, what if AT&T could re-route traffic from their iPhone users to T-Mobile during an overload? Or T-Mobile could re-route their customers’ flows to a nearby WiFi hotspot?

- Exploit differences in technologies and frequency bands. Mobile technologies such as EVDO and HSPA provide wide area coverage with consistent bandwidth guarantees, while technologies like WiFi provide high bandwidth and low latency. Lower frequencies provide better coverage and penetration, while higher frequencies provide better spatial reuse. Being able to use the most appropriate technology for the application at hand would make best use of available capacity.

- Open up new sources of capacity. The ability to move between networks also opens up new sources of capacity. For example, one can use a crowd-sourced WiFi network to supplement their main network, without having to deploy an extensive WiFi network of their own. Such crowd-sourcing can be a powerful tool to cover dead spots and relieve congestion.
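The selection logic the authors describe can be sketched as a simple device-side policy: given descriptors for each reachable network, pick the least-loaded link that still meets the application's latency needs. The network names, loads, and latency figures below are invented for illustration; a real system would measure them continuously.

```python
# Toy sketch of spectrum-agnostic network selection.
# All link data below is hypothetical, hard-coded for illustration.

def pick_network(networks, max_latency_ms):
    """Return the least-loaded network whose latency meets the app's bound."""
    candidates = [n for n in networks if n["latency_ms"] <= max_latency_ms]
    if not candidates:
        return None
    return min(candidates, key=lambda n: n["load"])

# Invented example links: a congested cellular carrier, a rival carrier,
# and a nearby WiFi hotspot.
links = [
    {"name": "carrier_A_hspa", "load": 0.95, "latency_ms": 120},
    {"name": "carrier_B_evdo", "load": 0.40, "latency_ms": 150},
    {"name": "cafe_wifi",      "load": 0.20, "latency_ms": 30},
]

# A latency-sensitive app (e.g. a voice call) bounds latency at 100 ms;
# only the WiFi hotspot qualifies, so traffic would be steered there.
best = pick_network(links, max_latency_ms=100)
```

The point of the sketch is that once a device can treat every radio and band as interchangeable, capacity planning becomes an ordinary scheduling problem rather than a property of a single carrier's network.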

By “stitching together” all available wireless networks, Yap et al. propose a future in which the unused 95 percent of wireless spectrum could be put to work, vastly expanding the bandwidth available to our ever-growing array of ever more bandwidth-hungry devices, from cell phones and laptops to the impending tsunami of wireless devices that will comprise the Internet of Things.

The key to this evolution, say Yap et al., is a complete rethink of wireless protocols, part of a larger project called OpenFlow. Currently in use by universities running experimental wireless networks, OpenFlow allows handoffs across radios and networks. It might sound like a pipe dream, but the OpenFlow consortium already claims to include a number of (still secret) switch manufacturers.
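Conceptually, an OpenFlow-style handoff amounts to a central controller rewriting a flow-table entry so that a client's traffic exits through a different radio, without the application noticing. The sketch below models the flow table as a plain dictionary; the class, field names, and network labels are all invented for illustration and are not the real OpenFlow API, which matches on packet header fields via switch messages.

```python
# Minimal conceptual model of a controller-driven handoff
# (hypothetical names; not the actual OpenFlow protocol API).

class Controller:
    def __init__(self):
        # Maps a client to the network its flows currently exit through.
        self.flow_table = {}

    def install(self, client_id, network):
        """Install or rewrite the flow entry steering this client's traffic."""
        self.flow_table[client_id] = network

    def handoff(self, client_id, new_network):
        """Move an active flow to a different radio/network mid-session."""
        old = self.flow_table.get(client_id)
        self.install(client_id, new_network)
        return old, new_network

ctrl = Controller()
ctrl.install("phone42", "carrier_A_hspa")
# Cellular link congested: steer the client's flows to nearby WiFi.
moved = ctrl.handoff("phone42", "cafe_wifi")
```

Because the decision lives in the controller rather than in each handset or base station, the same mechanism can steer traffic across carriers, bands, and radio technologies, which is precisely the "stitching" the paper proposes.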

Image of cracked Droid Eris cc Robert Nelson

Follow Christopher Mims on Twitter, or contact him via email.
