Mobile Web vs. Reality
Telecommunications companies are spending billions to prepare high-speed mobile wireless networks. But it’s not clear whether the technologies will work…or if we even need them.
John Chapman brims with enthusiasm. The director of Hewlett-Packard’s mobile and wireless strategy has just signed a three-year research agreement with NTT DoCoMo, the cellular spinoff of Japanese telecom giant NTT. The goal? To brainstorm the infrastructure for a wireless network with such abundant capacity that, according to Chapman, “we will no longer bother to measure it.” Hewlett-Packard has allied itself with NTT DoCoMo, whose name means “anywhere,” because the Japanese firm is the world’s leading mobile-Internet provider. An estimated 72 percent of Japanese cell-phone owners routinely connect to the Internet, compared with a mere six percent in the United States. Chapman believes that if Hewlett-Packard can offer Americans rich streaming video, data, graphics and voice over a high-speed network that reaches every street corner, subway platform, beachfront and backyard, they will sign up in droves.
How to build this broadband wireless network is the burning question. Telecom companies would need to spend hundreds of billions of dollars to catapult today’s narrowband cell-phone infrastructure to broadband. This is no mere “upgrade.” Today’s meager cell phones and wireless Web devices connect to the Internet at a laggard 9,600 bits per second, less than one-fifth the speed of the average desktop modem. And even a desktop modem doesn’t qualify as broadband. Its speed has to be at least quadrupled for users to enjoy instant Internet access and to view full-motion video with movielike quality.
Furthermore, the Wireless Application Protocol by which today’s mobile devices connect to the Internet typically supports only clunky, dumbed-down, black-and-white versions of a few hundred Web sites deliberately tailored to a tiny screen. Despite the constant commercials for “smart” phones and wireless wondergadgets from the likes of Sprint, AT&T, Palm and Kyocera, most people are frustrated by the embryonic “wireless Web.”
Given the huge expense to license new broadband spectrum from national governments, technical and regulatory battles over which emerging communications protocols to use, plus the need to overhaul cell towers and mobile devices, some experts wonder whether the benefits are worth the trouble. Do we really need streams of data flowing from cell towers everywhere so we can watch a CNN video clip as we step off a midtown curb, paying a steep per-minute fee for the privilege?
Maybe not. Outside of Japan, enthusiasm for this scenario seems to be waning, even among the telecom companies that would charge you for it. The cost appears so astronomical that voracious consumer demand for such amenities as digital video and music would be needed to cover it. No U.S. or European surveys indicate such demand exists, or that consumers would pay the premiums.
What is clear, however, is that consumers who are strolling or driving want reliable cell calls, paging, e-mail and fast, easy access to the entire full-color Web. None of which, in fact, requires broadband. Having taken a reality check, some telecom executives are promoting a new vision: improve the cellular system enough so consumers get uninterrupted phone calls and instant access to the Web over a trendy handheld device while they’re outdoors, and reward them with streaming, multimedia, broadband brilliance once they step indoors, at home, office or hotel, on trains or on planes. When the frenzy over broadband dies down a bit, is that what the future is really going to look like?
Nature’s Speed Limit
This new, hybrid vision remains contrarian in the wireless industry, largely because mobile broadband evangelists like Chapman believe in “build it and they will come.” Their companies are proposing a jumble of competing protocols to move us beyond today’s second-generation, or 2G, digital cell-phone networks. They plan to roll out so-called 2.5G systems that combine voice and data this year in Japan and Europe, and in 2002 in the United States. Japan claims it will soon follow with the third generation, or 3G. Yet nobody seems certain when 3G, the standard that will support true broadband applications, will actually be implemented.
Working against its rapid appearance is a fundamental law governing data communications that was laid down way back in telecom’s primordial era: 1948. That year, Claude E. Shannon of Bell Labs stated that the maximum amount of data that can be transmitted through any channel is limited by the available bandwidth (the amount of radio-frequency spectrum it occupies) and by its signal-to-noise ratio (the signal to be communicated versus interference).
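Shannon’s result is usually written as the Shannon-Hartley law, C = B · log2(1 + S/N): capacity grows only linearly with bandwidth but merely logarithmically with signal quality. A minimal sketch of the arithmetic, using illustrative numbers that are not from the article (a 200 kHz channel, roughly one GSM carrier, at two assumed signal-to-noise ratios):

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon-Hartley limit: C = B * log2(1 + S/N)."""
    snr_linear = 10 ** (snr_db / 10)   # convert decibels to a power ratio
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative figures only: a 200 kHz channel at 10 dB and 0 dB SNR.
print(shannon_capacity_bps(200_000, 10))   # ~692,000 bps in clean conditions
print(shannon_capacity_bps(200_000, 0))    # ~200,000 bps when signal equals noise
```

The point of the sketch is the squeeze the article describes: once the licensed band is fixed, the only lever left is the signal-to-noise ratio, and the logarithm makes each extra bit per second progressively harder to buy.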
Both limits are strikes against mobile-wireless communications. A wireless channel can only use the portion of the spectrum approved for it by the International Telecommunication Union and licensed by one of its 189 member states. The licensing fees are appalling: carriers spent more than $46 billion for the 3G spectrum in Germany alone. At those prices, a carrier must maximize the payback of its channels by packing as much data as possible into as narrow a frequency band as possible, a practice that runs counter to the principle of filling a broad band with data-intensive multimedia streams, which is, technologically, the optimum strategy. To resolve the conflict, carriers must devise technology that can send signals faster in tight bands.
To make matters worse, the medium through which the signals flow, the earth’s surface atmosphere, is a very noisy place these days. Cell-phone signals careen off buildings, hillsides and each other, creating interference and decay. To improve fidelity, manufacturers must boost the signal power or reduce the noise. But they can’t increase power because the Federal Communications Commission and its European and Asian counterparts restrict the electromagnetic radiation cell towers and handsets can emit. Besides, raising a handset’s power level kills its batteries.
No surprise, then, that engineers focus on reducing noise. This game began in earnest in the mid-1990s, when digital cell phones started to replace analog versions, increasing voice clarity dramatically. Even though telecom companies had to spend billions of dollars to add digital transceivers to their cell towers, the upgrade quickly paid for itself, because it also allowed the providers to cram many more simultaneous voice calls into the same slice of expensive bandwidth, with less interference.
Look Before You Leap
There are two basic schemes for packing as many digital calls as possible into the available bandwidth. The Time Division Multiple Access protocol, an early format championed by AT&T, has evolved into the Global System for Mobile Communications, now a near-universal standard in Europe and Japan. Code Division Multiple Access arose as the main alternative, adopted by Sprint and GTE, and by the end of the decade it reduced noise better than the time division method and packed more data into a single channel.
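The difference between the two schemes can be seen in a toy example of code division: each user multiplies its data by an orthogonal spreading code, both signals overlap on the air, and correlating with each code separates them again. This is a deliberately simplified sketch (4-chip Walsh codes, one bit per user); real CDMA systems add power control, long pseudorandom codes and much more.

```python
# Two users transmit simultaneously in the same band; orthogonal
# spreading codes let the receiver pull the signals apart again.
code_a = [1, 1, 1, 1]       # user A's spreading code
code_b = [1, -1, 1, -1]     # user B's code, orthogonal to A's

bit_a, bit_b = +1, -1       # one data bit per user (+1 or -1)

# On the air the spread signals simply add together, chip by chip.
combined = [bit_a * a + bit_b * b for a, b in zip(code_a, code_b)]

# Despreading: correlate the sum with each user's code and normalize.
recover_a = sum(x * c for x, c in zip(combined, code_a)) / len(code_a)
recover_b = sum(x * c for x, c in zip(combined, code_b)) / len(code_b)

print(recover_a, recover_b)   # 1.0 -1.0: each user's bit comes back cleanly
```

Time division, by contrast, keeps users apart by assigning each one a repeating time slot on the same frequency; code division lets everyone transmit at once and relies on the mathematics of orthogonality to sort the signals out.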
The leading 3G standards approved by the International Telecommunication Union are based on the code division protocol. But to implement them, telecom companies must license expensive new spectrum and overhaul cell-tower and handset technology. After initial blind enthusiasm, few U.S. carriers now seem in a hurry to make massive investments. Tom Crook, director of technology research for Sprint PCS, speaks for many when he says, “I don’t see us doing 3G anytime soon.”
Technical assessment groups like Adventis in Boston also say that, on a real street corner, 3G data rates won’t come close to the maximums industry proponents quote, which are based on pristine lab conditions. In a recent compilation of technical and investment studies examining eight proposed broadband technologies, Adventis concluded that only three could realistically achieve average data speeds faster than a desktop modem’s. And those three would roughly double the speed, far shy of the quadrupling needed for real broadband performance.
Ken Hyers, an industry analyst with Cahners In-Stat Group, says that the daunting expense and dubious technical results of 3G are causing U.S. carriers to “take a wait-and-see attitude. They’re saying, ‘Let’s see how much bandwidth our customers are really going to use.’” If all they want are simple Web services such as online restaurant directories, the answer may be, not much.
Rather than taking the true broadband leap, Sprint PCS and others have decided to test the waters using 2.5G technology, which uses the same spectrum as current 2G networks and requires only a relatively minor hardware upgrade. Although 2.5G can’t achieve true broadband data rates, it does offer one huge advance: it’s “always on,” instantly available, 24 hours a day. You won’t have to dial in and wait 30 seconds while your mobile device connects to the Internet. Instant access changes your relationship to the Internet profoundly. Studies show that people in homes with instant access through hard-wired systems such as digital subscriber lines and cable modems use the Internet three times as much as people who must dial in each time. When you’re mobile, any delay is even more discouraging and may stop you from accessing the Net altogether.
So who really needs 3G, then? Some of the canniest telecoms have begun to blur the distinction by defining their 2.5G technology as 3G. Anil Kripalani, a senior vice president at Qualcomm, says, “We know how to push the envelope.” Like other U.S. proponents, he sees no need for carriers to jump to the real 3G. Thus the world’s telecom disparity could continue, with America leaning toward 2.5G, Japan intent on 3G, and Europe and the rest of Asia vacillating in between.
Yet even if the 3G dream touted so boldly within the industry fades away, it won’t have been in vain, since it is what has motivated carriers to move to 2.5G standards. Engineers are devising intriguing cell-tower transceivers and handset antennas to help ensure that wireless users get the maximum bandwidth and strongest signals available, regardless of how many Gs they’re pulling. Yet incompatible transmission protocols still pose a problem. Each cellular device uses a microprocessor-radio chip that supports only one protocol. A phone using a code division protocol requires a different chip than a phone using a time division protocol, and different 3G phones from AT&T and GTE, say, would use different chips even if they were both based on a code division protocol.
One technology, known as software-defined radio chips, could provide a solution, according to Benny Bing, a leading wireless authority at the Georgia Institute of Technology Broadband Institute. Still in prototype, software-radio chips can switch among protocols, filtering techniques and detection schemes. At any moment, a mobile device with software radio inside could switch seamlessly among American, European and Japanese telecom standards, as well as competing transmission protocols (see “The Universal Cell Phone,” TR April 2001).
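The idea behind software radio is that the air interface becomes data rather than silicon: the device loads a signal-processing chain for whichever network it finds and swaps chains at run time. The sketch below is purely hypothetical; the class, function names and stub "demodulators" are illustrative stand-ins, not a real SDR API.

```python
from typing import Callable, Dict, Optional

Demodulator = Callable[[bytes], str]

def demod_gsm(samples: bytes) -> str:        # stand-in for a time-division chain
    return f"GSM frame ({len(samples)} samples)"

def demod_cdma(samples: bytes) -> str:       # stand-in for a code-division chain
    return f"CDMA frame ({len(samples)} samples)"

class SoftwareRadio:
    """Toy model: protocols are loadable software, not fixed chips."""
    def __init__(self) -> None:
        self.protocols: Dict[str, Demodulator] = {}
        self.active: Optional[Demodulator] = None

    def load(self, name: str, demod: Demodulator) -> None:
        self.protocols[name] = demod         # "download" a new air interface

    def switch(self, name: str) -> None:
        self.active = self.protocols[name]   # retune in software, no new chip

    def receive(self, samples: bytes) -> str:
        assert self.active is not None, "no protocol loaded"
        return self.active(samples)

radio = SoftwareRadio()
radio.load("gsm", demod_gsm)
radio.load("cdma", demod_cdma)
radio.switch("gsm")
print(radio.receive(b"\x00" * 8))   # handled by the GSM chain
radio.switch("cdma")                # roam onto a CDMA network mid-session
print(radio.receive(b"\x00" * 8))   # same hardware, different protocol
```

Real software radios do this at the level of filters, mixers and detection schemes running on reprogrammable silicon, but the structural idea is the same: one device, many interchangeable protocol stacks.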
Beating the Air Raid
The prospect of designing a winning broadband mobile architecture has attracted legions of ambitious technologists. The question remains, however, whether they can ever supply enough real communications power (remember Claude E. Shannon) for you to check out that CNN clip as you walk around downtown. Rather than trying to rev up the cell-phone network to deliver a broadband Internet, perhaps we are better off with parallel systems, one for phone (which already exists) and one for data (under construction).
“There’s really no big reason why the good old cell-phone system should survive or thrive as the wireless Internet,” says Teresa H. Meng, a groundbreaking wireless researcher at Stanford University who is now chief technology officer of wireless-chip maker Atheros Communications. Instead, Meng says, telecom companies could place wireless data transceivers on every building and utility pole. Each transceiver would cover a small area, or “nanocell,” ranging from 200 to 300 meters in diameter. Together they would create what Meng calls a “wireless fabric.” Because the transceivers would be so close to users, they could send clear, high-speed wireless signals over narrow bandwidths, at frequencies that fall into the “industrial/scientific/medical” portion of the spectrum, which regulators make available free and which is used by cordless phones, garage-door openers, medical instruments and factory machinery. And handsets could get away with low power output, conserving batteries.
In tests at its Sunnyvale, CA, labs, Atheros’s chipsets are reaching data rates hundreds of times faster than desktop modems: true broadband. Not burdened by having to carry voice, they can be far speedier than 3G schemes, which supply voice and data together. Meng also says, “The data communications industry has the upper hand. Because the cell-phone industry is heavily regulated and totally standardized, improvement has been made very incrementally, as in 3G versus 2G. Those technologies are 15 years old.” Even some cell-phone pioneers, like Martin Cooper, who developed the first portable cell phone at Motorola in the early 1970s, agree that a dual system might be more practical than 3G (see “Everyone Is Wrong”).
Blanketing our towns with nanocells may seem far-fetched, but Meng insists it would cost less than acquiring pricey 3G spectrum. Chip Elliott, principal scientist at Verizon’s BBN Technologies, concurs. He estimates that a network covering a large city would require $20 million up front in equipment, plus $5 million in annual network costs. Not bad, considering that the 3G spectrum for New York City alone was auctioned for billions of dollars, and the required system upgrade will add much more to the cost.
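A back-of-envelope check makes the comparison concrete. Using an assumed Manhattan-scale coverage area (the city size is my assumption, not a figure from the article) together with Meng’s 200-to-300-meter cell diameter and Elliott’s $20 million equipment estimate:

```python
import math

city_area_km2 = 60.0            # assumption: roughly Manhattan-sized
cell_diameter_m = 250.0         # midpoint of the quoted 200-300 m range

# Area of one circular nanocell, in square kilometers.
cell_area_km2 = math.pi * (cell_diameter_m / 2 / 1000) ** 2

# Idealized count: no overlap, no dead zones; real deployments need more.
cells_needed = math.ceil(city_area_km2 / cell_area_km2)

equipment_budget = 20_000_000   # Elliott's up-front equipment estimate
print(cells_needed)                       # on the order of 1,200 transceivers
print(equipment_budget / cells_needed)    # roughly $16,000 of gear per cell
```

Even with generous margins for overlap and installation, the per-cell figure stays in commodity-hardware territory, which is the crux of Meng’s argument against billion-dollar spectrum auctions.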
Freed from voice, data-only systems could provide a quicker, easier path to inexpensive, broadband Internet service, perhaps even that streaming CNN video on the street corner. Indeed, Metricom’s commercially available Ricochet mobile data-only service already operates twice as fast as desktop modems. Technologists are testing much faster data-only schemes, too. The consensus is that Orthogonal Frequency-Division Multiplexing, a format currently used to transmit high-definition television in Europe, could provide the best option. Rajiv Laroia, chief technology officer at Flarion, a leading commercializer of the scheme, says his company will offer equipment late next year.
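The trick in Orthogonal Frequency-Division Multiplexing is that an inverse Fourier transform spreads the data across many closely spaced but mutually orthogonal subcarriers, and a forward transform at the receiver separates them again. A deliberately tiny round trip (eight subcarriers, a hand-rolled DFT; illustrative only, nothing like a production modem):

```python
import cmath

def dft(x, inverse=False):
    """Naive discrete Fourier transform, forward or inverse."""
    n = len(x)
    sign = 1 if inverse else -1
    out = [sum(v * cmath.exp(sign * 2j * cmath.pi * k * t / n)
               for t, v in enumerate(x)) for k in range(n)]
    return [v / n for v in out] if inverse else out

symbols = [1+1j, 1-1j, -1+1j, -1-1j] * 2   # 8 data symbols, one per subcarrier

time_domain = dft(symbols, inverse=True)    # inverse DFT mixes the subcarriers
prefix = time_domain[-2:]                   # cyclic prefix: a copy of the tail,
tx = prefix + time_domain                   # guarding against multipath echoes

rx = tx[2:]                                 # receiver strips the prefix...
recovered = dft(rx)                         # ...and a forward DFT separates
                                            # the subcarriers again

print([complex(round(s.real), round(s.imag)) for s in recovered])
```

The cyclic prefix is what makes the format attractive in the echo-filled urban air the article describes: delayed copies of the signal land harmlessly in the prefix instead of smearing one symbol into the next.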
Of course, there are many hurdles to a data-only infrastructure. The industrial/scientific/medical spectrum could rapidly become overcrowded, forcing carriers to license pricey spectrum after all. Interference could degrade the quality of those multimedia Web streams. There is no agreed-upon transmission protocol, leaving data-only services open to the incompatibility that besets 3G. To avoid walking around with several different gadgets, we’d need that software-radio device to switch handily between voice and data modes. Stray beyond urban areas, furthermore, and it’s hard to imagine a nanocell on every fifth fence post.
Which brings us back to the contrarian vision of a hybrid network: a 2.5G cell-phone system providing clear voice, paging and always-on Internet access to our handheld devices outdoors; and the cable-TV and computer-network wiring already in place indoors providing the full-blast broadband experience, which the handheld can tap into.
This architecture could be built relatively quickly and inexpensively. High-speed, broadband Internet will soon be available in many indoor environments, as companies such as Cisco Systems and Juniper Networks busily string fiber-optic cable to homes and businesses. A simple wireless transceiver in the corner of a lobby or living room would feed your mobile device; you could access the high-speed networks being built into modern trains and planes the same way. This scheme also dovetails nicely with what is happening inside numerous businesses, where aging, hard-wired local-area networks are being replaced with “fixed” indoor wireless networks, which are cheaper and easier to install and readily support broadband data rates. It would be simple for your mobile device to latch on to this infrastructure.
Looking toward the end of the decade, you may end up using 2.5G wireless for convenient cell-phone calls and Web access while traipsing around town, then cut over to a fixed-wireless network when you step into the coffee shop, subway station or meeting room, perhaps using a software-radio Web phone that switches between voice and data as needed. Flip open your 2.5G CellMate while walking down Main Street to call home, then convert to data mode to download a shopping list after your spouse tells you about a sudden party you didn’t know you were hosting. When you step into Mammoth Grocery, CellMate switches over to the store’s fixed-wireless network so you can quickly check Online Wine to see which vintage will complement dinner. The store’s map appears, leading you to the wine aisle. You point CellMate at the checkout’s infrared scanner to debit your bank account. And that indoor/outdoor hybrid system, rather than the grand vision of “3G,” might be what the future really looks like for broadband wireless.