The Network Is the Computer, Finally

Like many revolutions, PlanetLab is based on a startlingly simple idea that has been around for a long time, advanced most notably by Sun Microsystems: move data and computation from desktop computers and individual mainframes into the network itself.

But this can’t be done with today’s Internet, which consists of basic machines, called routers, following 1970s-era procedures for breaking e-mail attachments, Web pages, and other electronic files into individually addressed packets and forwarding them to other machines. Beyond this function, the routers are dumb and inflexible: they weren’t designed to handle the level of computing needed to, say, recognize and respond to virus attacks or bottlenecks elsewhere in the network.
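To make that division of labor concrete, here is a toy Python sketch of the 1970s-era job described above: chop a file into individually addressed packets and hand each one toward its destination. The field names, the 512-byte payload size, and the dictionary representation are invented for illustration; real routers operate on binary headers, not Python objects.

    # Toy illustration of store-and-forward packet handling.
    # Field names and the 512-byte payload size are invented for this example.

    def packetize(data: bytes, src: str, dst: str, payload_size: int = 512):
        """Split a file into individually addressed packets."""
        packets = []
        for seq, offset in enumerate(range(0, len(data), payload_size)):
            packets.append({
                "src": src,                  # sender's address
                "dst": dst,                  # receiver's address
                "seq": seq,                  # position for reassembly
                "payload": data[offset:offset + payload_size],
            })
        return packets

    def forward(packet, routing_table):
        """A 'dumb' router: look up the next hop and pass the packet along."""
        next_hop = routing_table[packet["dst"]]
        return next_hop  # no payload inspection, no reaction to congestion or attacks

    # Example: an e-mail attachment becomes a stream of independent packets.
    attachment = b"abc" * 1000
    for pkt in packetize(attachment, src="10.0.0.1", dst="10.0.0.9"):
        forward(pkt, routing_table={"10.0.0.9": "router-B"})

The point of the sketch is what is missing: the forwarding step never looks inside the traffic or adapts to conditions elsewhere in the network.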

PlanetLab’s smart nodes, on the other hand, are standard PCs capable of running custom software uploaded by users. Copies of a single program can run simultaneously on many nodes around the world. Each node is plugged directly into a traditional router, so it can exchange data with other nodes over the existing Net. (For that reason, computer scientists call PlanetLab an “overlay” network.) To manage all this, each node runs software that divides the machine’s resources, such as hard-drive space and processing power, among PlanetLab’s many users (see “Planetary Pie,” below). If the Internet is a global, electronic nervous system, then PlanetLab is finally giving it brains.
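To give a feel for what “overlay” and resource division mean in practice, here is a hypothetical Python sketch of such a node. The class names, quota numbers, and use of ordinary TCP sockets are assumptions made for illustration only; this is not PlanetLab’s actual node-manager software.

    # Hypothetical sketch of a PlanetLab-style overlay node: an ordinary PC
    # that talks to peer nodes over the existing Internet (here, plain TCP)
    # and divides its local resources among many users' experiments.
    # All names and numbers are invented for illustration.

    import socket

    class Slice:
        """One user's share of the node: resource quotas for their experiment."""
        def __init__(self, owner: str, cpu_share: float, disk_mb: int):
            self.owner = owner
            self.cpu_share = cpu_share   # fraction of processor time
            self.disk_mb = disk_mb       # hard-drive space reserved for this user

    class OverlayNode:
        def __init__(self, host: str, port: int):
            self.addr = (host, port)
            self.slices: dict[str, Slice] = {}

        def create_slice(self, owner: str, cpu_share: float, disk_mb: int):
            # The node's management software hands each user an isolated share.
            self.slices[owner] = Slice(owner, cpu_share, disk_mb)

        def send_to_peer(self, peer_host: str, peer_port: int, payload: bytes):
            # Overlay traffic rides on the ordinary Internet underneath:
            # the node simply opens a standard connection to another node.
            with socket.create_connection((peer_host, peer_port)) as conn:
                conn.sendall(payload)

    # A researcher's service gets its own slice on nodes around the world.
    node = OverlayNode("planetlab1.example.edu", 12345)
    node.create_slice("alice_experiment", cpu_share=0.05, disk_mb=500)

The same uploaded program, given its own slice on hundreds of such nodes, is what turns the collection of PCs into a single planet-wide testbed.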

The payoff should be huge. Smarter networks will foster a new generation of distributed software programs that preempt congestion, spread out critical data, and keep the Internet secure, even as they make computer communications faster and more reliable in general. By expanding the network as quickly as possible, says Peterson, the PlanetLab researchers hope to restore the sense of risk-taking and experimentation that ruled the Internet’s early days. But Peterson admits that progress won’t come easily. “How do you get an innovative service out across a thousand machines and test it out?”

It helps that the network is no longer just a research sandbox, as the original Internet was during its development; instead, it’s a place to deploy services that any programmer can use and help improve. And one of the Internet’s original architects sees this as a tremendously exciting trait. “It’s 2003, 30 years after the Internet was invented,” says Vinton Cerf, who codeveloped the Internet’s basic communications protocols as a Stanford University researcher in the early 1970s and is now senior vice president for architecture and technology at MCI. “We have millions of people out there who are interested in and capable of doing experimental development.” Which means it shouldn’t take long to replace that Buick.
