
Grid Unlocked

While the Web may be a tough act to follow, grid computing advocates have been paving the way for the technology’s hoped-for commercialization by focusing on such nitty-gritty issues as standards-setting. “Remember how much we’ve gained from the fact that every computer runs the Internet Protocol,” says Foster. To achieve the same universality for grid computing, the U.S. grid community has merged with those of Europe and Asia to form the Global Grid Forum, an organization patterned after the Internet’s standards-setting body, the Internet Engineering Task Force. The forum’s goal is to make sure that Globus, Legion and any other grid protocols can interoperate seamlessly. “If every computer uses standard methods for managing authentication, authorization, describing resource capabilities and negotiating access for resources,” says Foster, “that’s a big win.”
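To make the idea concrete, here is a minimal sketch, in Python, of what such standard methods might look like once every machine describes itself and checks credentials the same way. The names and fields are our own invention for illustration; they are not the Globus protocols themselves.

    from dataclasses import dataclass

    @dataclass
    class ResourceDescription:
        """How a machine might advertise its capabilities in a shared format."""
        name: str
        cpus: int
        memory_gb: int
        trusted_authorities: set  # certificate authorities it will accept

    @dataclass
    class AccessRequest:
        """A user's request, presented along with a credential."""
        issuing_authority: str  # who vouches for the user's identity
        cpus_needed: int
        memory_gb_needed: int

    def negotiate(resource: ResourceDescription, request: AccessRequest) -> bool:
        """Grant access only if the credential is trusted (authentication and
        authorization) and the advertised capabilities cover the request."""
        trusted = request.issuing_authority in resource.trusted_authorities
        fits = (request.cpus_needed <= resource.cpus
                and request.memory_gb_needed <= resource.memory_gb)
        return trusted and fits

    # A cluster at one site evaluating a request that arrives from another site:
    cluster = ResourceDescription("example-cluster", cpus=256, memory_gb=512,
                                  trusted_authorities={"example-grid-ca"})
    print(negotiate(cluster, AccessRequest("example-grid-ca", 64, 128)))  # True

The point of the standards effort is that every grid node, whatever software it runs, would agree on formats and checks of roughly this kind, just as every Internet host agrees on IP.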

The grid pioneers are likewise building alliances with their counterparts in commercial peer-to-peer computing. In practice, however, peer-to-peer efforts appear to be most effective for problems that can easily be broken into myriad small, independent pieces, a category that does not usually include, say, the complex physics simulations and virtual-immersion applications where grid computing really shines. Nonetheless, Foster says, the potential for synergy is clear. That’s why the Globus protocols have already been integrated into such industrial-strength peer-to-peer systems as the Condor protocols developed at the University of Wisconsin-Madison and the Entropia platform from Entropia of San Diego, both of which are designed to capture the unused capacity of an organization’s networked workstations.
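To see why such problems parcel out so easily, consider the rough sketch below. It uses Python’s standard multiprocessing pool as a stand-in for a collection of idle desktop machines; the essential property is that each unit of work needs nothing from any other, which is exactly what systems like Condor and Entropia exploit. The scoring function is a made-up placeholder.

    from multiprocessing import Pool

    def score_candidate(candidate_id: int) -> float:
        """A made-up, self-contained unit of work: each candidate is scored
        without any data from the others."""
        return (candidate_id * 2654435761 % 1000) / 1000.0

    if __name__ == "__main__":
        candidates = range(10_000)
        # Each worker process stands in for an idle workstation; work units
        # can be handed out and collected in any order.
        with Pool(processes=8) as pool:
            scores = pool.map(score_candidate, candidates, chunksize=500)
        best_score, best_id = max(zip(scores, candidates))
        print("best candidate:", best_id, "score:", best_score)

A tightly coupled physics simulation, by contrast, would require the workers to exchange data constantly, which is where the fuller grid machinery comes in.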

The payoff for such efforts is that the computer industry now seems to be taking grid computing very seriously indeed, with IBM the most notable example. Last August, at the same time it won the contract to build national grids in the United Kingdom and the Netherlands, as well as TeraGrid in the United States, Big Blue announced that it would “grid-enable” many of its server systems. This initiative, which would mean that servers in many institutions and organizations could be plugged into grid networks quickly and easily, was said to be as big as or bigger than IBM’s commitment to Linux, which already stood at roughly $1 billion. (Indeed, IBM had already used Globus to link its own R&D labs in the United States, Israel, Switzerland and Japan.)

Yet IBM is hardly alone. Last November, eight other computer makers announced that they would implement the Globus Toolkit on their machines as a standard platform for grid computing: Compaq, Cray, Silicon Graphics, Sun Microsystems and Veridian in the United States, together with Fujitsu, Hitachi and NEC in Japan. Then early this year, Microsoft completed a contract with Argonne to translate the existing Globus Toolkit to Windows XP, according to Todd Needham, manager of the software giant’s University Research Programs group.

If nothing else, Microsoft’s move should hasten the day when home and office computers will be able to join the grid by the millions, just by plugging in. But perhaps just as significantly, it also symbolizes the fast-developing alliance between grid computing and “Web services,” a similar technology that has emerged independently over the past few years and has been embraced in slightly different forms by Microsoft, IBM and Sun, among others. Like grid computing, the Web services idea revolves around future software applications that are created on the fly out of programs and data that live on the Internet, not on the user’s machine. The main difference between this idea and grid computing is that Web services software tends to be much more closely tied to World Wide Web protocols, as well as to Web-based standards such as XML.
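For a flavor of what being tied to Web-based standards such as XML means in practice, here is a small, hypothetical sketch: a remote service publishes a short XML description of itself, and a client parses that description to learn what can be called and where. The element names are invented for illustration and do not follow any particular published schema.

    import xml.etree.ElementTree as ET

    # A hypothetical XML description a remote service might publish.
    SERVICE_XML = """
    <service name="climate-simulation">
      <endpoint url="http://example.org/simulate"/>
      <operation name="runScenario">
        <input name="region" type="string"/>
        <input name="years" type="int"/>
        <output name="temperatureSeries" type="floatArray"/>
      </operation>
    </service>
    """

    def describe(xml_text: str) -> None:
        """Parse the description and list what the service offers."""
        service = ET.fromstring(xml_text)
        endpoint = service.find("endpoint").attrib["url"]
        for op in service.findall("operation"):
            inputs = [i.attrib["name"] for i in op.findall("input")]
            print(f"{service.attrib['name']} at {endpoint}: "
                  f"{op.attrib['name']}({', '.join(inputs)})")

    describe(SERVICE_XML)

The Open Grid Services Architecture mentioned below is, broadly speaking, an attempt to express grid resources as services described in this general way.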

Once again, however, as Microsoft’s and IBM’s embrace of Globus suggests, the potential for synergy is obvious. In January, Foster, Kesselman, IBM’s Jeffrey Nick and Argonne’s Steven Tuecke proposed an Open Grid Services Architecture that would integrate the two approaches, and announced that this framework would be implemented as version 3.0 of the Globus Toolkit. IBM, Microsoft, Platform Computing, Entropia and Avaki announced their support for the new architecture, with other companies to follow.

And in the future? History is indeed about to repeat itself, declares grid computing advocate Smarr, except that the explosion of grid activity may very well dwarf even the Internet boom of the 1990s. In the future envisioned by Smarr, grids of every size will be interlinked. The “supernodes,” like TeraGrid, will be networked clusters of supercomputers serving users on a national or international scale. The more numerous mid-sized nodes will use software such as Entropia to harness the power of multiple desktop and laptop PCs. If the TeraGrid and other supernodes are like central electric power stations, Smarr explains, these smaller nodes will be like solar energy collectors that capture a diffuse yet enormous resource.

Still more numerous will be the millions of individual nodes: personal machines that users plug into the grid to tap its power as needed. If, say, the members of a citizens’ group were worried about a proposed development project, they could use the grid to run the same simulations used by the developers and government officials involved. That way, they could easily see the effect of the development on everything from groundwater to traffic patterns to employment. By using grid-based tele-immersion technologies, the citizens could even walk through the simulated project and get a realistic sense of what it would feel like to be there.

And thanks to the wireless revolution, “micronodes” will be everywhere. “Because of the miniaturization of components,” says Smarr, “we’ll have billions of endpoints that are sensors, actuators and embedded processors. They’ll be in everything, monitoring stress in bridges, monitoring the environment; ultimately, they’ll even be in our bodies, monitoring our hearts.”

And that, he emphasizes, is why we have to lay a solid foundation for the grid now, building in security and all the rest from the start. “We can’t do it as an afterthought,” he says. “The planet is assembling the grid infrastructure that it will live on for the rest of the 21st century.”
