Layers of Paint

Visit any major company or university computer science lab around the world and you will likely find some kind of new interface work under way. Xerox PARC, the University of California, Berkeley, and Yale University, among others, continue to explore new on-screen metaphors. So does IBM, another powerhouse that, like Microsoft, is contemplating the demise of the desktop and sees “attentive” computing as an inevitable development. Its BlueEyes project uses a combination of sensors, video cameras and microphones to interpret your facial expressions, where you’re looking and even what you’re saying. That way, rather than clicking through your desktop Web browser, you can access Internet information through very subtle human-computer interactions. You could verbally ask your Web browser to go to CNN Online. While you’re there, the browser might observe where you look on the page and offer pages with related content for viewing; in theory, you could get what you want from your computer without ever having to stop at the desktop.
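
To make the idea concrete, here is a minimal sketch of the kind of attentive loop BlueEyes gestures at: fuse a spoken command with a gaze estimate and decide what to show next. Every class, function and threshold below is a hypothetical illustration; IBM has not published a BlueEyes API, and this is not its implementation.

```python
# Hypothetical sketch of an "attentive" browser loop: speech and gaze
# replace clicking. None of these names come from IBM's BlueEyes project.

from dataclasses import dataclass

@dataclass
class GazeSample:
    x: int          # screen coordinates the user is looking at
    y: int
    dwell_ms: int   # how long the gaze has rested there

def element_at(page, x, y):
    """Return the page element under the gaze point, if any (hypothetical helper)."""
    for element in page["elements"]:
        ex, ey, w, h = element["bounds"]
        if ex <= x <= ex + w and ey <= y <= ey + h:
            return element
    return None

def attentive_loop(speech_command, gaze, page):
    """Fuse a spoken command with gaze data to decide the browser's next action."""
    # 1. Speech stands in for typing a URL or clicking a bookmark.
    if speech_command.startswith("go to "):
        return "navigate:" + speech_command[len("go to "):]
    # 2. Sustained gaze on an element signals interest; offer related pages.
    target = element_at(page, gaze.x, gaze.y)
    if target is not None and gaze.dwell_ms > 1500:  # 1.5 s dwell = assumed interest
        return "suggest-related:" + target["topic"]
    # 3. No clear intent: do nothing rather than interrupt the user.
    return "idle"

# Example: the user says "go to CNN Online", then lingers on a headline.
page = {"elements": [{"bounds": (0, 0, 400, 60), "topic": "world news"}]}
print(attentive_loop("go to CNN Online", GazeSample(10, 10, 200), page))  # navigate:CNN Online
print(attentive_loop("", GazeSample(120, 30, 2000), page))                # suggest-related:world news
```

The design point this sketch makes is the one the article describes: intent is inferred from attention rather than declared with a click, which is also why Morris, below, ties its prospects to what users consider acceptable surveillance.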

“I have no doubt that in ten years our computers will be attentive in some appropriate way,” says Robert Morris, director of the IBM Almaden Research Center. “As we learn more about human behavior, and what is considered okay and not okay from a privacy perspective, we will learn, and we’ll eventually end up with a great interface.”

But at least for the foreseeable future, interface designers see such work as an alternative to the desktop metaphor rather than a replacement for it, a view shared by even the desktop’s most caustic critics. “Software today grows in layers, where we put the new over the old, like slapping a new coat of paint on,” says Gelernter. “When people instituted browsers they didn’t throw out Windows, and when they instituted Windows they didn’t throw out DOS.” By layering these alternatives over Windows, designers can drastically reduce the learning curve and hasten acceptance of their innovations.

However, some researchers in the field of human-computer interaction think it’s time to throw out thinking about “metaphor” altogether (after all, it hasn’t gotten us far since the 1970s) and to begin designing devices that have no metaphor, no real-world analogy. It’s not the desktop metaphor that’s holding us back, they say; it’s the whole notion that we need to make computers act like something other than what they are.

“I’ve spent too much time with metaphors,” grumbles Don Norman. “The main problem with the metaphor is that it’s just a stand-in for something else. It’s not the thing you’re using. It may help a beginner user for the first 15 minutes, but after that it gets in the way. When I drive I don’t need metaphors. I turn the steering wheel left, and I go left.”

It’s not just the desktop metaphor that needs fixing, in other words, but the whole PC package, the way we relate to our computing devices. The desktop metaphor is so tightly wedded in our minds to keyboard, mouse and monitor that unless the outside package changes, the on-screen presentation doesn’t have much of a chance to evolve either. Break out of that design, though, and all sorts of things become possible.

Alias/Wavefront’s Bill Buxton predicts a world where the personal computer stops trying to be a general-purpose device, like a Swiss Army knife, and goes back instead to what it is good at: making text documents and spreadsheets. The problem isn’t the desktop metaphor at all; it’s that we’re trying to use our personal computers for tasks they weren’t meant to perform. Peel those tasks away to specialized devices (music to MP3 players, films to movie players, news and information to specialized readers) and you’ve solved the desktop metaphor problem. Each device will evolve its own best interface, depending on its specialized use. Buxton’s favorite evidence of this process is the Palm Pilot.

“The heart of Silicon Valley was littered with the corpses of companies trying pen-based computing,” he says. “You had Eo, Go, Momenta, Grid; then along came Palm. It did nothing that couldn’t have been done before but did it right. So even though we’ll have many apparent failures of new design concepts, there will be companies that go back and get it right.”

Throwing out the desktop metaphor, however, might be even tougher than replacing it with new metaphors, and not everyone agrees that the PC is on its way out. “That kind of thinking is wrong,” says Gelernter. “The PC isn’t a Swiss Army knife. It’s like a hammer. People don’t want a million different tools. They want a single hammer that can do a million things, because it’s a tremendously flexible, elegant and powerful tool.”

But even if the desktop metaphor never goes away completely, it will likely recede, buried perhaps beneath Robertson’s Data Mountain or Gelernter’s Scopeware. Or maybe, as Buxton predicts, it will drift back into performing only the tasks for which it was created and for which it is uniquely suited. What flowering of alternatives will replace it is still a matter of some conjecture. But if the tenacity of researchers in the field is any indication, big things are bound to happen eventually. As cyberpunk author William Gibson has said, the future is already here. It’s merely a question of figuring out which future it’s going to be.
