The Next Computer Interface

The desktop metaphor was a brilliant innovation, 30 years ago. Now it’s an unmanageable mess, and the search is on for a better way to handle information.
December 1, 2001

“The desktop is dead,” declares David Gelernter, referring to the “desktop metaphor,” the term frequently used for the hierarchical system of files, folders and icons that we use to manage information stored on our home or office computers. At the annual gathering of technophiles at TechXNY/PC Expo 2001 in New York last June, he told the rapt crowd attending his keynote speech that the desktop metaphor is nothing more than virtual Tupperware. “Our electronic documents are scattered by the thousands in all sorts of little containers all over the place,” he said. “The more information and the more computers in our lives, the more of a nuisance this system becomes.”

For the past decade or so Gelernter has been campaigning for a new metaphor to overthrow the desktop: first in research he carried out at Yale University, where he is a professor of computer science, and now as chief scientist of his new company, Mirror Worlds Technologies, with offices in New Haven, CT, and New York City. In March, Mirror Worlds announced a novel metaphor called Scopeware, software that automatically arranges your computer files in chronological order and displays them on your monitor with the most recent files featured prominently in the foreground. Scopeware is far more sweeping than a simple rearrangement of icons, however: in effect, it transfers the role of file clerk from you to the computer, seamlessly ordering documents of all sorts into convenient, time-stamped files.

If you have ever forgotten what you named a file or which folder you put it in, you probably will agree that it’s time for a change. The desktop metaphor is decades old, arising from early-1970s work at Xerox’s fabled Palo Alto Research Center, and was never intended to address today’s computing needs. Indeed, the product that brought the metaphor to mass-market attention was Apple Computer’s 1984 Macintosh; it had no built-in hard drive, and its floppy disks each stored only 400 kilobytes of information. Today we’re using the same metaphor to manage the countless files on our ever more capacious hard drives, as well as to access the virtually limitless information on the Web. The result? Big, messy hierarchies of folders. Favorites lists where you never find anything again. Pull-down menus too long to make sense of.

In other words, the desktop metaphor puts the onus on our brains to juggle this expanding collection of files, folders and lists. Yet “our neurons do not fire faster, our memory doesn’t increase in capacity and we do not learn to think faster as time progresses,” notes Bill Buxton, chief scientist of Alias/Wavefront, a leading maker of graphic-design tools. Buxton argues that without better tools to exploit the immense processing power of today’s computers, that power is not much good to us.

That’s why many researchers, at universities and startups like Gelernter’s Mirror Worlds as well as at giants like Microsoft and IBM, are searching for alternatives. They’re examining metaphors taken from other media, such as books or diaries or film; 3-D schemes that use our sense of spatial orientation to create the illusion of depth on-screen, so that documents look closer or farther away depending on their importance to us; alternatives that borrow from video games the notion of having an intelligent guide, or avatar, to help us find what we’re looking for; and even theories that radically change the notion of what a “computer” is, so that we no longer think of devices as computers at all and are therefore open to new ways of interacting with them.

“The desktop metaphor made assumptions about how we use computers that just aren’t true anymore,” asserts Don Norman, cofounder of the Nielsen Norman Group, famed critic of computer design and author of The Design of Everyday Things. “It’s time to throw away the old model.”

Learning Esperanto

It will take a Herculean effort to overthrow the desktop metaphor (many observers believe it will prove impossible), chiefly because the three-decade-old interface, popularized by the Mac and quickly made nearly ubiquitous by Microsoft’s Windows, has become integral to our very notion of personal computing. “A couple of years ago we did a study on how to introduce new computing models,” says Dan Russell, research director in the field of human-computer interaction at the IBM Almaden Research Center in San Jose, CA. “We wanted to find people who didn’t understand the function of file folders, how to open files, how to delete files. We couldn’t find anyone. That makes it hard to change people’s expectations of how computers should behave.”

With this in mind, one comes away from a survey of the alternatives wondering whether the desktop metaphor has become part of basic cultural literacy, like language itself, and whether getting people to try any of the suggested improvements is like getting them to learn an international language like Esperanto: a good idea in theory, but for most people not worth the trouble.

Even its biggest critics today acknowledge that the desktop metaphor was an extraordinary breakthrough that tapped into the way people actually work and think, a vast improvement over typing in text commands alongside a blinking cursor. Still, people like Gelernter remain undaunted in their belief that its moment has passed. “It was a brilliant idea at the time,” he says. “But it’s explicitly tied to the way we managed information in the 1940s, with filing cabinets filled with separate folders of information. Even 10 years ago the notion of putting stuff in files and sticking certain files in folders and others on your desktop was already broken down and failing.”

Gelernter’s alternative, Scopeware, is the outcome of a decade of research and development at Yale. Scopeware replaces the desktop metaphor with what Gelernter calls a “narrative information system,” or what you might call the diary metaphor, where every type of file (an e-mail message, a word processing document, a digital image) is stored chronologically, in what appears on-screen to be a tiling stack of file cards.

Search for the term “demo” on the Mirror Worlds Web site, for example, and you get a stack of six virtual file cards, dating back to February 9, 1998. In the upper right-hand corner of each card you see an icon indicating the type of file: in this case, four files in Adobe Acrobat, one in Microsoft Word and one captured from a Web page in Microsoft’s Internet Explorer. On each card you see the title of the file, plus a small box previewing what’s inside. Moving your mouse’s pointer over one of the cards brings up a summary of the document and a larger picture, so you can see if it’s what you want; a double click opens the file itself. Search for “Gelernter” and you get about 70 such cards, chronologically arranged, with older documents receding into the background. You know immediately how to navigate. Scopeware works.

The diary metaphor has some clear advantages over the desktop metaphor. It is based on the notion that what we have created, modified or even looked at most recently is probably still most important to us. And, Scopeware’s inventor maintains, our sense of time is a strong organizing principle that can help us locate a file simply because we remember when we used it last. Rather than requiring you to manually rifle through buckets of information stored on your hard drive or inside an application like e-mail, Scopeware sorts information automatically, streaming it into predetermined categories.
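To make the organizing principle concrete, here is a minimal sketch of the diary-metaphor idea in Python; it assumes nothing about Mirror Worlds’ actual implementation. It walks a directory tree, ignores the folder hierarchy entirely, and streams every file into a single list ordered by modification time, newest first:

```python
# A minimal sketch of the diary-metaphor idea (illustrative, not Mirror
# Worlds' code): gather every file under a root directory, discard its
# place in the folder hierarchy, and present one time-ordered stream.
import os
import time

def build_stream(root):
    """Return (timestamp, path) pairs for all files, newest first."""
    entries = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            entries.append((os.path.getmtime(path), path))
    entries.sort(reverse=True)  # the most recent files come to the foreground
    return entries

def show_stream(root, limit=10):
    for mtime, path in build_stream(root)[:limit]:
        stamp = time.strftime("%Y-%m-%d", time.localtime(mtime))
        ext = os.path.splitext(path)[1] or "(none)"  # stand-in for the type icon
        print(f"{stamp}  {ext:8}  {os.path.basename(path)}")

show_stream(".")
```

The point of the sketch is how little the user has to supply: no file names to remember, no folders to choose. Time does the filing.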

But kill the desktop? While Gelernter has deeply criticized the desktop metaphor in his books and in a manifesto about the future of digital technology called The Second Coming, it’s our years of familiarity with that Xerox PARC design (the point and click, the icons and the menus) that makes Scopeware so intuitive. Those of us who were 15 or older when we used our first mouse still remember how difficult it was, initially, to equate the horizontal movement of our mousing hand with the movement of the cursor on-screen. Now it’s natural. And Scopeware, if it succeeds, will do so because it makes use of what we already have. The company has positioned the product as a business software tool that helps companies organize and share information, rather than as a replacement for Windows; it works through a browser rather than as an operating system. “We aren’t taking on Windows at all,” Gelernter says. “That would be suicidal.”

That’s the quandary that researchers in the field of human-computer interaction have long struggled with: make something look too different on-screen, however good it is, and you will fail. “About ten years ago I realized that I wasn’t able to say, ‘Okay, turn off your machine, because tomorrow I’m going to bring you a brave new world,’” says Ramana Rao, a founder of Inxight Software, a Santa Clara, CA-based startup funded by Xerox that is also exploring and marketing new user interfaces. “I needed to accept that there are hundreds of millions of PCs out there, and figure out where within that structure I could insert the thin edge of a wedge of a new way of doing things, where I could show you something incrementally better, then start pounding on the wedge until the old face drops away.”

Like Mirror Worlds, Inxight doesn’t intend a full-fledged assault on the desktop metaphor but rather seeks to give users (particularly high-end businesses willing to pay premium prices) a choice in how their employees see and manipulate information on a computer monitor. The company’s flagship technology is Star Tree, first developed at PARC in 1993. In the 1980s Xerox missed out on profiting from the desktop metaphor it developed; Inxight was founded in part to avoid repeating that mistake with Star Tree, an alternative that uses space, rather than time, as an organizing principle.

Star Tree replaces hierarchies of pull-down menus with on-screen icons, whose relationships to one another can be viewed at a glance. A Star Tree-generated map of a Web site, for example, might show an icon representing the page you are currently on, with lines radiating outward to icons representing links from that page. You can also see what is linked to the links and so on: four layers of relationships at any one time, arranged on your screen in the shape of a globe. If you want to find a particular set of links, you can spin the globe around on any axis by simply moving your cursor around the screen. It’s a visually appealing way to organize information, like Web pages or organizational charts, and it allows you to see relationships among files more readily than pull-down menus do. But Rao admits that the obstacles to getting the concept adopted widely are huge, and that his primary goal isn’t to overthrow the desktop metaphor but to become a part of it. “So when Microsoft says, ‘Hey, we want to make this a part of Windows, so sell it or we’re going to pulverize you,’ then boy, we’ve won,” he says. “That is the goal. I’d love to hear that.”
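The data such a display needs is simple to derive. What follows is a toy sketch of the idea (purely illustrative, not Inxight’s code, with a made-up in-memory link graph standing in for a real Web site): a breadth-first traversal from the focus page collects everything within four links, tagged with the ring it belongs to, which is exactly what a radial layout consumes:

```python
# A toy sketch of Star Tree's organizing idea (illustrative only, not
# Inxight's code): from a focus page, collect links, links-of-links, and
# so on, up to four layers deep, ready to be laid out radially.
from collections import deque

# A hypothetical in-memory link graph standing in for a real Web site.
LINKS = {
    "home": ["products", "research", "about"],
    "products": ["star-tree", "downloads"],
    "research": ["papers"],
    "star-tree": ["demo"],
}

def layers_around(focus, depth=4):
    """Breadth-first traversal: returns {node: layer} for every node
    within `depth` links of the focus page."""
    layer = {focus: 0}
    queue = deque([focus])
    while queue:
        node = queue.popleft()
        if layer[node] == depth:
            continue  # don't expand past the fourth ring
        for child in LINKS.get(node, []):
            if child not in layer:
                layer[child] = layer[node] + 1
                queue.append(child)
    return layer

for node, ring in sorted(layers_around("home").items(), key=lambda kv: kv[1]):
    print(f"ring {ring}: {node}")
```

Recentering the globe on a different page is just rerunning the traversal with a new focus, which is why the display can be spun and refocused so fluidly.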

Metaphor Trouble

Even Microsoft apparently sees the limitations of the desktop metaphor. The software Goliath has tried to introduce new metaphors on a regular basis, even though it has the most to lose if there is ever a disruption in the status quo. When the changes have been too literal or too radical, they have failed, even with the clout of Microsoft’s marketing behind them. For instance, Microsoft Bob, introduced in 1995, used the metaphor of a cozy-looking living room, complete with an animated guide, to help users navigate their computer systems. It was designed with new computer users in mind, as a way to give them a sense that they had nothing to fear. It failed spectacularly, mostly because the metaphor was so literal that it got in the way: users spent too much time navigating the on-screen living room, trying to figure out how the virtual furniture, shelves and cozy fireplace related to a task like opening a file or application.

With Bob, in other words, it proved impossible to ignore all the distracting elements of the virtual living room. “In one sense the notion of ‘metaphor’ can get you into trouble,” says Rao. “We call it the ‘desktop metaphor,’ but the metaphor isn’t really very completely drawn out. It was just something to call it in the beginning, and after a while the computer desktop metaphor became its own thing, rather than something we thought of as like something else. There have been times people try to extend the metaphor, to make it look more like a desk or let you have piles of stuff like a real desktop, and it never works.”

Through Bob’s failure, though, Microsoft claims to have learned some things about what will work in new on-screen metaphors. For instance, Bob’s idea of an animated guide lives on in current versions of Microsoft Office as Clippy. While this personified paper clip is still far too obtrusive for many people (the little assistant is turned off by default in the new software), Microsoft researchers maintain that the idea behind it is extremely compelling. That’s because it points the way toward a design, alternatively known as an “adaptive,” “attentive” or “intelligent” interface, where the computer senses what we need and gives it to us. Call it the “friendly mentor” metaphor.

“We’re interested in an interface that adapts based on what the user is trying to accomplish,” says George Robertson, senior researcher with Microsoft’s user interface research group and formerly a principal scientist at Xerox PARC. “We try to build models of what the user is doing, but also to understand the nature of interruptions.” He notes that part of the problem now is that “Clippy tends to interrupt you when you’re in the middle of typing.” For instance, every time you type in the date in a Word file, it asks you if you need help formatting a letter. Microsoft is working toward solving that problem by building an inference engine that can anticipate your needs based on what you’re typing, as Clippy does, but won’t interrupt you. Rather, like a helpful personal servant, it will keep its observations to itself until you ask for help.
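The design Robertson describes boils down to a small loop: observe, infer, and stay quiet until asked. Here is an illustrative sketch (not Microsoft’s actual engine; the date pattern and suggestion text are invented for the example) of an assistant that notices letter-like typing but surfaces its inference only on demand:

```python
# An illustrative sketch (not Microsoft's design) of the "quiet assistant"
# idea: watch what the user types, queue inferences, and reveal them only
# when help is explicitly requested, never by interrupting.
import re

class QuietAssistant:
    def __init__(self):
        self.suggestions = []

    def observe(self, typed_text):
        """Infer intent from typing; store the suggestion silently."""
        if re.search(r"\b\w+ \d{1,2}, \d{4}\b", typed_text):  # e.g. "June 4, 2001"
            self.suggestions.append("It looks like a letter; want help formatting it?")

    def on_help_requested(self):
        """Only now do the stored observations surface."""
        return self.suggestions or ["No suggestions yet."]

assistant = QuietAssistant()
assistant.observe("December 1, 2001")   # the assistant stays silent here
print(assistant.on_help_requested())    # suggestions appear only on demand
```

The whole difference from Clippy lies in the last two lines: inference happens continuously, but presentation waits for the user.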

Conceivably, an inference engine can be made so intelligent that any change in the desktop metaphor itself becomes unnecessary: machines would automatically present information to you as you need it, eliminating the clutter and confusion that currently plague our computer desktops. If developers were able to build that degree of intelligence into our computers, they’d no longer need to overcome the high hurdle of educating all of us about how to use a new metaphor. Instead, we’d use the old one, but with far better results-much the way we use the same “interface” to drive automobiles today as in the days of the Model T. But behind that relatively unchanging interface, new tools such as antilock brakes, power steering, fuel injection systems and computerized warning systems aid us tremendously as we drive.

But that may be getting ahead of things. A nearer-term solution to the data glut and file loss perpetuated by the desktop metaphor will be to use 3-D graphics techniques, currently in vogue only in games and science and engineering software. “From our experience [with user groups], 3-D can make a real improvement,” Microsoft’s Robertson says. “It’s possible to pack a lot more information into the same screen space. You’re taking advantage of human perception and our ability to see depth relationships. You’re also taking advantage of human spatial memory. In the real world, if I put a piece of paper in a pile, I can remember where it is weeks later.”

One of Microsoft’s long-standing research projects to employ 3-D space is Data Mountain, which allows you to place files on what looks like a surface tilted at a 30-degree angle so that objects at the top of the screen appear smaller and farther away. Robertson found that spatial memory allowed participants in user studies to remember exactly where they had stored up to 800 images on the “mountain,” even after an absence of six months. “It works very well for storing things like photos or favorite Web sites because these things look different, and it’s very easy to spot the differences even when the images are small,” says Robertson. He concedes that Data Mountain works less well for documents; while thumbnail-sized images of pictures can be easily recognized, documents shrunk to that size all look more or less the same, making it very difficult to distinguish one from the other.
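The depth cue at work is ordinary perspective. As a rough geometric sketch (with invented numbers, not Microsoft’s renderer): an item pushed a distance u up a plane tilted 30 degrees from horizontal sits farther from the eye, and a standard perspective divide shrinks its thumbnail accordingly:

```python
# A geometric sketch of the Data Mountain effect (illustrative numbers):
# items placed farther up the tilted plane are deeper in the scene, so a
# simple perspective divide renders them smaller on screen.
import math

TILT = math.radians(30)   # the plane's tilt from horizontal
VIEWER = 1.0              # assumed eye-to-screen distance, arbitrary units

def apparent_scale(u):
    """u: how far up the tilted plane the item sits (0 = front edge).
    Returns the perspective scale factor applied to its thumbnail."""
    depth = VIEWER + u * math.cos(TILT)   # horizontal distance from the eye
    return VIEWER / depth                 # classic perspective divide

for u in (0.0, 0.5, 1.0, 2.0):
    print(f"distance up plane {u:.1f} -> drawn at {apparent_scale(u):.2f}x size")
```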

Another current research project at Microsoft that employs the virtues of 3-D is Task Gallery, where the metaphor is one of multiple rooms in which documents are “hung” until they are needed. By simply moving your mouse around the screen, you can “walk” from room to room in this virtual gallery, and hang the walls with miniature representations of your Word documents, Web pages, Adobe Acrobat files and so on. Conceivably, infinite “rooms” can be added, in which related documents and other files can be stored. However, Robertson admits that the project team has once again confronted the problem of people taking metaphors too literally. In user tests, for instance, participants balked at storing documents on the gallery’s virtual “floors” or “ceilings,” in effect reducing the usable space. Still, Robertson expects both Task Gallery and Data Mountain to become alternatives for organizing data in some Microsoft applications, most likely Explorer, in the next two years.

Layers of Paint

Visit any major company or university computer science lab around the world and you will likely find some kind of new interface work under way. Xerox PARC, the University of California, Berkeley, and Yale University, among others, continue to explore new on-screen metaphors. So does IBM, another powerhouse that, like Microsoft, is contemplating the demise of the desktop and sees “attentive” computing as an inevitable development. Its BlueEyes project uses a combination of sensors, video cameras and microphones to interpret your facial expressions, where you’re looking and even what you’re saying. That way, rather than clicking through your desktop Web browser, you can access Internet information through very subtle human-computer interactions. You could verbally ask your Web browser to go to CNN Online. While you’re there, the browser might observe where you look on the page and offer pages with related content for viewing, in theory making it virtually effortless to get what you want from your computer at all times without having to stop at the desktop.

“I have no doubt that in ten years our computers will be attentive in some appropriate way,” says Robert Morris, director of the IBM Almaden Research Center. “As we learn more about human behavior, and what is considered okay and not okay from a privacy perspective, we will learn, and we’ll eventually end up with a great interface.”

But at least for the foreseeable future, interface designers see such work as an alternative to, rather than a replacement for, the desktop metaphor, a view shared by even the most caustic critics of the desktop. “Software today grows in layers, where we put the new over the old, like slapping a new coat of paint on,” says Gelernter. “When people instituted browsers they didn’t throw out Windows, and when they instituted Windows they didn’t throw out DOS.” By layering these alternatives over Windows, designers can drastically reduce the learning curve and hasten acceptance of their innovations.

However, some researchers in the field of human-computer interaction think it’s time to throw out thinking about “metaphor” altogether (after all, it hasn’t gotten us too far since the 1970s) and to begin designing devices that have no metaphor, no real-world analogy. It’s not the desktop metaphor that’s holding us back, they say; it’s the whole notion that we need to make computers act like something other than what they are.

“I’ve spent too much time with metaphors,” grumbles Don Norman. “The main problem with the metaphor is that it’s just a stand-in for something else. It’s not the thing you’re using. It may help a beginner user for the first 15 minutes, but after that it gets in the way. When I drive I don’t need metaphors. I turn the steering wheel left, and I go left.”

It’s not just the desktop metaphor that needs fixing, in other words, but the whole PC package, the way we relate to our computing devices. The desktop metaphor is so tightly wedded in our minds to keyboard, mouse and monitor that unless the outside package changes, the on-screen presentation doesn’t have much of a chance to evolve either. Break out of that design, though, and all sorts of things become possible.

Alias/Wavefront’s Bill Buxton predicts a world where the personal computer stops trying to be a general-purpose device, like a Swiss Army knife, and goes back instead to what it is good at: making text documents and spreadsheets. The problem isn’t the desktop metaphor at all; it’s that we’re trying to use our personal computers for tasks they weren’t meant to perform. Peel those tasks away to specialized devices (music to MP3 players, films to movie players, news and information to specialized readers) and you’ve solved the desktop metaphor problem. Each device will evolve its own best interface, depending on its specialized use. Buxton’s favorite evidence of this process is the Palm Pilot.

“The heart of Silicon Valley was littered with the corpses of companies trying pen-based computing,” he says. “You had Eo, Go, Momenta, Grid. Then along came Palm. It did nothing that couldn’t have been done before but did it right. So even though we’ll have many apparent failures of new design concepts, there will be companies that go back and get it right.”

Throwing out the desktop metaphor, however, might be even tougher than replacing it with new metaphors, and not everyone agrees that the PC is on its way out. “That kind of thinking is wrong,” says Gelernter. “The PC isn’t a Swiss Army knife. It’s like a hammer. People don’t want a million different tools. They want a single hammer that can do a million things, because it’s a tremendously flexible, elegant and powerful tool.”

But even if the desktop metaphor never goes away completely, it will likely recede, buried perhaps beneath Robertson’s Data Mountain or Gelernter’s Scopeware. Or maybe, as Buxton predicts, it will drift back into performing only the tasks for which it was created and for which it is uniquely suited. What flowering of alternatives will replace it is still a matter of some conjecture. But if the tenacity of researchers in the field is any indication, big things are bound to happen eventually. As cyberpunk author William Gibson has said, the future is already here. It’s merely a question of figuring out which future it’s going to be.
