You Will Want Google Goggles

I thought that glasses with “augmented reality” would be hopelessly dorky and could never go mainstream—until I saw the technology in action.

At first glance, Thad Starner does not look out of place at Google. A pioneering researcher in the field of wearable computing, Starner is a big, charming man with unruly hair. But everyone who meets him does a double take, because mounted over the left lens of his eyeglasses is a small rectangle. It looks like a car’s side-view mirror made for a human face. The device is actually a minuscule computer monitor aimed at Starner’s eye; he sees its display—pictures, e-mails, anything—superimposed on top of the world, Terminator-style.

Google cofounder Sergey Brin wore a Project Glass prototype at a charity function in San Francisco in April.

Starner’s heads-up display is his own system, not a prototype of Project Glass, Google’s recently announced effort to build augmented-reality goggles. In April, Google X, the company’s special-projects lab, posted a video in which an imaginary user meanders around New York City while maps, text messages, and calendar reminders pop up in front of his eye—a digital wonderland overlaid on the analog world. Google says the project is still in its early phases; Google employees have been testing the technology in public, but the company has declined to show prototypes to most journalists, including myself.

Instead, Google let me speak to Starner, a technical lead for the project, who is one of the world’s leading experts on what it’s like to live a cyborg’s life. He has been wearing various kinds of augmented-reality goggles full time since the early 1990s, which once meant he walked around with video displays that obscured much of his face and required seven pounds of batteries. Even in computer science circles, then, Starner has long been an oddity. I went to Google headquarters not only to find out how he gets by in the world but also to challenge him. Project Glass—and the whole idea of machines that directly augment your senses—seemed to me to be a nerd’s fantasy, not a potential mainstream technology.

But as soon as Starner walked into the colorful Google conference room where we met, I began to question my skepticism. I’d come to the meeting laden with gadgets—I’d compiled my questions on an iPad, I was recording audio using a digital smart pen, and in my pocket my phone buzzed with updates. As we chatted, my attention wandered from device to device in the distracted dance of a tech-addled madman.

Things Reviewed

  • Google’s Project Glass

Starner, meanwhile, was the picture of concentration. His tiny display is connected to a computer he carries in a messenger bag, a machine he controls with a small, one-handed keyboard that he’s always gripping in his left hand. He owns an Android phone, too, but he says he never uses it other than for calls (though it would be possible to route calls through his eyeglass system). The spectacles take the place of his desktop computer, his mobile computer, and his all-knowing digital assistant. For all its utility, though, Starner’s machine is less distracting than any other computer I’ve ever seen. This was a revelation. Here was a guy wearing a computer, but because he could use it without becoming lost in it—as we all do when we consult our many devices—he appeared less in thrall to the digital world than you and I are every day. “One of the key points here,” Starner says, “is that we’re trying to make mobile systems that help the user pay more attention to the real world as opposed to retreating from it.”

By the end of my meeting with Starner, I decided that if Google manages to pull off anything like the machine he uses, wearable computers seem certain to conquer the world. It simply will be better to have a machine that’s hooked onto your body than one that responds to your body relatively slowly and clumsily.

I understand that this might not seem plausible now. When Google unveiled Project Glass, many people shared my early take, criticizing the plan as just too geeky for the masses. But while it will take some time to get used to interactive goggles as a mainstream necessity, we have already gotten used to wearable electronics such as headphones, Bluetooth headsets, and health and sleep monitoring devices. And even though you don’t exactly wear your smart phone, it derives its utility from its immediate proximity to your body.

In fact, wearable computers could end up being a fashion statement. They actually fit into a larger history of functional wearable objects—think of glasses, monocles, wristwatches, and whistles. “There’s a lot of things we wear today that are just decorative, just jewelry,” says Travis Bogard, vice president of product management and strategy at Jawbone, which makes a line of fashion-conscious Bluetooth headsets. “When we talk about this new stuff, we think about it as ‘functional jewelry.’” The trick for makers of wearable machines, Bogard explains, is to add utility to jewelry without negatively affecting aesthetics.

This wasn’t possible 20 years ago, when the technology behind Starner’s cyborg life was ridiculously awkward. But Starner points out that since he first began wearing his goggles, wearable computing has followed the same path as all digital technology—devices keep getting smaller and better, and as they do, they become ever more difficult to resist. “Back in 1993, the question I would always get was, ‘Why would I want a mobile computer?’” he says. “Then the Newton came out and people were still like, ‘Why do I want a mobile computer?’ But then the Palm Pilot came out, and then when MP3 players and smart phones came out, people started saying, ‘Hey, there’s something really useful here.’” Today, Starner’s device is as small as a Bluetooth headset, and as researchers figure out ways to miniaturize displays—or even embed them into glasses and contact lenses—they’ll get still less obtrusive.

At the moment, the biggest stumbling block may be the input device—Starner’s miniature keyboard has a learning curve that many consumers would find daunting, and keeping a trackpad in your pocket might seem a little creepy. The best input system eventually could be your voice, though it could take a few years to perfect that technology. Still, Starner says, the wearable future is coming into focus. “It’s only been recently that these on-body devices have enough power, the networks are good enough, and the prices have gone down enough that it’s actually capturing people’s imagination,” Starner says. “This display I’m wearing costs $3,000—that’s not reasonable for most people. But I think you’re going to see it happen real soon.”

One criticism of Google’s demo video of Project Glass is that it paints a picture of a guy lost in his own digital cocoon. But Starner argues that a heads-up display will actually tether you more firmly to real-life social interactions. He says the video’s augmented-reality visualizations—images that are tied to real-world sights, like direction bubbles that pop up on the sidewalk, showing you how to get to your friend’s house—are all meant to be relevant to what you’re doing at any given point and thus won’t seem like distracting interruptions.

Much of what I think you’ll use goggles for will be the sort of quotidian stuff you do on your smart phone all the time—look up your next appointment on your calendar, check to see whether that last text was important, quickly fire up Shazam to learn the title of a song you heard on the radio. So why not just keep your smart phone? Because the goggles promise speed and invisibility. Imagine that one afternoon at work, you meet your boss in the hall and he asks you how your weekly sales numbers are looking. The truth is, you haven’t checked your sales numbers in a few days. You could easily look up the info on your phone, but how obvious would that be? A socially aware heads-up display could someday solve this problem. At Starner’s computer science lab at the Georgia Institute of Technology, grad students built a wearable display system that listens for “dual-purpose speech” in conversation—speech that seems natural to humans but is actually meant as a cue to the machine. For instance, when your boss asks you about your sales numbers, you might repeat, “This week’s sales numbers?” Your goggles—with Siri-like prowess—would instantly look up the info and present it to you in your display.
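
To make the idea concrete, here is a minimal sketch, in Python, of how a cue-driven lookup like that might work. It is not the Georgia Tech system itself; the cue phrases, function names, and sample data are all hypothetical stand-ins.

```python
# A toy illustration of "dual-purpose speech": phrases that sound natural in
# conversation but also trigger a lookup on the heads-up display.
# Everything here (cue phrases, lookup functions, sample data) is hypothetical.
import re
from typing import Callable, Optional


def fetch_sales_summary(period: str) -> str:
    # Stand-in for a query against a CRM or spreadsheet back end.
    return f"Sales ({period}): $42,300, up 6% from last week"


def fetch_next_appointment() -> str:
    # Stand-in for a calendar query.
    return "3:00 pm - design review, room 4B"


# Map a cue pattern (something you would naturally say out loud)
# to the lookup that produces what the display should show.
CUES: dict[str, Callable[[], str]] = {
    r"this week'?s sales numbers": lambda: fetch_sales_summary("this week"),
    r"next appointment": fetch_next_appointment,
}


def handle_utterance(transcript: str) -> Optional[str]:
    """Return text for the heads-up display if the utterance contains a cue."""
    lowered = transcript.lower()
    for pattern, lookup in CUES.items():
        if re.search(pattern, lowered):
            return lookup()
    return None  # ordinary conversation: show nothing


if __name__ == "__main__":
    # Repeating the boss's question doubles as the query to the machine.
    print(handle_utterance("This week's sales numbers?"))
```

A real system would, of course, sit behind a speech recognizer; the point is only that the query rides along inside ordinary conversation instead of interrupting it.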

You could argue that the glasses would open up all kinds of problems: would people be concerned that you were constantly recording them? And what about the potential for deeper distraction—goofing off by watching YouTube during a meeting, say? But Starner counters that most of these problems exist today. Your cell phone can record video and audio of everything around you, and your iPad is an ever-present invitation to goof off. Starner says we’ll create social and design norms for digital goggles the way we have with all new technologies. For instance, you’ll probably need to do something obvious—like put your hand to your frames—to take a photo, and perhaps a light will come on to signal that you’re recording or that you’re watching a video. It seems likely that once we get over the initial shock, goggles could go far in mitigating many of the social annoyances that other gadgets have caused.
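
As a rough sketch of the kind of norm Starner describes, the toy code below gates photo capture behind an explicit, visible gesture and turns on an indicator light while capturing; the gesture name and the hardware hooks are invented for illustration, not part of any announced design.

```python
# A toy sketch of gesture-gated capture with a visible recording indicator.
# The gesture label and the LED/camera hooks are hypothetical placeholders.
import time


class GlassesCamera:
    def __init__(self) -> None:
        self.indicator_on = False

    def _set_indicator(self, on: bool) -> None:
        # Stand-in for driving a physical LED on the frames.
        self.indicator_on = on
        print(f"[indicator light {'ON' if on else 'OFF'}]")

    def capture_photo(self, gesture: str) -> bool:
        # Only an obvious, socially visible gesture triggers a capture.
        if gesture != "hand_to_frames":
            return False
        self._set_indicator(True)   # bystanders can see a capture is happening
        print("photo captured")
        time.sleep(0.5)             # keep the light visible briefly
        self._set_indicator(False)
        return True


if __name__ == "__main__":
    cam = GlassesCamera()
    cam.capture_photo("hand_to_frames")  # captures; light blinks on
    cam.capture_photo("nod")             # ignored: no explicit gesture
```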

I know this because during my hour-long conversation with Starner, he was constantly pulling up notes and conducting Web searches on his glasses, but I didn’t notice anything amiss. To an outside observer, he would have seemed far less distracted than I was. “One of the coolest things is that this makes me more socially graceful,” he says.

I got to see this firsthand when Starner let me try on his glasses. It took my eye a few seconds to adjust to the display, but after that, things began to look clearer. I could see the room around me, except now, hovering off to the side, was a computer screen. Suddenly I noticed something on the screen: Starner had left open some notes that a Google public-relations rep had sent him. The notes were about me and what Starner should and should not say during the interview, including “Try to steer the conversation away from the specifics of Project Glass.” In other words, Starner was being coached, invisibly, right there in his glasses. And you know what? He’d totally won me over.

Farhad Manjoo is the technology columnist at Slate and contributes regularly to Fast Company and the New York Times. He is the author of True Enough: Learning to Live in a Post-Fact Society.
