A View from John Pavlus

An Expert's View on Google's Goggles

Mark Changizi, a neurobiologist and the author of The Vision Revolution, discusses Google’s augmented-reality glasses.

April 6, 2012

Project Glass, the latest sci-fi concept to come out of Google’s X Lab, has gotten a lot of attention online in the past 24 hours thanks to a clever demo video that shows a user donning a pair of augmented-reality eyeglasses that project a heads-up display of video chats, location check-ins, and appointment reminders.

Reactions to the product design have ranged from skeptical to enthusiastic, but I was curious about the psychological and visual-cognitive aspects of the user experience. What would these “digital overlays” actually look and feel like? Would they really be as sharp and legible as the ones shown in the video? (I don’t know about you, but I can’t focus sharply on anything less than an inch away from my eyeball, which is where the eyeglasses’ tiny screen would be dangling.) Would they obstruct my vision and make me motion-sick? How would my brain make perceptual and physical sense of the graphics: where would I “look,” exactly, in order to “watch” the tiny picture-in-picture video chat shown at the conclusion of the clip?

I asked Mark Changizi, an evolutionary neurobiologist and author of The Vision Revolution, to answer some of these questions in an audio commentary track on the video, which you can watch above.

“The graphics are not going to look like they’re floating out in front of you, because it’s only being displayed to one eye,” Changizi explains. Instead, the experience would be similar to “seeing through” the image of your own nose, which hovers semi-transparently in the periphery of your visual field at all times (even though you rarely pay attention to it). “Having non-corresponding images coming from each eye is actually something we are very much used to already,” Changizi says. “It’s not uncomfortable.” So Google’s one-eyed screen design seems biologically savvy.

Then again, Changizi continues, “they’re presenting text to you, and in order to discern that kind of detail, you need to have it in front of your fovea”—the small central region of the retina that delivers your sharpest vision. “That’s typically *not* where we’re used to ‘seeing through’ parts of our own bodies, like our noses.” Which means that those crisp, instant-message-like alerts won’t be as simple to render as the video suggests.

“The more natural place to put [these interface elements], especially if it’s not text, is in the parts of your visual field where your face-parts already are,” Changizi says. This could be in the left and right periphery, where the ghost-image of your nose resides, or at the top or bottom edges of your visual field, where you can see your cheeks when you smile or your brow when you frown. “There could be very broad geometrical or textural patterns that you could perceive vividly without having to literally ‘look at’ them,” he says. This would also make the digital overlays “feel like part of your own body,” rather than “pasted on” over the real world in an artificial or disorienting way. That experience might feel more like “sensing” the digital interface semi-subconsciously, rather than looking at it directly as if it were an iPhone screen.

A Google employee (who preferred not to be identified) confirmed to Technology Review that “the team is involved in many kinds of experimentation, and some of that will involve outdoor testing,” but wouldn’t provide any details about what that testing has revealed about the perceptual aspects of the user experience. Clearly, the concept video is meant to convey the basic premise of Project Glass, rather than render the user experience in a biologically accurate way.

But if Google really does plan to bring this product to market before the end of 2012, as it has claimed, it is exactly these psychological and phenomenological details that will have to be examined closely.

For his part, Changizi is optimistic. “Right now we have everyone walking around focusing their vision on tiny four-inch screens held in their hands, bumping into each other,” he says. “Whatever Google does with Project Glass, it’ll surely be an improvement over that.”
