Google Glass Needs Phatic Interaction, Stat
Google Glass’s new demo video is impressive. The product is looking less like magic (the original teaser video made visual and experiential claims that just weren’t plausible) and more like reality. The most interesting thing about the video is that it finally answers the most mundane, and most important, question about Google Glass’s user experience: how do you control the damn thing? Glass, apparently, relies on a Siri-like interaction: you invoke it by saying “OK Glass” and then issue further instructions.
The team at Google arrived at this solution after testing “dozens and dozens” of nonverbal head gestures and deeming them all too weird, annoying, or uncomfortable. Voice commands were the lesser evil, but even Steve Lee, Glass’s product-design lead, acknowledges that jabbering at your headset dozens of times a day is not an ideal way of interacting with a wearable computer. “I think there will likely be some way to move your head, which is comfortable and natural for a user, as well as not make them look odd and strange,” he told Fast Company last summer.
In other words: Glass needs phatic interactions. And soon.
The term “phatic” comes from linguistics and describes verbal expressions that aren’t meant to carry information or content but are simply there to “keep the channel open.” It’s meta-communication. Small talk is phatic; saying “Can you hear me now?” or “You’re breaking up” over a bad cellular connection is phatic, too.
Phatic expressions can also be nonverbal, especially when applied to technology interfaces, says Laura Seargeant Richardson, an experience design director at Frog. “I consider a phone’s vibration that indicates a text message to be phatic,” she told me. “It’s the interrupt, the attention-getting moment, the connection between you and the data or information the technology affords.”
Phatic feedback, meta-communication from the device to the user, is already commonplace. And a wearable computer like Google Glass has to employ lots of phatic feedback, if only to avoid being too visually distracting. But Glass’s cumbersome voice-control system shows that nonverbal phatic interactions will need to flow in the other direction, too: from the user to the device.
Google knows this. “OK Glass” is already a phatic expression, says Richardson: “That’s very much like saying, ‘What’s up, how have you been, good to see you’ and so forth. It establishes a connection.” The real vision of Glass, though, is less like a smartphone and more like an omnipresent companion that’s always paying at least a low level of attention to whatever it is you’re doing. “OK Glass” isn’t the equivalent of waking your iPhone up from “sleep.” It’s not an object you turn on and off; it’s an assistant whose awareness you direct. Nonverbal, “nudge-like” phatic interactions will make that process much more fluid, and much less socially awkward.
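To make the idea concrete, here is a minimal sketch of what a nudge-like phatic trigger might look like in code. Everything in it is invented for illustration: the thresholds, the window size, and the simulated sensor stream are assumptions, not anything from Glass’s actual software. The point it demonstrates is just the interaction shape: a small, deliberate head tilt that opens the channel the way “OK Glass” does, without a word spoken.

```python
from collections import deque

# Hypothetical sketch: treat a small, deliberate tilt-and-return head
# gesture as a phatic signal -- the nonverbal equivalent of "OK Glass."
# Thresholds and sampling rate are made-up values for illustration only.

TILT_THRESHOLD = 0.35    # radians of tilt that count as "deliberate"
RETURN_THRESHOLD = 0.10  # back near level means the nudge is complete
WINDOW = 10              # recent samples to inspect (~1 second at 10 Hz)

class NudgeDetector:
    """Opens the channel when it sees a tilt-and-return head gesture."""

    def __init__(self):
        self.samples = deque(maxlen=WINDOW)
        self.listening = False

    def feed(self, tilt_radians):
        """Feed one head-tilt sample; flip to 'listening' on a nudge."""
        self.samples.append(tilt_radians)
        if self.listening or len(self.samples) < WINDOW:
            return
        peak = max(abs(s) for s in self.samples)
        settled = abs(self.samples[-1]) < RETURN_THRESHOLD
        # A nudge = a clear tilt followed by a return to level, all
        # within the window. No words spoken, no button pressed.
        if peak > TILT_THRESHOLD and settled:
            self.listening = True

# Simulated stream: head level, a quick nod-like tilt, level again.
detector = NudgeDetector()
stream = [0.0] * 4 + [0.2, 0.4, 0.5, 0.3, 0.1, 0.02] + [0.0] * 3
for sample in stream:
    detector.feed(sample)

print("channel open:", detector.listening)  # True once the nudge lands
```

Even this toy version surfaces the design tension Lee describes: the gesture has to be distinct enough not to fire on ordinary head movement, but subtle enough that performing it in public doesn’t make the wearer look odd.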
Figuring out exactly what those phatic interactions should be is a problem that Google has decided to punt on for now. Maybe it makes more sense to let them emerge from real-world use, much like the “@ message” and hashtagging conventions on Twitter did. Whether they arrive from a top-down or bottom-up process, though, phatic interfaces are the future.