Who Will Solve Wearable Computing’s “Jetpack” Problem?
One of the best and most short-lived blogs about the future of technology wasn’t about the future at all, but the present, which is where people actually live, buy things, and put technology to use in their daily routines. In short, it was against “jetpack” technology: stuff that sounds awesome and looks futuristic but doesn’t make much sense from an interaction design or user experience perspective. Can we build jetpacks? Sure. Does anyone actually need or want them? No.
I can’t help thinking about jetpacks as the emerging field of “wearable computing” picks up steam. Google, Apple, and Microsoft are all plunging headlong into head-mounted augmented reality, which some consultancies say is supposed to be a multibillion-dollar industry by the year 2015. There’s no denying that the technology behind Google Glass is awesome: the automatic picture-taking functionality seems especially cool. But will you, me, and everyone we know really be wearing a Star Trek prop on our faces in three years, just so we can do the same stuff we already do with our smartphones, except without using our hands? Farhad Manjoo had his skepticism turned around by a hands-on demo for Technology Review, but I’m still not convinced.
It’s not that I don’t think someone will make this technology go mainstream. They will. But they’ll do it by solving the jetpack problem, not by gee-whizzing the hell out of tech journalists. There are two paths that most people see Google Glass-style wearables taking: that of the Bluetooth headset (inessential to most, useful to a few, and inescapably dorky) or the iPhone (essential to many, useful to most, and inescapably desirable). They’re both wrong. To truly go mainstream, wearables must not be “technology” at all, in the same way that eyeglasses and wristwatches aren’t. They have to be part of the present, not “the future.”
This isn’t just a look-and-feel problem, although that’s part of it. A pair of Google Glasses designed by Gucci might not make you look like a cyborg, but the lack of a truly compelling mainstream use case still looms large. Sergey Brin used Google Glass to take a zillion photos while driving. A famous fashion designer used them to make a short film. Thad Starner used them to look up answers to questions from a technology journalist from this magazine (and also to get invisible tips from his PR handler at Google).
You know what would be much more instructive? Giving the glasses to 50 regular folks (a FedEx delivery guy, an office worker, a college student, a stay-at-home mom, a traffic cop, a sales rep, a barista) for a week and seeing how, or if, they fit the technology into their daily lives. They’re the ones who will have to be convinced that wearable computing makes sense. And none of them need a jetpack.