Reto Meier, an Android Developer Advocate at Google, recently laid out a fairly science-fiction account of where computer (or at least mobile) interfaces are headed.
In the spirit of the best futurism, all of his predictions - from augmented-reality eyeglasses to advanced batteries - have parallels in the real world. What follows is a walk-through of the future, expressed in terms of the not-quite-ready-for-prime-time discoveries coming out of labs today.
You Can Never Have Enough Monitors
Working on the average laptop is like working on a desk that’s as big as a sheet of paper. That’s why all our “files” are half an inch high. The key to productivity and immersion is more, bigger screens - hence the proliferation of external monitors, secondary reading devices and even mobile phones with improbably large screens.
Meier’s Prediction: 5 years from now, we’ll have the first widely available flexible displays and built-in HD projectors.
Reality: So-called “Pico” projectors (named for their tiny size) already exist - there’s even an HD version, the Forever Plus, that’s less than an inch on its longest dimension. And there are mobile phones, such as the Samsung Show, which have built-in pico projectors - so outside of market demand (how many of us really need this?) there’s nothing to stop this prediction from coming true.
Meier’s Prediction: 10 years from now, transparent LCD patches that can be applied to regular glasses will be available.
Reality: Transparent LCD displays exist, but that doesn’t mean they’ll be high enough resolution to be worth appliquéing to your glasses anytime soon (even within 10 years). Keep in mind that even Apple’s “Retina” display only matches the resolution of the human eye when held at arm’s length. Two or three orders of magnitude of increase in LCD resolution would be required - in addition to transparency.
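A back-of-the-envelope calculation shows why the gap is so large. (The numbers here - a ~1 arcminute eye resolution, a 30 cm phone viewing distance, and a 2 cm lens-to-eye distance for glasses - are illustrative assumptions for the sketch, not measurements.)

```python
import math

def required_ppi(viewing_distance_mm, eye_resolution_arcmin=1.0):
    """Pixel density at which one pixel subtends the eye's resolving angle."""
    # Smallest feature the eye can resolve at this distance (small-angle geometry)
    feature_mm = viewing_distance_mm * math.tan(math.radians(eye_resolution_arcmin / 60))
    return 25.4 / feature_mm  # convert mm per pixel into pixels per inch

phone = required_ppi(300)    # phone held ~30 cm away: ~290 ppi ("Retina"-class)
glasses = required_ppi(20)   # display ~2 cm from the eye: ~4,400 ppi

linear_ratio = glasses / phone   # ~15x the pixel density
area_ratio = linear_ratio ** 2   # ~225x the pixels per unit area
print(round(phone), round(glasses), round(area_ratio))
```

Pixel count per unit area has to rise by roughly two orders of magnitude just to match a phone screen’s sharpness at glasses distance; wider fields of view push the total pixel budget toward the third order, in line with the article’s estimate.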
Meier’s Prediction: 20 years from now, we’ll have contact lenses that project a visual feed directly onto your retina.
Reality: This exists, but again, the resolution is terrible. Also, it’s only been tested in rabbits. Granted, you should never, ever bet against the progress of microfabrication, which can make even the 4x4 grid of pixels in today’s contact lens displays usable after some period of exponential growth. But: transmitting images to such displays will be non-trivial. Wouldn’t it be easier to simply perfect augmented reality specs? And then there’s the matter of market acceptance: imagine not being able to switch off your access to the Internet.
“Full keyboards are better. No keyboards is best”
Meier’s Prediction: 5 years from now, we’ll have larger multitouch screens, better gesture input, and flawless voice recognition.
Reality: Nothing too controversial about the first two. But there is plenty of reason to think we’ll never get flawless voice recognition - for one thing, progress in speech recognition accuracy flatlined years ago. One of the reasons is that even humans aren’t capable of it - count the number of times you say “what,” ask someone to repeat themselves or otherwise seek clarification and you’ll realize that substantial amounts of error-correction are built into human speech for a very good reason.
Meier’s Prediction: 10 years from now, full virtual keyboards and voice input eliminate physical keyboards entirely.
Reality: People who suffer from RSI might not be so happy to give up their real-world keyboards, which do much to cushion the impact of fingers on a hard surface. But if we imagine a future in which we could simply hold our hands in front of us and have a computer recognize the movements of our fingers, that’s at least a possibility. The problem, of course, is that no one knows whether or not humans can master “typing in midair.” Regular typing, on the other hand, resembles the playing of a musical instrument - and we’ve been doing that for tens of thousands of years, at least.
Meier’s Prediction: 20 years from now, we’ll interface with computers through mind control.
Reality: Attempts to control computers with our minds run up against a very basic limitation of human physiology: brains do not have any high-bandwidth interfaces built in, other than the physical body. Detecting changes in brain state / brain waves with external electrodes is easy enough, but these change quite slowly relative to the rapidity we achieve with traditional human/computer interfaces.
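To put rough numbers on that bandwidth gap, here is a crude comparison. (The figures - 60 wpm typing, a non-invasive EEG “speller” producing ~5 characters per minute, and a ~50-symbol alphabet - are assumptions chosen for illustration, not measured results.)

```python
import math

BITS_PER_CHAR = math.log2(50)  # ~5.6 bits if ~50 symbols were equally likely

# Ordinary typing: 60 words/min at ~5 characters per word
typing_chars_per_sec = 60 * 5 / 60                 # 5 chars/s
typing_bps = typing_chars_per_sec * BITS_PER_CHAR  # ~28 bits/s

# EEG-based "speller" via external electrodes: ~5 characters/min (assumed)
eeg_chars_per_sec = 5 / 60
eeg_bps = eeg_chars_per_sec * BITS_PER_CHAR        # ~0.5 bits/s

print(round(typing_bps, 1), round(eeg_bps, 2), round(typing_bps / eeg_bps))
```

Even with generous assumptions for the EEG side, an ordinary keyboard comes out dozens of times faster - which is why non-invasive brain-reading struggles to compete with fingers.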
In other words, if we want to meld with our machines, it has to be through a fairly intrusive physical interface. Much work has been done on implanting arrays of electrodes into the brains of animals and humans, but ask yourself this: at what point are we likely to think that brain surgery is “routine”?
Given the risks associated with any surgery - plus the risk of infection along a physical connection passing through the skull, the need to replace batteries in a wireless version, and the constant pressure to upgrade as interfaces improve - you have to ask: who on earth would sign up for this?
Meier’s post is full of other interesting, less-controversial predictions about a future full of lighter, more power-dense batteries and ubiquitous connectivity, and I urge you to check it out if only for the exercise in imagining what our future holds.
What do you think? Do you agree / disagree with the preceding analysis?