Reto Meier, an "Android Developer Advocate" at Google, recently laid out a fairly science-fiction account of where computer (or at least mobile) interfaces are headed.
In the spirit of the best futurism, all of his predictions - from Augmented Reality eye glasses to advanced batteries - have parallels in the real world. What follows is a walk-through of the future, expressed in terms of the not quite ready for prime time discoveries coming out of labs today.
You Can Never Have Enough Monitors
Working on the average laptop is like working on a desk that’s as big as a sheet of paper. That’s why all our “files” are half an inch high. The key to productivity and immersion is more, bigger screens - hence the proliferation of external monitors, secondary reading devices and even mobile phones with improbably large screens.
Meier’s Prediction: Five years from now, we’ll have the first widely available flexible displays and built-in HD projectors.
Reality: So-called “Pico” projectors (named for their tiny size) already exist - there’s even an HD version, the Forever Plus, that’s less than an inch on its longest dimension. And there are mobile phones, such as the Samsung Show, which have built-in pico projectors - so outside of market demand (how many of us really need this?) there’s nothing to stop this prediction from coming true.
Meier’s Prediction: 10 years from now, transparent LCD patches that can be applied to regular glasses will be available.
Reality: Transparent LCD displays exist, but that doesn’t mean they’ll be high enough resolution to be worth appliqueing to your glasses anytime soon (even within 10 years). Keep in mind that even Apple’s “Retina” display only matches the resolution of the human eye when held at arm’s length. An increase of two or three orders of magnitude in LCD resolution would be required - in addition to transparency.
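To see where that “two or three orders of magnitude” figure comes from, here’s a rough back-of-envelope calculation. All the numbers in it (a ~300 ppi “Retina”-class display, a ~50 cm arm’s-length viewing distance, a ~2 cm eye-to-lens distance) are illustrative assumptions, not measurements:

```python
# Rough sketch: how much denser would a display mounted on eyeglasses
# need to be to subtend the same angle per pixel as a Retina-class
# display viewed at arm's length? All figures are assumptions.

retina_ppi = 300       # assumed density of a Retina-class display
arm_length_cm = 50     # assumed arm's-length viewing distance
glasses_cm = 2         # assumed eye-to-lens distance for eyeglasses

# Angular size per pixel scales with viewing distance, so linear pixel
# density must scale with the inverse of that distance.
linear_factor = arm_length_cm / glasses_cm   # 25x denser, linearly
required_ppi = retina_ppi * linear_factor    # ~7,500 ppi
areal_factor = linear_factor ** 2            # 625x more pixels per area

print(f"required density: ~{required_ppi:.0f} ppi")
print(f"areal increase: ~{areal_factor:.0f}x")
```

The areal figure, roughly 600x under these assumptions, lands squarely in the “two or three orders of magnitude” range.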
Meier’s Prediction: 20 years from now, we’ll have contact lenses that project a visual feed directly onto your retina.
Reality: This exists, but again, the resolution is terrible. Also, it’s only been tested in rabbits. Granted, you should never, ever bet against the progress of microfabrication, which can make even the 4x4 grid of pixels in today’s contact lens displays usable after some period of exponential growth. But: transmitting images to such displays will be non-trivial. Wouldn’t it be easier to simply perfect augmented reality specs? And then there’s the matter of market acceptance: imagine not being able to switch off your access to the Internet.
“Full keyboards are better. No keyboards is best”
Meier’s Prediction: 5 years from now, we’ll have larger multitouch screens, better gesture input, and flawless voice recognition.
Reality: Nothing too controversial about the first two. But there is plenty of reason to think we’ll never get flawless voice recognition - for one thing, progress in speech recognition accuracy flatlined years ago. One of the reasons is that even humans aren’t capable of it - count the number of times you say “what,” ask someone to repeat themselves or otherwise seek clarification and you’ll realize that substantial amounts of error-correction are built into human speech for a very good reason.
Meier’s Prediction: 10 years from now, full virtual keyboards and voice input eliminate physical keyboards entirely.
Reality: People who suffer from RSI might not be so happy to give up their real-world keyboards, which do much to cushion the impact of fingers on a hard surface. But if we imagine a future in which we could simply hold our hands in front of us and have a computer recognize the movements of our fingers, that’s at least a possibility. The problem, of course, is that no one knows whether or not humans can master “typing in midair.” Regular typing, on the other hand, resembles the playing of a musical instrument - and we’ve been doing that for tens of thousands of years, at least.
Meier’s Prediction: 20 years from now, we’ll interface with computers through mind control.
Reality: Attempts to control computers with our minds run up against a very basic limitation of human physiology: brains have no high-bandwidth interfaces built in, other than the physical body. Detecting changes in brain state / brain waves with external electrodes is easy enough, but those states change quite slowly relative to the speed we achieve with traditional human/computer interfaces.
In other words, if we want to meld with our machines, it has to be a fairly intrusive physical interface. Much work has been done on implanting arrays of electrodes into the brains of animals and humans, but ask yourself this: at what point are we likely to think that brain surgery is “routine”?
Given the risks associated with any surgery, in addition to the risk of infection of a physical connection traveling outside the skull, or the need to replace batteries in a wireless version, not to mention the constant pressure to upgrade as interfaces improve, you have to ask: who on earth would sign up for this?
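The bandwidth gap behind that argument can be made concrete with a crude estimate. The figures below are assumptions for illustration: a competent typist at ~60 words per minute, versus a non-invasive EEG “speller” interface producing on the order of 5 characters per minute:

```python
# Back-of-envelope comparison of information rates for typing vs. a
# non-invasive EEG speller. All rates are assumed, illustrative figures.

import math

bits_per_char = math.log2(32)        # ~5 bits/char, crude estimate for text

# Typing: assume 60 words/min at ~5 characters per word.
typing_chars_per_sec = 60 * 5 / 60   # 5 chars/s
typing_bps = typing_chars_per_sec * bits_per_char

# EEG speller: assume ~5 characters per minute.
eeg_chars_per_sec = 5 / 60
eeg_bps = eeg_chars_per_sec * bits_per_char

print(f"typing: ~{typing_bps:.0f} bits/s, EEG speller: ~{eeg_bps:.2f} bits/s")
print(f"ratio: ~{typing_bps / eeg_bps:.0f}x")
```

Under these assumptions, external electrodes come in around sixty times slower than a keyboard - which is the gap an implanted interface would have to close to justify the surgery.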
Meier’s post is full of other interesting, less-controversial predictions about a future full of lighter, more power-dense batteries and ubiquitous connectivity, and I urge you to check it out if only for the exercise in imagining what our future holds.
What do you think? Do you agree / disagree with the preceding analysis?