The early ’90s were awesome. Bill Watterson was still drawing Calvin and Hobbes, the tattered remnants of the Cold War were falling down around our ears, and most of Wall Street was convinced the Macintosh was a computer for effete graphic designers and that Apple was more or less on its way out.
Into this time of innocence came a radical vision of the future, epitomized by the movie The Lawnmower Man. It was a future in which Hollywood starlets had virtual intercourse with developmentally challenged computer geeks in Tron-style bodysuits, and everything looked like it was rendered by a Commodore Amiga.
Anyway, at that time Virtual Reality was a Big Deal. Jaron Lanier, the computer scientist most closely associated with the idea, was bouncing from one important position to another, developing virtual worlds with head-mounted displays and, later, heading up the National Tele-immersion Initiative, “a coalition of research universities studying advanced applications for Internet 2,” whatever the heck that was.
Even so, some sensed that the technology wasn’t bringing about the revolution that had been promised. In a 1993 column for Wired that earns a 9 out of 10 for hilarity and a 2 out of 10 for accuracy, Nicholas Negroponte, founder of the MIT Media Lab (who I’m praying will have a sense of humor about this), asked the question that was on everyone’s mind: “Virtual Reality: Oxymoron or Pleonasm?”
It didn’t matter if anyone knew what he was talking about, because time has proved most of it to be nonsense:
“The argument will be made that head-mounted displays are not acceptable because people feel silly wearing them. The same was once said about stereo headphones. If Sony’s Akio Morita had not insisted on marketing the damn things, we might not have the Walkman today. I expect that within the next five years more than one in ten people will wear head-mounted computer displays while traveling in buses, trains, and planes…. One company, whose name I am obliged to omit, will soon introduce a VR display system with a parts cost of less than US$25.”
Affordable VR headsets were just around the corner, really? And the only real barrier to adoption, according to Negroponte? Lag. Computers in 1993 just weren’t fast enough to react in real time when a user turned his or her head, breaking the illusion of the virtual.
According to Moore’s Law, we’ve gone through something like ten doublings of computing power since 1993, so computers should be roughly a thousand times as powerful as they were when that column was written. Throw in the advances in massively parallel graphics processing brought about by the widespread adoption of GPUs, and we’re still not there.
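That back-of-the-envelope figure checks out. A quick sketch (the ten-doublings count is this article’s assumption, not a measured value):

```python
# Moore's Law arithmetic: each doubling multiplies available
# computing power by 2, so n doublings give a factor of 2**n.
doublings = 10
speedup = 2 ** doublings
print(speedup)  # 1024 - i.e. "about a thousand times as powerful"
```

Whether the real-world speedup over that stretch was 500x or 2,000x doesn’t change the punchline: raw compute grew by three orders of magnitude and consumer VR still didn’t arrive.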
So what was it, really, that kept us from getting to Virtual Reality?
For one thing, we moved the goalposts - now it’s all about augmented reality, in which the virtual is laid over the real. That brings a whole new set of problems: how do you make the virtual line up perfectly with the real when your head has six degrees of freedom and you’re outside, where there are few spatial referents for your computer to latch onto?
And, most important of all, how do you develop screens tiny enough to present the same resolution as a large computer monitor, but in something like 1/400th the space? This is exactly the problem that has plagued the industry leader in display headsets, Vuzix: their products are fine for watching movies, but don’t try using them as a monitor replacement.
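To get a feel for why that 1/400th figure is so brutal, here’s a rough sketch. Reading “space” as screen area, and taking a 24-inch 1920x1200 desktop monitor as the baseline (illustrative numbers I’m assuming, not Vuzix specs):

```python
import math

# Same pixel count in 1/400th the area means linear dimensions
# shrink by sqrt(400) = 20, so pixel density must rise 20x.
area_ratio = 400
linear_ratio = math.sqrt(area_ratio)  # 20.0

monitor_diag_in = 24                  # assumed desktop monitor
monitor_px = (1920, 1200)             # assumed resolution
monitor_ppi = math.hypot(*monitor_px) / monitor_diag_in

headset_ppi = monitor_ppi * linear_ratio
print(round(monitor_ppi), round(headset_ppi))  # roughly 94 vs 1887 PPI
```

A twenty-fold jump in pixel density is the kind of manufacturing problem that doesn’t yield to Moore’s Law on anything like Moore’s Law’s schedule.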
Consumer-level Virtual Reality, it turns out, is really, really hard - not quite Artificial Intelligence hard, but so much harder than anyone expected that people just aren’t excited anymore. The Trough of Disillusionment on this technology is deep and long.
That doesn’t mean Virtual Reality is gone forever - remember how many false starts touch computing had before technologists succeeded with, of all things, a phone?
And, as a coda: even though the public long ago gave up on searching for Virtual Reality, the news media never got tired of it. Which just shows you how totally out of touch we can be: