Steve Mann, a pioneer in the field of wearable computing, has been touting the benefits of head-mounted computers for decades. Now the University of Toronto professor is also lending his weight and experience to a company hoping to loosen Google Glass’s grip on the nascent market with a different take on computer glasses that merges the real and the virtual.
The company, Meta, is building computerized headwear that can overlay interactive 3-D content onto the real world. While the device is bulky, Meta hopes to eventually slim it down into a sleek, light pair of normal-looking glasses that could be used in all kinds of virtual activities, from gaming to product design. The company, which was founded by Meron Gribetz and Ben Sand, counts Mann as its chief scientist. One of Mann’s graduate students, Ray Lo, serves as chief technical officer. The company just completed a stint with Y Combinator, the successful startup accelerator based in Mountain View, California.
Meta’s clunky-looking initial product, called Space Glasses, is meant more as a tool for app developers than as a gadget you’d want to actually wear. It doesn’t have a built-in battery or central or graphics processors, so it needs to be physically tethered to a computer in order to work. It includes a see-through projectable LCD for each eye, an infrared depth camera, and a standard color camera, as well as an accelerometer, gyroscope, and compass. The second version of Space Glasses will be lighter and less bulky-looking, the team says, and will include a battery and central and graphics processors, as well as some changes to the software.
“I think it’s a really good time to enter into this world,” says Mann, who has been sporting his own custom glasses, and pushing the idea of head-worn computers, since the 1970s. As all kinds of wearable technology become cheaper and more widespread, “smart” glasses are leading the charge, helped by the promotion of Glass and a slew of other products from various companies. Market researcher IHS predicts that 124,000 pairs of smart glasses will ship this year, mostly to developers, up from 50,000 last year. IHS expects the figure to climb as high as 434,000 next year.
Space Glasses are not yet shipping widely, but a Kickstarter campaign seeking $100,000 to support the device’s creation brought in nearly double its goal. So far, more than 900 developers have paid hundreds of dollars apiece to get an early version of the glasses, which Meta recently began sending out, or preorder a sleeker-looking model, which is expected to go out to buyers in April (currently, both cost $667).
Space Glasses work, essentially, by building up a 3-D model of the world as you walk around, using an algorithm Meta built to track flat surfaces in real time; unlike some earlier augmented-reality systems, it needs no special physical markers. The coördinates from this tracking are relayed to the computer, which renders digital content in register with your immediate surroundings. This makes it possible, for example, to project a movie onto a chosen piece of paper, as the team showed me in person. Different people could view the same 3-D object from different angles, or a 3-D model could follow you around.
“You don’t need to change anything about the world you’re in for us to track it, which is a huge breakthrough,” Sand says.
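Meta hasn't published the details of its tracking algorithm, but the final step of any such system is the same: each frame, re-project the 3-D content anchored to a detected surface through the camera's current pose so the overlay stays pinned to the real world. A minimal sketch using a standard pinhole camera model follows; the function name, pose, and intrinsic values are all illustrative, not Meta's actual code.

```python
import numpy as np

def project_point(point_world, R, t, fx, fy, cx, cy):
    """Project a 3-D point (world frame) into pixel coordinates.

    R, t: camera pose (rotation matrix, translation), as a tracker
    like Meta's would estimate each frame.
    fx, fy, cx, cy: camera intrinsics (focal lengths, principal point).
    """
    # Transform into the camera frame, then apply perspective division.
    x, y, z = R @ point_world + t
    return np.array([fx * x / z + cx, fy * y / z + cy])

# Identity pose: camera at the origin, looking down +z.
R = np.eye(3)
t = np.zeros(3)

# One corner of a virtual "movie screen" anchored 2 m in front
# of the wearer, e.g. on a tracked sheet of paper.
corner = np.array([0.1, -0.05, 2.0])
uv = project_point(corner, R, t, fx=500.0, fy=500.0, cx=320.0, cy=240.0)
print(uv)  # pixel location where that corner of the overlay is drawn
```

As the tracker updates R and t with the wearer's head motion, re-running this projection for each corner of the overlay is what keeps the virtual movie glued to the moving sheet of paper.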
The team envisions Meta as a replacement for the standard computer and as something people can use together, whether they’re architects standing around a table to design a building or friends running around playing a shoot-’em-up game. Eventually, Gribetz hopes, Meta can build its technology into something even less obvious than glasses, like an optic-nerve implant.
That’s a long way away, though. While the splashy demo video on Meta’s site promises a range of interactive activities—such as virtually sculpting and then 3-D printing a vase, or playing a game of virtual chess with a friend—it’s all courtesy of special effects meant to give an idea of what the wearer will see. (The team says, though, that most of these demos have been built already and will soon be available to developers, along with a software development kit.) In the few minutes I tried it out, I couldn’t do much more than swipe some letters on a virtual keyboard projected in front of me and poke haphazardly at a 3-D mushroom.
The coolest demo I saw involved an animated clip projected onto a sheet of paper Lo held—and moved—in front of me. Though movie watching is a passive activity, this does give an idea of how well Meta’s real-time surface tracking works.
Natan Linder, a graduate student in the Fluid Interfaces Group at the MIT Media Lab, can imagine how a head-mounted computer like Meta might be useful to, say, prep you quickly for a conversation by showing you what an old friend has been up to lately, or show pilots information they need to see (a purpose for which head-mounted technology has been used in the past). He’s not convinced that such a device will be generally useful, though; he likens it to the Bluetooth earpieces some people wore constantly in years past, “only much worse.”
Still, he feels that Mann’s involvement lends cachet. “He basically started this whole thing,” he says. “If anybody can see it through and help them make it relevant, I think it would be him.”