Startup Wants You to Capture the World in 3-D
Mantis Vision is developing 3-D scanning technology that could end up in lots of tablets.
Cheap depth-capture technology could make it easy to map indoor spaces, create realistic models of objects, and chat in 3-D.
Gur Bittan envisions a future where you’re not just capturing a regular video of a child’s first steps with a smartphone; you’re doing it in 3-D, and sharing it with friends who can manipulate the video to watch it from different perspectives—even the kid’s point of view, provided you’ve scanned the scene from enough angles.
Bittan is the chief technology officer of Mantis Vision, an Israel-based 3-D technology company that hopes to make this kind of experience commonplace. If its 3-D technology is included in mobile gadgets like smartphones and tablets, it could make something as simple as communicating with friends more immersive.
The company’s software and hardware designs are part of Google’s Project Tango tablet, which can map environments and objects. Mantis Vision is also working on a pocket-sized 3-D scanner and already offers an enterprise 3-D scanner called the F5.
Mantis Vision has also been working with electronics designer and manufacturer Flextronics on a tablet called Aquila that should be available in September to manufacturers who want to take it into production. And its technology will be added to some other gadgets, though cofounder and CEO Amihai Loven won’t give specifics (Google has said it is working with LG on a Project Tango consumer device; Loven won’t say if his company is involved). “All I can say is in 2015 it will be in the market,” he says.
The company recently raised $12.5 million in venture capital funding from the venture investment arms of Flextronics and Qualcomm, as well as from Sunny Optical Technology and Samsung.
The method Mantis Vision uses to capture 3-D data—projecting an infrared light pattern onto the environment—is similar to that used by PrimeSense, a company Apple purchased last year. But Mantis Vision believes that its method, which works whether a camera is moving or still, maps detailed things in 3-D more easily and accurately than other technologies. And it hopes this will generate more interest from cell-phone and tablet makers, not to mention consumers. The uses the company envisions for 3-D include gaming, gestural interfaces, and indoor navigation.
To capture 3-D information, a projector overlays an infrared light pattern onto whatever it is you’re trying to scan—a teddy bear, for instance. Then a digital camera and a depth sensor, synced to the projector, capture the infrared light reflected off the bear. The technology works even in complete darkness, since it includes its own illumination; in bright environments the quality of the resulting image depends on the hardware used.
Via Skype, Bittan showed me a scan of a telephone, which looked as if it were covered with a bunch of interlocking letters in various shades of black, white, and gray, covered in turn with an evenly spaced grid of dots. Mantis Vision’s software analyzes the projected pattern and uses it to create a depth map of the object.
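Structured-light systems of this general kind recover depth by triangulation: the camera observes where each projected feature actually lands, and the shift (disparity) between its expected and observed position encodes distance. As a rough illustration of that geometric principle—not Mantis Vision’s proprietary method, and with made-up camera parameters—the core relation can be sketched like this:

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Triangulate depth (in meters) from the pixel shift of a
    projected feature between its expected and observed position.
    Uses the classic structured-light/stereo relation:
        depth = focal_length * baseline / disparity
    """
    disparity_px = np.asarray(disparity_px, dtype=float)
    depth = np.full_like(disparity_px, np.inf)
    valid = disparity_px > 0          # zero disparity => point at infinity
    depth[valid] = focal_px * baseline_m / disparity_px[valid]
    return depth

# Illustrative numbers only: 600 px focal length, 7.5 cm
# projector-to-camera baseline.
disparities = np.array([30.0, 15.0, 7.5])   # observed shifts, in pixels
print(depth_from_disparity(disparities, 600.0, 0.075))
# Nearby objects shift the pattern more, so larger disparities
# map to smaller depths.
```

Running this per pixel over a decoded pattern image is, in essence, how a projected grid of dots becomes a depth map.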
During an in-person meeting, Loven showed me, on his smartphone, a pixelated-looking 3-D video of a woman leaning back in a chair against a black background. He twisted the image around by sliding his finger on the screen to show it from another perspective, an effect made possible by having circled the woman with the camera while the video was shot. The result was not photorealistic, and there were plenty of black spots that were devoid of details, but it was pretty cool-looking.
Even the high-profile project with Google and the upcoming tablet may not be enough to win over consumers immediately: 3-D technology has existed for years in various forms, and it has struggled to move beyond the movie theater. But Loven says that’s because the technology is still “not mature enough.” Mantis Vision hopes to change that. “Let’s develop new technology and bring it,” he says.