Facebook’s Live-Action Camera Systems Let You Take Steps in Virtual Places

New VR cameras will be great for live events and virtual tourism. Oh, and probably porn, too.

Virtual reality has an image problem.

While plenty of cameras out there capture spherical footage of real-life scenes that you can then look at in VR (see “10 Breakthrough Technologies 2017: The 360-Degree Selfie”), most of them don’t also capture depth data. This means that if you’re wearing a high-end virtual-reality headset like the Oculus Rift and looking at a spherical video of, say, the Eiffel Tower (rather than a computer-rendered version of it), the image will move with you when you crouch, jump, or step from side to side. This is annoying at best, and nausea-inducing at worst.

Facebook, which has been one of the biggest proponents of virtual reality since purchasing headset maker Oculus in 2014, is aiming to fix this with two new spherical camera systems unveiled on Wednesday. Both shoot live-action footage that lets you move around in about a meter and a half of space in virtual reality. The company plans to get them into production later this year.

Called X24 and X6 (the numbers refer to the quantity of individual cameras in each model), the camera systems could make virtual experiences like watching concerts, visiting famous landmarks, or exploring museums much more engaging, whether you do so on your own or with another person. To really take advantage of the footage, though, you'll still need a headset that can track both your position in space and the rotation of your head.

“What we are trying to do with VR in general is bring people up the immersion curve,” said Facebook chief technology officer Mike Schroepfer. “The end vision is [to get you] as close as you can to feeling like you’re actually there.”

A handful of companies have already built high-end cameras for capturing live-action virtual reality, such as Nokia’s Ozo camera, Google’s Jump, and Lytro’s Immerge (see “Lytro Is Building a Camera to Capture Live-Action Virtual Reality”). But only a couple (Lytro’s being one of them) purport to film footage and record depth information as Facebook says its X24 and X6 do, and no company has yet popularized such a device.

While the X24 and X6 are meant for professionals, Schroepfer said the technology will eventually lead to consumer products as well.

The new camera systems, which were introduced at Facebook’s annual F8 developer conference, come a year after the social network rolled out its UFO-like Surround 360, which had 17 cameras and was meant to capture crisp, spherical 3-D images. Facebook didn’t sell the Surround 360, but it made the technology available via GitHub to anyone who wanted to make one with off-the-shelf parts.

The Surround 360 didn’t include the kind of 3-D information that the X24 and X6 will record, though. With their lens positions and accompanying software, the new systems can reconstruct what the world should look like as you move around, Schroepfer said.

In demos of raw footage shot with the X24 that I viewed through an Oculus Rift headset last week, I saw a spherical scene of a lush rainforest exhibit from the vantage point of a catwalk near the top of the exhibit, with butterflies flitting by, as well as a tunnel inside an aquarium with fish swimming around. Because the footage included depth information and the headset can track movement with six degrees of freedom, I could move around in the scenes, checking out the trees in the rainforest, the tourists on benches in the tunnel, and fish wandering above and around us.

Much of the footage looked crisp, and it was very cool to be able to move around freely in a lifelike scene. Yet the technology still needs work, or at least some editing to clean up the footage: the foliage in the rainforest looked streaky, for example, and I noticed some shimmering elsewhere, too.

Brian Cabral, Facebook’s director of engineering and leader of the team that made the devices, said that the physical arrangement of the cameras makes it possible to capture each pixel in a given scene from many different angles, and then math is used to estimate its depth.

“Once you know where it is in the scene, you can move around,” he said, adding that the shimmering will be cleaned up as the systems go into production.
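The core idea Cabral describes, seeing the same point from multiple camera positions and using geometry to work out how far away it is, is the classic triangulation behind stereo depth estimation. As a rough illustration (not Facebook's actual pipeline, and with made-up camera parameters), here is a minimal sketch of how depth falls out of the shift, or disparity, of a point between two views:

```python
# Minimal depth-from-disparity sketch. The focal length and baseline
# values below are illustrative placeholders, not the X24's real specs.

def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Triangulate the depth (in meters) of a point seen by two cameras.

    focal_length_px -- camera focal length, in pixels
    baseline_m      -- distance between the two camera centers, in meters
    disparity_px    -- horizontal shift of the point between the two images
    """
    if disparity_px <= 0:
        raise ValueError("point must shift between views to triangulate")
    # Similar triangles: depth is inversely proportional to disparity.
    return focal_length_px * baseline_m / disparity_px

# A point that shifts 50 pixels between two cameras 0.1 m apart,
# imaged with a 1,000-pixel focal length, lies 2 meters away.
print(depth_from_disparity(1000.0, 0.1, 50.0))  # → 2.0
```

With many cameras instead of two, each pixel gets several such estimates, which is what lets software reconstruct the scene well enough for a viewer to shift their vantage point after the fact.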

Cabral said Facebook will license the technology to several as-yet-unnamed partners so they can make cameras.

“The idea is to have multiple models of growing the ecosystem,” he said.
