
Look Before You Leap Motion

Leap Motion’s low-cost gesture-control device is not as easy to use as you might think.
July 22, 2013

For the past couple days, I’ve been gesticulating even more than normal—at times, subtly, at other times, wildly—while getting to know the latest in gesture-control technology: the Leap Motion controller.

Hands up: The $80 Leap Motion controller enables gesture recognition on a Mac or PC.

The device has been long anticipated thanks to its low cost ($80), unobtrusive sardine-can size, and the accuracy and ease of use suggested by some impressive demo videos, so I was pretty excited to try it out when it was released today. Gestural interfaces like the Leap Motion controller and Microsoft’s Kinect have generated a lot of buzz over the past few years, and hopes are high that they’ll eventually become as common as the mouse and keyboard, if not supplant them entirely.

And while it hasn’t been long since I leapt into using the Leap (I had only a few days to try it out), I’m sad to report that, so far, it has fallen flat.

It was easy to set the device up—you plug the sleek-looking Leap Motion controller into one of your computer’s USB ports, and download the software for your operating system from the Leap Motion site.

As the company notes, most of Leap Motion’s innovation is on the software side, not the hardware side: the device houses infrared LEDs and two cameras underneath its black glass top (see “Leaping Into the Gesture-Control Era”); the software tracks the movement of your fingers as you move them above the sensor.

The company says its finger tracking is accurate to a hundredth of a millimeter. If you open the diagnostic visualizer, which shows a sort of skeletal view of what the Leap Motion is tracking, it does look like the device is keeping tabs on every move your fingers and hands make, even when you use both hands. It doesn’t require any calibration to work, though you can calibrate it in the Leap Motion controller settings if you want.
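To give a sense of the data that skeletal view is drawing from, here is a minimal sketch of how a developer might poll the controller’s tracking output, assuming the Leap Motion Python SDK of the time; the names used here (Leap.Controller, frame(), hands, fingers, tip_position) are based on that SDK and may differ by version.

```python
# Minimal sketch: poll the Leap Motion controller and print fingertip positions.
# Assumes the Leap Motion Python SDK circa 2013; class and attribute names
# (Leap.Controller, frame(), hands, fingers, tip_position) may differ by version.
import time

import Leap  # bundled with the Leap Motion SDK, not installed from PyPI


def main():
    controller = Leap.Controller()
    time.sleep(1)  # give the controller a moment to connect

    for _ in range(20):
        frame = controller.frame()  # the most recent tracking frame
        print("frame %d: %d hand(s), %d finger(s)"
              % (frame.id, len(frame.hands), len(frame.fingers)))

        for finger in frame.fingers:
            tip = finger.tip_position  # millimeters, relative to the device
            print("  fingertip at (%.1f, %.1f, %.1f)" % (tip.x, tip.y, tip.z))

        time.sleep(0.1)


if __name__ == "__main__":
    main()
```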

Most people will use the Leap Motion controller mainly through Airspace, a desktop app downloaded as part of the initial setup. Airspace includes a link to the Airspace Store, where you can get Leap Motion-enabled apps (some are free; others cost 99 cents and up). Leap executives told me there would be over 75 apps in the store at launch, and when I was trying the device out, they included well-known games like Fruit Ninja and Cut the Rope, as well as the drumming app AirBeats and the skull-dissection app Cyber Science—Motion.

I set it up on both a Mac and a PC and downloaded and tested a variety of apps for both, including the ones mentioned above, a Corel drawing app, several more games, and a utility for controlling my computer. There were some bright spots, but mostly I was frustrated: I didn’t see on screen what I expected to happen as I moved my hands.

Cut the Rope and Cyber Science—Motion are examples of how Leap Motion can work pretty well. If you’re unfamiliar with Cut the Rope, it involves cutting ropes to swing a piece of candy into the mouth of a little green monster. Once I got the hang of how much I needed to move my hand to chop away, I had fun and was actually able to beat the game. Similarly, with Cyber Science—Motion, I figured out how to manipulate the on-screen 3-D skull so I could look at it from different angles, zoom in or out, and select and remove pieces (including individual teeth).

Generally, though, it felt like I could never quite get the controls to work as deliberately as I wanted. With the AirBeats app, for instance, I tried moving my hand in a consistent pattern to hit the on-screen bass drum, but it wouldn’t play a consistent beat. And with Painter Freestyle, I had a hell of a time controlling the position of the virtual brush, switching between brush types, or changing colors. I didn’t have much luck using it to control a Word document or my Web browser, either.

It was irritating to keep trying to select or manipulate items on the screen without getting it right. I tried calibrating the Leap Motion controller and switching from one operating system to the other, but neither made much of a difference. I also tried different lighting conditions, including a room with hardly any outside light, yet that didn’t make things much better.

Even if it worked perfectly, I’m not sure how useful it would be to the average person. I could see it enhancing certain computer-aided tasks, like drawing, modeling, and virtual dissections, as well as making it easier to surf the Web. Yet I’m not convinced it would make these activities that much easier or better than performing them with existing tools.

I also noticed something that doesn’t usually happen when using a mouse and keyboard, even though I’m routinely in front of a computer for seven or more hours a day: after an hour or so, my right arm felt really tired, all the way up to my shoulder. Even when I started fresh the next day, making motions as small and precise as I could, it still started to bug me after a while.

As I mentioned previously, I didn’t have much time to use the Leap Motion controller—about a day and a half, at most—and it’s certainly possible that with more time I’d feel more adept at using it. I’m also confident that it will improve in time. For now, though, I’m not leaping for joy.
