
Unmanned aerial vehicles (UAVs for short) have proved their usefulness as military tools. But most UAVs aren’t truly autonomous: they’re operated remotely by a human controller from the ground.

To become truly autonomous, UAVs will need to get far better at sensing obstacles and reacting in time to avoid a collision. This will be especially important if they are ever to operate in commercial airspace.

Sanjiv Singh, a professor and researcher at Carnegie Mellon University, has developed a new system to help UAVs do just this.

Since most UAVs are fairly small and lightweight, they can’t carry the heavy, power-hungry sensors that larger aircraft can use to detect other planes. So Singh and student Debadeepta Dey developed an algorithm that uses an ordinary camera and several software programs to detect potential obstacles.

Their sense-and-avoid system works across a wide field of view, at distances of up to three miles, and in a wide range of weather conditions. It does this by finding contrasting points in a video image (such as a dark spot against white clouds) and tracking them from frame to frame to determine movement.
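For readers curious what this kind of contrast-point tracking looks like in practice, here is a minimal sketch using OpenCV's corner detector and Lucas-Kanade optical flow. It is not the CMU team's software; the input file name, parameter values, and display logic are illustrative assumptions.

import cv2

cap = cv2.VideoCapture("sky_footage.mp4")  # hypothetical input clip
ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Detect high-contrast points (e.g., a dark speck against bright clouds).
points = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                 qualityLevel=0.01, minDistance=10)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    if points is None or len(points) == 0:
        # Lost all tracks: re-detect contrast points and move on.
        points = cv2.goodFeaturesToTrack(gray, 200, 0.01, 10)
        prev_gray = gray
        continue

    # Track each point into the new frame with pyramidal Lucas-Kanade flow.
    new_points, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray,
                                                     points, None)

    for new, st in zip(new_points, status):
        if st[0] == 1:
            # Points that keep moving across frames are candidate obstacles;
            # mark them so later logic (or an operator) can inspect them.
            x, y = new.ravel()
            cv2.circle(frame, (int(x), int(y)), 4, (0, 0, 255), -1)

    cv2.imshow("candidate obstacles", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

    prev_gray = gray
    points = new_points[status.flatten() == 1].reshape(-1, 1, 2)

cap.release()
cv2.destroyAllWindows()

The real system must go much further than this sketch, of course: it has to decide which tracked points are genuinely on a collision course and which are clutter such as dust or insects, as the video below illustrates.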

In the video below, the system outlines moving objects in red, such as a plane (marked with a green box). It also identifies the characteristic movement of dust on the lens (blue), as opposed to a flying obstacle.


“We have proved that sense and avoid for unmanned aerial vehicles using passive sensors is a very real possibility, and with some more time and maturity, this will evolve into a deployable standard technology,” says Dey, who presented details of the system at the International Conference on Field and Service Robotics yesterday.

The sense-and-avoid system can pick out a small, two-seater plane from five miles away, says Dey. So far, he and Singh have tested it from the ground using real aircraft. It currently produces some false positives (identifying insects as planes, for example), so the researchers plan to couple a lidar sensor to the camera. By bouncing a laser beam off the obstacle, the lidar will measure its distance, helping to determine whether the object is really a plane on a collision course or just an insect hitching a ride.
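To give a sense of how such a camera-plus-lidar cross-check might work, here is a hedged sketch. The lidar_range_at helper and the range thresholds are hypothetical assumptions for illustration, not part of the researchers' system.

def is_likely_aircraft(track_bearing, lidar_range_at,
                       min_range_m=50.0, max_range_m=8000.0):
    """Reject camera detections that have no plausible lidar return.

    Assumption: lidar_range_at(bearing) is a hypothetical helper that
    returns a range in meters along the camera bearing of a tracked
    point, or None if the laser gets no return.
    """
    rng = lidar_range_at(track_bearing)
    if rng is None:
        # Nothing out there at aircraft-like distances: likely dust on
        # the lens or an insect a few centimeters from the camera.
        return False
    return min_range_m <= rng <= max_range_m

The design idea is simply that a genuine aircraft on a collision course returns a range within the sensor's reach, while lens debris and nearby insects do not, which is what lets distance measurement cull the camera's false positives.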

