Tiny Drones That Navigate with Insect Eyes
A tiny, biologically inspired motion sensor could help small drones avoid collisions as they buzz around.
Small flying drones could perform valuable tasks in a much wider range of environments than larger, more conventional unmanned aerial vehicles.
A tiny artificial eye inspired by the vision systems of insects could help small flying drones navigate their surroundings well enough to avoid collisions while buzzing around in confined, cluttered spaces—a key step in making these small autonomous flying vehicles practical.
An emerging class of very small flying drones has taken off in public and private research labs in recent years (see “Robotic Insect Takes Off”). These mini drones could be valuable in spying and surveillance; they might also be useful for tasks like monitoring disaster areas or delivering supplies to humans. But much work remains to be done on miniature navigation systems, particularly for confined spaces. Just avoiding collisions is still a major technical challenge, says Dario Floreano, director of the Laboratory of Intelligent Systems at the Swiss Federal Institute of Technology.
Some researchers have attempted to address this problem with digital cameras, but these are bulky, and the need for a small, very lightweight package has led researchers, including Floreano, to look to insect vision for insights. Flying bugs avoid collisions thanks to tiny eyes that have low spatial resolution but are highly sensitive to changes in the way light is reflected as the insect moves, or as an object moves through its field of view. The new sensor his group recently unveiled weighs only two milligrams and takes up only two cubic millimeters, and it can detect motion, in conditions ranging from a poorly lit room to very bright sunlight outdoors, three times faster than fast-flying insects, says Floreano.
The artificial eye is composed of a lens on top of three electronic photodetectors arranged in a triangular pattern. By combining measurements of the individual photodetectors, the device can sense the speed and direction of motion in its view.
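The article does not give the sensor's actual signal processing, but the principle it describes, combining three closely spaced brightness readings to recover the speed and direction of motion, resembles classic optic-flow estimation under a brightness-constancy model. The sketch below is purely illustrative: the triangular detector layout in `POS`, the plane-fit gradient, and the normal-flow formula are assumptions, not the group's published method.

```python
import numpy as np

# Hypothetical triangular layout of the three photodetectors under the
# lens (arbitrary units); the real geometry is not given in the article.
POS = np.array([[0.0, 1.0], [-0.87, -0.5], [0.87, -0.5]])

def estimate_flow(prev, curr, dt):
    """Estimate a normal optic-flow vector from two successive readings
    of the three photodetectors, assuming brightness constancy:
    I_t + grad(I) . v = 0."""
    prev, curr = np.asarray(prev, float), np.asarray(curr, float)
    # Spatial gradient: fit a plane I(x, y) = a*x + b*y + c to the
    # time-averaged intensities at the three detector positions.
    A = np.column_stack([POS, np.ones(3)])
    a, b, _ = np.linalg.lstsq(A, (prev + curr) / 2.0, rcond=None)[0]
    g = np.array([a, b])
    # Temporal derivative, averaged over the three detectors.
    I_t = np.mean(curr - prev) / dt
    # Normal flow: the motion component along the brightness gradient.
    return -I_t * g / (g @ g)
```

For example, an intensity ramp sliding across the detectors in the +x direction yields a flow estimate pointing along +x; with only three samples, only the component of motion along the local gradient is recoverable, which is why real systems combine many such elementary eyes.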
Algorithms for processing the signals have already been developed, and they can be programmed into small chips to compute things like distance to objects or the time until a potential collision. One focus of the group’s current work is integrating this system into “very small aerial platforms” like the foldable quadrotor the lab recently developed. The challenge, says Floreano, will be to combine multiple artificial eyes into configurations that allow the drone to “see all around” and avoid collisions, stabilize its flying position, land, and take off. He says the elementary eyes are particularly suited for drones that weigh 50 grams or less, and which cannot lift a payload larger than a few grams.
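The article does not specify how time-to-collision is computed, but a standard approach in insect-inspired vision is the "time-to-contact" (tau) of an expanding image: the ratio of a feature's distance from the focus of expansion to its outward flow speed. The helper names and the 0.5-second threshold below are illustrative assumptions, not values from Floreano's group.

```python
def time_to_contact(r, v_r):
    """Estimate time-to-contact in seconds: tau = r / (dr/dt), where r
    is a feature's distance from the focus of expansion on the image
    plane and v_r is its outward (radial) flow speed."""
    if v_r <= 0:
        return float("inf")  # image not expanding: nothing approaching
    return r / v_r

def collision_alert(features, tau_threshold=0.5):
    """Flag an impending collision when the median time-to-contact of
    the tracked features drops below tau_threshold seconds.
    `features` is a list of (r, v_r) pairs."""
    taus = [time_to_contact(r, v) for r, v in features]
    finite = sorted(t for t in taus if t != float("inf"))
    return bool(finite) and finite[len(finite) // 2] < tau_threshold
```

The appeal of tau for milligram-scale hardware is that it needs no knowledge of the obstacle's actual size or distance, only image-plane quantities the elementary eyes already measure.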
The sensor could be useful for other things besides flying robots. As a demonstration, the group created what Floreano calls “vision tape,” a flexible patch containing many artificial eyes. The tape can be attached to any curved surface, including other kinds of robots, vehicles, and even furniture and clothing, he says.