Snipers have always been the soldier’s bane. Today, the problem is especially acute for U.S. troops engaged in urban combat in Iraqi cities. When gunmen hidden inside buildings shoot at troops, they’re abetted by echoes coming off surrounding buildings, which obscure the source of the shots.
But help is on the way.
In Boston last week, a crowd gathered in the noise-filled, two-story atrium of the Boston University Photonics Center to watch a man simulate gunshots by banging on a well-dented metal panel. Each time he repositioned himself – on a set of stairs, for example – and struck the metal, a small, suitcase-sized robot would instantly swivel its cigar-box-shaped head and aim two clusters of bright-white LEDs at the metal panel.
The robot’s all-important head, named the Robot Enhanced Detection Outpost with Lasers, or Redowl, is the creation of a team at Biomimetic Systems in Roslindale, MA, led by Socrates Deligeorges, along with professors at BU. It’s the most recent of several new technologies designed to help troops quickly identify the source of enemy attacks.
BBN Technologies of Cambridge, MA, for instance, has already sent more than 100 Humvee-mounted devices to Iraq for detecting gunshots. Their system includes a two-meter mast with a half-meter microphone cluster, and weighs about 25 kilograms. Radiance Technologies of Huntsville, AL, sells another gunshot detection device, a box about the size of the BU robot, but weighing nine kilograms.
But Redowl stands out because of its small size and much lower weight. The acoustic detector, mounted as the “head” of the robot, fits into a box the size of a hard-bound book. And its developers say a new version using digital electronics should fit into a space the size of two cigarette packs. The entire Redowl system, excluding the robot it’s mounted on, weighs about two kilograms, around one-quarter the weight of the Radiance device.
The Redowl system can be so small and light because it’s composed mainly of electronics and software. Deligeorges says he developed the system to mimic the human auditory system, which can detect the direction of a sound using two natural sensors, the ears, located just a few centimeters apart.
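The two-sensor principle Deligeorges describes can be sketched in a few lines: the tiny difference in a sound's arrival time at two closely spaced microphones, recovered here by brute-force cross-correlation, yields a bearing. A minimal sketch, assuming illustrative values for the sensor spacing and sample rate (these are not Redowl's actual specifications):

```python
import math

# Illustrative parameters, not Redowl's actual hardware specs.
SPEED_OF_SOUND = 343.0   # m/s in air
MIC_SPACING = 0.05       # two sensors a few centimeters apart
SAMPLE_RATE = 48000      # samples per second

def estimate_delay(left, right, max_lag):
    """Brute-force cross-correlation: find the lag (in samples)
    that best aligns the two channels."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        lo, hi = max(0, -lag), min(len(left), len(right) - lag)
        score = sum(left[i] * right[i + lag] for i in range(lo, hi))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

def bearing_from_delay(lag_samples):
    """Convert the inter-sensor delay into an angle off the array's axis."""
    delay = lag_samples / SAMPLE_RATE
    # Clamp to the physically possible range before taking the arcsine.
    x = max(-1.0, min(1.0, SPEED_OF_SOUND * delay / MIC_SPACING))
    return math.degrees(math.asin(x))
```

With these assumed parameters, a pulse reaching the second sensor three samples late corresponds to a bearing of roughly 25 degrees off the array's axis.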
“Our external ears, our ear canal, and our middle-ear bones, like the ear drum and the associated pieces, all perform processing that enhances certain features of a sound,” says Deligeorges. He and his advisors at BU, biomedical engineer David Mountain and electronics engineer Allyn Hubbard, studied each part of this mechanism step by step. Then they built “a very intricate mechanical model” of how the ear translates pressure waves in the air into neural signals.
To complete the system, they constructed neural-network circuitry that mimics the behavior of nerve cells. Redowl is not so much programmed as it is trained to recognize gunshots, Deligeorges explains. When it’s exposed to a sound, it guesses the location of its source. The researchers feed in the difference between the guess and the correct location, and the trial is run again. Each time, the “neural” connections in Redowl change slightly until the robot can always guess correctly.
“As long as we know how the processing works in a biological system and what’s important, we can take the best part of the biology and the best part of the electronics and merge them,” says Deligeorges. Redowl’s electronics, for example, make it capable of reacting much faster than the human brain.
To account for the intricate and confusing surfaces that reflect sound in an urban environment, Deligeorges has built echo suppression into Redowl. The system recognizes the distinct soundprint of a gunshot – both the initial blast and the shockwave from the bullet – and stores it in memory. Since the echoes that follow will have a similar print, the system can ignore them.
In addition to suppressing echoes, this soundprinting capability can distinguish an AK-47 from an M-16, and both from city background noise, such as a car backfiring, says Glenn Thoren, deputy director of the BU Photonics Center.
In fact, Thoren has a more ambitious system in mind. Already, Redowl can illuminate the target, something other devices don’t do. But Thoren wants to integrate Redowl’s acoustic sensors with optical sensors and other types of detectors. His device would include multiple infrared lasers for pointing to the target, along with a 300x zoom lens and a laser rangefinder. An onboard GPS unit would translate a shooter’s calculated position into geographical coordinates. Such a robot “scout” could move ahead of troops into dangerous locations, such as buildings and open intersections.
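The GPS step Thoren envisions amounts to projecting a point from the robot's own fix using the acoustic bearing and the rangefinder distance. A flat-earth sketch, adequate at rifle ranges and purely illustrative (the function name and parameters are assumptions, not part of any described system):

```python
import math

EARTH_RADIUS = 6371000.0  # meters, mean radius

def shooter_coordinates(robot_lat, robot_lon, bearing_deg, range_m):
    """Project the shooter's lat/lon from the robot's GPS fix, the
    acoustic bearing (degrees clockwise from north), and a laser
    rangefinder distance, using a flat-earth approximation."""
    b = math.radians(bearing_deg)
    dlat = range_m * math.cos(b) / EARTH_RADIUS
    dlon = (range_m * math.sin(b)
            / (EARTH_RADIUS * math.cos(math.radians(robot_lat))))
    return robot_lat + math.degrees(dlat), robot_lon + math.degrees(dlon)
```

At the few-hundred-meter distances involved, the error from ignoring the Earth's curvature is negligible next to the bearing and range uncertainty of the sensors themselves.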
In the Boston University demonstration, Redowl was mounted atop a PackBot, a workhorse robot made by Burlington, MA-based iRobot. More than 300 PackBots with other types of attachments have been deployed in Afghanistan and Iraq to explore caves and ammunition dumps and to dispose of roadside bombs. U.S. soldiers have even adapted the robots to carry supplies, such as bullets and water, to pinned-down troops.
“One of the things that’s key to robots, and to robots’ contribution to network-centered warfare, is speed,” says Joe Dyer, a retired Navy vice admiral who’s now general manager of iRobot’s Government and Industrial Robots division. “Without information, you’re huddled down trying to figure out what to do next.” But with Redowl as its nervous system, Dyer says, the PackBot can help “resolve quickly where a shot is fired from and [let] you take direct action.”
If a sniper disabled a Redowl-bearing PackBot by shooting at it, however, wouldn’t the soldiers be as vulnerable as before? “Were I the bad guy, I wouldn’t shoot the robot,” says Dyer. If the sniper misses, that one shot would give away his location – and fast. What’s more, a single direct hit would be unlikely to destroy the robot. “Realistically, it ain’t gonna happen,” Dyer says.