Engineers at the Charles Stark Draper Laboratory in Cambridge, MA, are developing a guidance, navigation, and control system for lunar landings that includes an onboard hazard-detection system able to spot craters, slopes, and rocks that could be dangerous to landing craft. In the Apollo missions of 40 years ago, astronauts steered the lander to a safe spot by looking out the window; the lander itself “had no eyes,” says Eldon Hall, a retired Draper engineer and one of the original electronics designers for Apollo’s navigation computer.
That meant there were some close calls with Apollo, says Tye Brady, the technical director for lunar landing at Draper, who demonstrated his team’s automated-landing and hazard-avoidance technology at last week’s celebration of the 40th anniversary of Apollo 11. “They were really close,” Brady says, “and one- to two-meter craters are deadly. You don’t see them till the last minute.” Apollo 11 astronaut Neil Armstrong had to steer past a field of rocks that didn’t show up on any recon photos beforehand, and Apollo 14 landed at a precarious tilt with one footpad resting about a meter away from a crater.
The new navigation and guidance system is being developed for NASA’s Altair lunar lander, which is scheduled to land on the moon by 2020 as part of the Constellation program. The project is headed by NASA’s Johnson Space Center, with support from other NASA research facilities in addition to Draper Laboratory. The Jet Propulsion Laboratory recently completed a field test of the sensors and mapping algorithms, and it plans to begin full systems tests in May 2010.
Brady says that even the best cameras today, such as those on the orbiter now circling and photographing the moon, cannot resolve smaller holes or boulders at projected landing sites, even in smooth, well-lit areas, which aren't the targets for NASA's future landings. Altair is meant to be capable of landing at any site on the moon's surface, and the lunar terrain will vary. For that, Brady says, "you need real-time hazard detection" to adjust as you go.
Draper's system will use LIDAR (light detection and ranging), a laser-based sensing technology, to scan an area for hazards like craters or rocks before the lander touches down on the moon's surface. Raw data from LIDAR is processed and assembled into a 3-D map of the moon's surface, using algorithms developed by the Jet Propulsion Laboratory. One advantage of using LIDAR is that "it's the only type of sensor that measures the 3-D shape of what's on the ground at high resolution and from high altitude," says Andrew Johnson, the JPL lead for the hazard-detection system. That allows the system to build a terrain and elevation map of potential landing sites onboard the spacecraft, but from high enough up that there is time to respond to obstacles or craters at the landing site.
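The mapping step Johnson describes can be sketched in miniature. The hypothetical function below bins raw LIDAR returns (x, y, z points) into a gridded elevation map; the function name, the gridding scheme, and the max-height-per-cell rule are all illustrative assumptions, a crude stand-in for JPL's actual surface-reconstruction algorithms, which the article does not detail.

```python
import numpy as np

def elevation_map(points, cell_size):
    """Bin lidar returns (x, y, z) into a gridded elevation map.

    Hypothetical sketch: keeps the highest z value per grid cell,
    so rocks and crater rims stand out against the surrounding terrain.
    Cells with no returns stay NaN (unknown, hence unsafe to assess).
    """
    points = np.asarray(points, dtype=float)
    xy = points[:, :2]
    origin = xy.min(axis=0)                      # corner of the scanned patch
    idx = np.floor((xy - origin) / cell_size).astype(int)
    nx, ny = idx.max(axis=0) + 1
    grid = np.full((nx, ny), np.nan)
    for (i, j), z in zip(idx, points[:, 2]):
        if np.isnan(grid[i, j]) or z > grid[i, j]:
            grid[i, j] = z
    return grid
```

From such a grid, per-cell slope and roughness estimates can be derived by comparing neighboring cells, which is the kind of terrain information the hazard-detection step needs.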
Once the map is built, the system designates safe sites based on factors like the tilt angle of the surface, the distance and fuel cost to get to a site, the position of the lander’s footpads, and the crew’s margin for safe distance from hazards. Based on that information, the navigation system presents astronauts with a prioritized list of three to four safe landing sites. The astronauts can then designate any of the sites as first choice, or if they are incapacitated, the system will navigate the lander automatically to the first site on its list.
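The selection logic described above, hard safety constraints followed by a short prioritized list, can be sketched roughly as below. The field names, weights, and thresholds are illustrative assumptions, not Draper's actual criteria, which also account for footpad positions and crew safety margins.

```python
def rank_sites(sites, max_tilt_deg, min_hazard_clearance, top_n=4):
    """Return up to top_n candidate landing sites, best first.

    Hypothetical sketch: each site is a dict with 'tilt_deg' (surface
    tilt), 'fuel_cost' (cost to divert there), and 'hazard_clearance'
    (meters to the nearest detected hazard). Sites violating hard
    constraints are discarded; the rest are sorted by a weighted cost.
    """
    safe = [s for s in sites
            if s['tilt_deg'] <= max_tilt_deg
            and s['hazard_clearance'] >= min_hazard_clearance]

    def cost(s):
        # Illustrative weighting: prefer cheap-to-reach, flat sites
        # with generous clearance (lower cost is better).
        return s['fuel_cost'] + 2.0 * s['tilt_deg'] - 0.5 * s['hazard_clearance']

    return sorted(safe, key=cost)[:top_n]
```

In the real system the crew could override this ordering and pick any site on the list; only if they are incapacitated would the top-ranked site be used automatically.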
The ability to land autonomously will enable both crewed and robotic missions to land safely, Brady says (while Apollo’s lunar module had an automatic landing mode, it was never used). In addition to NASA’s Altair, the system could be integrated into vehicles landing on near-Earth asteroids, Mars, and other planets, or used with other lunar vehicles built by private groups.
Another advantage of using LIDAR, Johnson says, is that it works under any lighting conditions. To deal with light at the moon's equator, where a "day" lasts the equivalent of 14 Earth days and a "night" lasts another 14, Apollo missions had to be timed exactly, with just one launch opportunity per month, so NASA could control the craft's exposure to light and heat. Lighting conditions are more varied and extreme at the moon's poles, with patches of light and dark cast by the shadows of mountains and deep craters, making it difficult for astronauts to navigate by sight. LIDAR allows the craft to "land at night, or in shadowed regions, because the light is provided by the LIDAR sensor, not the sun," Johnson says. With real-time hazard detection, he says, the launch and landing limitations of Apollo won't apply to future missions.
The challenge for a landing system, says Brady, is getting everything to happen in about 120 seconds, including hazard-detection scans to get the data, human interaction for site approval, and then hazard-avoidance maneuvers and touchdown. His team has developed a simulator to create realistic image maps of the moon’s surface, in addition to using computer code from NASA for the guidance and navigation portion of the system. So far, about 20 astronauts have sampled the Draper simulation. “They’re good at going slow and easy, and they’re very patient,” Brady says. “They do a good job relying on the system.” That’s a long way from the early days when the Apollo astronauts “wanted to fly the whole thing themselves,” Hall says.
The Draper team continues to develop high-fidelity models of LIDAR and terrain maps, while coordinating with NASA’s crew office to determine the best way to display information for astronauts. They aim to have the technology ready by 2012.