In 1989, Mary (Missy) Cummings landed a fighter jet on an aircraft carrier for the first time. But the young pilot’s elation did not last long. Minutes later, a close friend died while attempting the same landing.
Although the U.S. Navy never determined what caused that crash on the USS Lexington in the Gulf of Mexico, Cummings suspects that technology played a role. “I can’t tell you how many friends died because of bad designs,” she recalls of her years in the military. “As a pilot, I found it incredibly frustrating to work with technology that didn’t work with me.” That’s one of the reasons why, after more than a decade behind the controls of fighter jets, Cummings decided to shift gears and help design those controls herself. Now, as an associate professor at MIT and director of the Humans and Automation Laboratory (HAL), she’s a leading practitioner of “human factors” engineering, which focuses on making technical systems safer and more intuitive to use.
Cummings began flying planes after graduating from the U.S. Naval Academy in 1988 and received her master’s degree in space systems engineering from the Naval Postgraduate School in 1994. By the time the policy barring women from operational combat squadrons was ended in 1993, she had established herself as an accomplished pilot. She was selected for the first group of women to fly the F/A-18 Hornet, one of the most technologically advanced fighter jets.
But the new policy hardly eliminated sexism in the armed forces. “It’s no secret that the social environment wasn’t conducive to my career,” says Cummings. “Guys were resentful that a woman was doing a job in what was so clearly a warrior realm and, in my case, involved a blonde girl taking on the role of a killer.” (She detailed this experience in her 2000 book Hornet’s Nest.) After one particularly troubling episode (she reports that her plane’s radio frequencies were accidentally switched and her male colleagues intentionally gave her dangerous taxiing instructions), she took a step back and reëvaluated her career. She left the military in 1999.
With more than a decade of her life invested in the service, however, Cummings knew she wanted to enter a related field. Having been frustrated by unintuitive cockpits, radar screens, and hand controls, she wanted to make these systems easier to use so that pilots could focus on the objectives of the mission. She began taking classes in human cognition and psychology, and in 2003 she finished her PhD in systems engineering at the University of Virginia. That year she arrived at MIT, joining the faculty in the Department of Aeronautics and Astronautics and the Engineering Systems Division. And in 2004 she founded HAL, which helps people work more effectively with computers to supervise complex automatic control systems like the ones that regulate nuclear reactors or air traffic patterns.
Learning from boredom
Although she misses flying, Cummings finds a similar thrill in her research. She explores how design features like display screens can affect human factors such as how long people are able to focus on a system, how much they trust it, and whether they feel confident or frustrated when they use it. Much of her work is geared toward preventing the people who monitor automated systems from becoming bored, which is what happened last fall when two Northwest Airlines pilots absentmindedly overshot their destination by 150 miles. She tries to find “that sweet spot” where the user receives just enough mental stimulation to stay engaged without growing overwhelmed. For example, she says, people who monitor robots may do better if they can check the robots’ activities according to a pattern they set up for themselves.
How data is displayed on screen is another important consideration in human factors engineering. Using colors and shapes can help represent information in a way that people can quickly understand. A green circle, for instance, might indicate a properly functioning system. But Cummings cautions against overusing this strategy; she points out that air traffic controllers must now memorize more than 30 colors symbolizing different tasks, which can be mentally exhausting.
Cummings also studies the strategies that people use to get their work done, such as making lists. “We are all very linear in the way we think, and people will gravitate to the top of the list,” she says. Designers need to keep this in mind when setting up systems that require monitoring a group of machines, for example. If these systems deliver instructions in the form of a list, machines that wind up at the bottom might not receive as much attention.
Yossi Sheffi, director of MIT’s Engineering Systems Division, thinks Cummings’s research could make a big difference in environments where humans and machines must collaborate. “Her research tying the human operator to technology is crucial,” he says, “both to the design of the technology itself and to the operation of the system as a whole, in order to ensure that it operates efficiently and effectively.”
Randall Davis, a professor of computer science and engineering, has worked with Cummings on several projects and believes that her research matters to anyone operating small-scale systems like cell phones and remote controls, which seem to grow more complex each year. “As we are increasingly surrounded by technology of all sorts, it becomes increasingly important for someone to understand how to design this stuff so that it’s easy to use,” he says. “Otherwise, we’ll be surrounded by incomprehensible technology.”
A novel application
Cummings’s work has broad implications for the military, where pilots are increasingly being trained to operate unmanned aerial vehicles (UAVs), or drones, to perform high-risk tasks such as getting a closer look at potential snipers. The technology has been successful in Iraq and Afghanistan, and U.S. Defense Secretary Robert Gates announced last year that it would become a permanent part of the defense budget.
Pilots trying to control such systems may not have a lot of time to change complicated menu settings or zoom and pan a camera. Cummings wants to lower the cognitive overhead so that they can focus on staying alive. “It’s about offloading skill-based tasks so that people can focus specifically on knowledge-based tasks, such as determining whether a potential sniper is a good or bad guy by using the UAV to identify him,” she says. The technology could also help responders search more efficiently for victims after a natural disaster.
Over the past year, Cummings and her students have designed an iPhone application that can control a one-pound quad-rotor UAV, a helicopter with four propellers, that’s equipped with a camera and built-in sensors. When the user tilts the iPhone in the direction he or she wants the UAV to move, the app sends GPS coördinates to the UAV to help it navigate. The UAV uses fast-scanning lasers to create electronic models of its environment and then rapidly sends these models back to the iPhone in the form of easy-to-read maps, videos, and photos.
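The tilt-to-waypoint idea can be sketched in a few lines of code. The function below is an illustrative assumption about how such a mapping might work, not HAL’s actual implementation: phone tilt is scaled into a small offset in meters, which is then converted into GPS coördinates for the UAV to fly toward.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def tilt_to_waypoint(lat, lon, pitch_deg, roll_deg, step_m=5.0):
    """Convert phone tilt into a nearby GPS waypoint (hypothetical sketch).

    Tilting forward (pitch_deg > 0) nudges the waypoint north;
    tilting right (roll_deg > 0) nudges it east. Tilt magnitude
    scales the step, capped at step_m meters per update.
    """
    # Assume +/-45 degrees of tilt maps to a full step
    north_m = max(-1.0, min(1.0, pitch_deg / 45.0)) * step_m
    east_m = max(-1.0, min(1.0, roll_deg / 45.0)) * step_m
    # Convert the meter offsets into degrees of latitude/longitude
    dlat = math.degrees(north_m / EARTH_RADIUS_M)
    dlon = math.degrees(east_m / (EARTH_RADIUS_M * math.cos(math.radians(lat))))
    return lat + dlat, lon + dlon
```

Sending a small offset per update, rather than an absolute position, is one plausible way to keep the interface forgiving: a novice’s jittery tilts can only move the waypoint a few meters at a time.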
In recent HAL experiments at MIT, subjects maneuvered the UAV in front of an eye chart in a separate room and read the images the camera captured. Some achieved the equivalent of 20/30 vision, which Cummings says is “pretty good.” But more important, no one crashed the device. The next step will be experiments outside, where the UAV could reach an altitude of 500 feet.
Although the military and Boeing are funding the research, the technology could be used for civilian crowd control, or even, in theory, to fly a quad-rotor to Starbucks to see how long the line is. The app is designed so that anyone who can operate a phone could fly a UAV after just three minutes of training. Military pilots, on the other hand, must undergo thousands of hours of training to fly drones. “This is all about the mission–you just need more information from an image, and you shouldn’t have to spend $1 million to train someone to get that picture,” she says.
Much as Cummings loves flying, she sees UAVs as a valuable alternative. “When I was flying, it was up to the pilot, who was given a set of coördinates before the flight, to decide whether a bomb should be dropped. UAV systems allow many more people to be in this loop, so fewer mistakes are made,” she says.
“I’m ultimately a pilot and always tell people that I got to go out and fly like crazy. It was a lot of fun, but operating UAVs is safer, more precise, and better at saving lives. It’s the way operations should be.”