A fatal accident that occurred while Tesla’s Autopilot technology was engaged may have a significant bearing on the future of automated driving.
The National Highway Traffic Safety Administration said yesterday that it is investigating the accident, which occurred near Williston, Florida, last month when a Model S crashed into a tractor-trailer that was making a left turn in front of it.
NHTSA’s investigation does not mean the agency believes the technology contributed to the accident or is defective. But the incident will inevitably raise questions about the performance of the technology and the way drivers treat it. The auto industry will certainly watch closely to see how it may shape regulations and influence public perception of automated-driving technology. The agency is expected to release new guidelines on testing automated vehicles this month.
Tesla’s Autopilot is available for versions of the Model S equipped with the necessary hardware, which includes cameras, radar, and ultrasonic sensors. Autopilot can effectively keep a car driving on a highway, following bends, slowing for other vehicles, and even overtaking them. Tesla said in a statement that when the accident occurred, Autopilot failed to notice the white side of the tractor-trailer against a brightly lit sky.
Technology experts have warned for some time that self-driving systems are far from perfect, and that the process of handing control from a car back to a driver can be problematic (see “Driverless Cars Are Farther Away Than You Think”).
Dan Galves, chief communications officer for Mobileye, an Israeli company that provides image-processing software for Tesla and other automakers, says his company’s technology was not designed to cope with the type of obstacle that appeared to cause the accident now being investigated. “Today’s collision avoidance technology, or automatic emergency braking (AEB), is defined as rear-end collision avoidance and is designed specifically for that,” he says. “This incident involved a laterally crossing vehicle.”
Bosch, a German company that makes automotive technologies including components for automated driving, said the benefits of automation would make progress on the technology inevitable. “We remain convinced that the gradual introduction of automated vehicles can make a significant contribution to improving road safety,” the company said in a statement. “Automated driving is coming—not overnight, but gradually.”
The result of the NHTSA investigation could have profound implications for Tesla itself. Safety issues can be extremely damaging for carmakers, and it took Audi more than a decade to repair its reputation after a problem with “sudden acceleration” in some vehicles during the 1980s. But Tesla has a reputation for aggressively exploring new technologies, and many of its customers are enthusiastic about being on the cutting edge of automotive innovation. The company also has the capacity to fix problems quickly by updating its vehicles using a cellular data link.
Tesla pointed out in its statement that this was the first fatality in more than 130 million miles of driving with Autopilot activated, compared with one fatality in every 94 million miles of regular driving. This is a coarse comparison, however, since Autopilot is meant to be used only for highway driving in certain conditions. It might also prove irrelevant if NHTSA decides Autopilot was somehow at fault.
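To see why the figures Tesla cited are only a rough comparison, they can be put on a common per-100-million-mile basis (a back-of-envelope sketch; the variable names and unit choice are ours, and the calculation still conflates highway-only Autopilot miles with all regular driving):

```python
# Figures as cited in Tesla's statement (approximate miles).
AUTOPILOT_MILES_PER_FATALITY = 130e6  # one fatality with Autopilot engaged
REGULAR_MILES_PER_FATALITY = 94e6     # one fatality in regular driving

# Express both as fatalities per 100 million miles.
autopilot_rate = 100e6 / AUTOPILOT_MILES_PER_FATALITY  # about 0.77
regular_rate = 100e6 / REGULAR_MILES_PER_FATALITY      # about 1.06

print(f"Autopilot: {autopilot_rate:.2f} fatalities per 100M miles")
print(f"Regular:   {regular_rate:.2f} fatalities per 100M miles")
```

On these raw numbers Autopilot looks safer, but the sample is a single fatality, and the miles being compared were driven under very different conditions.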
“This accident is a tragic reminder of the very significant technical and social barriers facing autonomous vehicles,” says David Keith, an assistant professor at the MIT Sloan School of Management, who studies the auto industry. “Safely controlling an automobile in all possible driving situations is an enormously complex task, and one that algorithms have not yet mastered.”
It is unclear what the driver of the Model S was doing at the time of the accident. However, as Google’s car project has shown, automation can apparently encourage drivers to engage in risky behavior (see “Lazy Humans Shaped Google’s New Autonomous Car”).
Keith adds: “NHTSA has been relatively hands-off regarding autonomous vehicles to date, and it is certainly possible that this accident will lead to greater oversight regarding what constitutes safe operation of these semi-autonomous vehicles.”