The chairman of the National Transportation Safety Board said Tuesday that "system safeguards were lacking" in the Tesla Model S that struck a truck in Florida in May 2016, killing its driver.
According to Reuters, the new statement from the NTSB suggests that Tesla not only fielded a car with limited autonomous capabilities—something that company has previously acknowledged—but one that was also incapable of ensuring its human driver was paying attention to the road when its Autopilot system was engaged.
“Tesla allowed the driver to use the system outside of the environment for which it was designed, and the system gave far too much leeway to the driver to divert his attention,” Robert Sumwalt, the agency's chairman, said.
The news comes less than a month after the Wall Street Journal published a detailed account of the Autopilot program at Tesla, which included several accounts of its engineers expressing deep concerns about the system's safety. The system was only meant to provide partial autonomy that required continuous driver attention, but the company, particularly CEO Elon Musk, publicly suggested that Autopilot was capable of fully autonomous driving.
At the time of the crash, Tesla vehicles were outfitted to detect whether a driver's hands were on the wheel and to emit warning sounds if they were removed for more than a few seconds. In its statement, the NTSB said that such measures were insufficient for monitoring whether a driver was paying attention to the road.
Tesla has since ratcheted up how its cars watch for driver engagement and upgraded the sensor packages in new cars. But the NTSB's statement is a necessary reminder that we still have a long way to go in solving the human-vehicle interaction problem that lies at the heart of designing semi-autonomous cars.