Enthusiastic Tesla owners cheered last Wednesday when the company enabled the use of an automated driving system, called Autopilot, in its Model S all-electric sedans. The wireless update of vehicles to Version 7.0 of Tesla software—which allows properly equipped cars to steer, switch lanes, and manage speed on their own—is exactly the kind of bold move that makes many Tesla fans so excited about the company. In fact, a number of Tesla drivers immediately took to the road to test the limits of Autopilot—taking their hands fully off the wheel and seeing how far the car could drive itself down highways, country lanes, and suburban streets.
That led to dangerous situations and near accidents, as evidenced by videos made by drivers (while driving) and posted to YouTube. In one video, a Model S driver admitted to ignoring warnings until the vehicle automatically swerved over the double-yellow dividing lines toward an oncoming vehicle. “Had I not reacted quickly to jerk the steering wheel in the opposite direction, a devastating head-on collision would have occurred,” he wrote in the YouTube post.
Tesla’s Autopilot system—which uses a combination of forward radar, a forward-looking camera and 12 long-range ultrasonic sensors, and fast processors—can handle straight-ahead predictable highway driving. Yet Tesla CEO Elon Musk has repeatedly warned that Autopilot is not synonymous with fully autonomous driving (see also “Why Self-Driving Cars Must Be Programmed to Kill”).
“Tesla is very clear with what we’re building, features to assist the driver on the road,” said Khobi Brooklyn, a Tesla spokesperson, in an e-mail. “Similar to the autopilot function in airplanes, drivers need to maintain control and responsibility of their vehicle while enjoying the convenience of Autopilot in Model S.” Brooklyn said that customers were informed about Autopilot’s functions through release notes that come with every update, an update to the owner’s manual, and e-mails. Drivers are encouraged to keep their hands on the wheel (see “What Will Tesla Drivers Do Behind the Wheel With Autopilot Engaged?”).
But not all drivers are getting a clear message. “I think it’s wonderful that Tesla has gone out there with this technology, but they might have hyped Autopilot a little bit too much,” says Alain Kornhauser, director of the transportation program at Princeton University. “It doesn’t work in all circumstances. Drivers don’t necessarily know when the car goes from tracking fine to a gray area when the car is confused, and then to a situation when the car doesn’t know where it’s going. These things aren’t well-defined.” Kornhauser drives a 2014 Mercedes-Benz S-550 sedan with Distronic Plus, a suite of assisted driving technologies that closely resembles the Tesla Autopilot system. Distronic Plus has been available in S- and E-Class vehicles since 2013. Yet automakers like Mercedes commonly refer to the technology as assistive, rather than using words like “auto” or “automatic.”
Automakers and regulators have not yet defined the best way, or the required timing, to alert drivers to take control back over the vehicle. Kornhauser warns that drivers need to be very cautious. “You have to show some respect, because you’re driving a lethal weapon,” he says.
He contrasted Tesla’s approach with Google’s autonomous vehicle program. Google is committed to complete autonomy—as a means to avoid any question about when a driver needs to be attentive or not. Its self-driving cars drive themselves in all situations, and sometimes operate without any passengers (see “Lazy Humans Shaped Google’s New Autonomous Car”).
Doug Newcomb, president of the C3group, which holds conferences and offers consulting on connected cars, agrees that Google—and especially mainstream automakers—are being more careful than Tesla.
“This is Tesla’s MO,” says Newcomb. “As a technology company, they’re pushing things more than car companies.”
Newcomb says that Tesla is being “somewhat cavalier” in not fully acknowledging how the technology might be used. “With new technology, people are going to use it in ways that it wasn’t intended,” he says. “But in this case, you’re not talking about a smartphone or a computer. You’re talking about a dangerous vehicle.”