
Drivers Push Tesla’s Autopilot Beyond Its Abilities

Tesla says its new Autopilot feature is not synonymous with autonomous driving, but some drivers are acting like it is.
October 21, 2015

Enthusiastic Tesla owners cheered last Wednesday when the company enabled the use of an automated driving system, called Autopilot, in its Model S all-electric sedans. The wireless update of vehicles to Version 7.0 of Tesla software—which allows properly equipped cars to steer, switch lanes, and manage speed on their own—is exactly the kind of bold move that makes many Tesla fans so excited about the company. In fact, a number of Tesla drivers immediately took to the road to test the limits of Autopilot—taking their hands fully off the wheel and seeing how far the car could drive itself down highways, country lanes, and suburban streets.

That led to dangerous situations and near accidents, as evidenced by videos made by drivers (while driving) and posted to YouTube. In one video, a Model S driver admitted to ignoring warnings until the vehicle automatically swerved over the double-yellow dividing lines toward an oncoming vehicle. “Had I not reacted quickly to jerk the steering wheel in the opposite direction, a devastating head-on collision would have occurred,” he wrote in the YouTube post.

Tesla’s Autopilot system—which uses a combination of forward radar, a forward-looking camera, 12 long-range ultrasonic sensors, and fast processors—can handle straight-ahead, predictable highway driving. Yet Tesla CEO Elon Musk has repeatedly warned that Autopilot is not synonymous with fully autonomous driving (see also “Why Self-Driving Cars Must Be Programmed to Kill”).

“Tesla is very clear with what we’re building, features to assist the driver on the road,” said Khobi Brooklyn, a Tesla spokesperson, in an e-mail. “Similar to the autopilot function in airplanes, drivers need to maintain control and responsibility of their vehicle while enjoying the convenience of Autopilot in Model S.” Brooklyn said that customers were informed about Autopilot’s functions through release notes that come with every update, an update to the owner’s manual, and e-mails. Drivers are encouraged to keep their hands on the wheel (see “What Will Tesla Drivers Do Behind the Wheel With Autopilot Engaged?”).

Tesla’s Autopilot feature is meant to assist drivers, not take over the driving completely.

But not all drivers are getting a clear message. “I think it’s wonderful that Tesla has gone out there with this technology, but they might have hyped Autopilot a little bit too much,” says Alain Kornhauser, director of the transportation program at Princeton University. “It doesn’t work in all circumstances. Drivers don’t necessarily know when the car goes from tracking fine to a gray area when the car is confused, and then to a situation when the car doesn’t know where it’s going. These things aren’t well-defined.” Kornhauser drives a 2014 Mercedes-Benz S-550 sedan with Distronic Plus, a suite of assisted driving technologies that closely resembles the Tesla Autopilot system. Distronic Plus has been available in S- and E-Class vehicles since 2013. Yet automakers like Mercedes commonly refer to the technology as assistive, rather than using words like “auto” or “automatic.”

Automakers and regulators have not yet defined the best way, or the required timing, to alert drivers to take back control of the vehicle. Kornhauser warns that drivers need to be very cautious. “You have to show some respect, because you’re driving a lethal weapon,” he says.

He contrasted Tesla’s approach with Google’s autonomous vehicle program. Google is committed to complete autonomy—as a means of avoiding any question about whether a driver needs to be attentive. Its self-driving cars drive themselves in all situations, and sometimes operate without any passengers (see “Lazy Humans Shaped Google’s New Autonomous Car”).

Doug Newcomb, president of the C3 Group, which holds conferences and offers consulting on connected cars, agrees that Google and, especially, mainstream automakers are being more careful than Tesla.

“This is Tesla’s MO,” says Newcomb. “As a technology company, they’re pushing things more than car companies.”

Newcomb says that Tesla is being “somewhat cavalier” in not fully acknowledging how the technology might be used. “With new technology, people are going to use it in ways that it wasn’t intended,” he says. “But in this case, you’re not talking about a smart phone or a computer. You’re talking about a dangerous vehicle.”
