Tesla Announces New Sensors and Puts the Brakes on Autopilot

Every new Tesla will be kitted out with all the hardware that Elon Musk claims is required for the cars to go fully autonomous—but they won’t be allowed to for now.
October 20, 2016

Tesla has announced that every car it manufactures will now come equipped with the “hardware needed for full self-driving capability.” But it’s also taken the opportunity to ease off the pace of deployment of Autopilot, its autonomous driving software.

Every new Tesla, including its forthcoming affordable car, the Model 3, will now come equipped with more sensors than previous models. The biggest upgrade is to the camera system—the cars will feature eight cameras dotted around the vehicle, as opposed to the single unit previously fitted behind the rear-view mirror, providing a 360-degree view of the world.

Elsewhere, sensors are being upgraded rather than added. The ultrasonic sensors around the car will boast an increase in range to almost 500 meters, alongside a single forward-facing radar system. The data from these sensors will be processed by a so-called Tesla Neural Net running on a computer powered by Nvidia's Titan GPUs—claimed to be 40 times faster than the hardware used in previous models.

The decision to beef up the cars’ camera systems is an intriguing choice. In the wake of this year’s fatal Tesla crash involving Autopilot, Elon Musk said he had decided to place greater emphasis on the cars’ radar as part of an upgrade to the self-driving software. “The most significant upgrade to Autopilot will be the use of more advanced signal processing to create a picture of the world using the onboard radar,” the company said at the time.

The addition of seven cameras and only minor changes to the radar system seems at odds with that. Some in the industry argue that lidar is a safer option than optical and radar sensors, and they might have expected it to feature in the new hardware, but Musk has been outspoken about his dislike of the technology. “I'm not a big fan of lidar; I don't think it makes sense in this context,” he has said in the past.

Perhaps the most striking detail about the upgrade, though, is that despite being laden with new sensing technology, the cars won’t offer any Autopilot function at first. “Before activating the features enabled by the new hardware, we will further calibrate the system using millions of miles of real-world driving,” the company says. “While this is occurring, Teslas with new hardware will temporarily lack certain features currently available on Teslas with first-generation Autopilot hardware.”

That means that if you buy a new Tesla in the near future, you won’t be able to use features, such as emergency braking and active cruise control, that are currently available to Tesla owners. Instead, those features will arrive as over-the-air updates at some point down the line. That essentially leaves current Autopilot users as guinea pigs and allows Tesla to take as long as it needs to ensure that the future software is safe. The decision may be an attempt to slow the frenetic pace of self-driving development at Tesla, which has been criticized as risky in the past.

Even so, Musk is still keen to push the limits of what his cars can do in private testing. Alongside the announcement, Tesla released a video of one of its cars autonomously navigating suburban roads, highways, and the company’s own parking lot. In fact, Musk has even promised that a Tesla will drive itself from Los Angeles to New York by the end of 2017. How long you have to wait to be able to do the same remains to be seen.

(Read more: Tesla, The Wall Street Journal, “Tesla’s Strategy Is Risky and Aggressive, but It Has Worked,” “Pay Attention! Tesla’s Autopilot Will Lock Out Lackadaisical Drivers,” “Tesla’s Biggest Edge in Chasing Autonomy Is Treating Drivers Like Guinea Pigs”)
