When Tesla Motors introduced the Model S sedan in 2012, one of its many notable features was an always-on cellular Internet connection. A Tesla executive explained today that the connection has turned into a powerful advantage in the company’s contest with other carmakers and Internet giants such as Google to get self-driving cars onto public roads.
Tesla can pull down data from the sensors inside its customers’ vehicles to see how people are driving and the road and traffic conditions they experience. It uses that data to test the effectiveness of new self-driving features. The company even secretly tests new autonomous software by remotely installing it on customer vehicles so it can react to real road and traffic conditions, without controlling the vehicle.
“The ability to pull high-resolution data from these vehicles and to update the vehicles over the air is a significant part of what’s allowed us in 18 months to go from very behind the curve to what is today one of the more advanced autonomous or semi-autonomous driving features,” said Sterling Anderson, director of Tesla’s Autopilot program, at MIT Technology Review’s EmTech Digital conference in San Francisco on Tuesday (see “No Industry Can Afford to Ignore Artificial Intelligence”).
Tesla began bundling a suite of new sensors into its vehicles in 2014, saying it was for a new emergency braking feature.
But that hardware, which includes 12 ultrasonic sensors positioned around the car to detect nearby objects as well as forward-facing cameras and radar, was intended for bigger things. Tesla engineers began using data streaming from cars equipped with those sensors, along with information on their locations, to test autonomous driving features.
“Since introducing this hardware 18 months ago we’ve accrued 780 million miles,” said Anderson. “We can use all of that data on our servers to look at how people are using our cars and how we can improve things.” Every 10 hours Tesla gets another million miles’ worth of data, he said.
Tesla’s engineers initially test new self-driving software against those records. Versions that perform well can then be tested by secretly installing them on customer vehicles and watching how they respond to conditions on the road, although the software doesn’t actually control the car.
“We will often install an ‘inert’ feature on all our vehicles worldwide,” said Anderson. “That allows us to watch over tens of millions of miles how a feature performs.”
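The "inert" testing approach Anderson describes is often called shadow mode: a candidate system runs alongside the active one, but only the active system's commands reach the car. A minimal sketch of the idea, with all names and thresholds hypothetical (the source does not describe Tesla's actual implementation):

```python
# Hypothetical sketch of "shadow mode" testing: a candidate
# driving policy runs on the same sensor data as the active
# policy, but its output is never sent to the actuators.
# Disagreements between the two are logged for offline review.
from dataclasses import dataclass

@dataclass
class SteeringCommand:
    angle_deg: float  # requested steering-wheel angle

def shadow_step(sensor_frame, active_policy, candidate_policy,
                log, threshold_deg=2.0):
    """Run both policies on one sensor frame; actuate only the active one."""
    active_cmd = candidate_cmd = None
    active_cmd = active_policy(sensor_frame)
    candidate_cmd = candidate_policy(sensor_frame)  # "inert": never actuated
    divergence = abs(active_cmd.angle_deg - candidate_cmd.angle_deg)
    if divergence > threshold_deg:
        # Engineers later inspect frames where the candidate disagreed.
        log.append({"frame": sensor_frame, "divergence_deg": divergence})
    return active_cmd  # only this command controls the vehicle

# Trivial stand-in policies for illustration only:
log = []
active = lambda f: SteeringCommand(angle_deg=f["lane_offset_m"] * -10)
candidate = lambda f: SteeringCommand(angle_deg=f["lane_offset_m"] * -12)
cmd = shadow_step({"lane_offset_m": 0.3}, active, candidate, log)
```

The design point is that a bad candidate can only generate log entries, never steering input, so it can be evaluated across millions of real-world miles at no added risk.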
Anderson’s team can also watch closely when a new feature is activated. For example, he showed a chart illustrating how self-driving Teslas using the Autopilot feature hold themselves much more tightly to the center of the lane than humans do when steering the car. Since its launch last October, Tesla has logged 100 million miles of vehicles steering themselves (see “10 Breakthrough Technologies 2016: Tesla Autopilot”).
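The comparison Anderson charted amounts to measuring the spread of lateral lane offsets under manual versus Autopilot driving. A sketch of that analysis with made-up sample data (the real telemetry is of course not public):

```python
# Hypothetical lane-keeping comparison: a tighter distribution of
# lateral offsets from the lane center means the car holds the
# center more consistently. The sample values are illustrative.
import statistics

manual_offsets_m = [0.4, -0.3, 0.5, -0.6, 0.2, -0.4]
autopilot_offsets_m = [0.05, -0.04, 0.06, -0.03, 0.02, -0.05]

manual_spread = statistics.pstdev(manual_offsets_m)
autopilot_spread = statistics.pstdev(autopilot_offsets_m)
# A smaller standard deviation indicates tighter lane centering.
```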
Tesla’s ability to pull data from its cars and even covertly test autonomous driving software is likely unique. Google has demonstrated some of the most advanced self-driving technology, but it can only pull data from its fleet of prototypes, which is likely smaller and less widely distributed than the collection of Tesla vehicles on the road.
Other carmakers, such as GM, are also working on self-driving technology. But they have not embraced Internet connectivity and over-the-air updates the way Tesla has.
However, Tesla’s strategy of using its data infrastructure to test and develop its technology in public could run into problems. Google restructured its autonomous car program in 2014 after an experiment in which employees were allowed to use self-driving prototypes produced concerning results. People quickly became complacent about the technology’s abilities, even though they were supposed to be ready to take over at all times.
“One guy noticed that his cell-phone battery was low, pulled out his laptop, and plugged it in at 65 miles per hour on the freeway,” Chris Urmson, who leads Google’s project, said at the EmTech event today. “We thought, this is not good.” Google committed itself to car designs without steering wheels or pedals, piloted by software alone (see “Lazy Humans Shaped Google’s New Autonomous Car”).
Anderson takes a different view. He said Tesla’s data-centric strategy will allow the company to keep advancing its Autopilot technology, for example to drive in more urban conditions and handle intersections. Tesla must be aware of drivers’ expectations, but it doesn’t need to take them out of the equation altogether, he said.
“Autopilot is not an autonomous system and should not be treated as one,” said Anderson. “We ask drivers to keep their hands on [the wheel] and be prepared to take over.”
This story was updated to correct that Tesla receives another million miles of data from customers' cars every 10 hours, not every 10 days.