Are Autonomous Cars Ready to Go It Alone?

Training wheels for autonomous vehicles come in the shape of a human behind the wheel. But when you remove the safety driver from a robotic car, two tons of metal is let loose.
Just last week, California’s Department of Motor Vehicles announced that it plans to allow companies to test autonomous cars on its roads without on-board backup drivers before the end of the year. The cars would still be required to have a remote operator keeping an eye on them, but nevertheless, as the Guardian correctly points out, the news feels like a defining moment for the technology. Has it matured enough for us to finally place our faith in the robot?
The idea of never touching the wheel is so tantalizing that the likes of Waymo, Tesla, and Uber are all racing to develop autonomous vehicles as fast as they can. But in the rush, we shouldn’t forget that the cars cannot yet cope with many complex road conditions.
When our own Will Knight took a ride in one of Uber’s first autonomous vehicles, his ride certainly couldn’t. “The car performed well in many difficult situations—reacting to pedestrians darting into the road, for example,” he wrote at the time. “However, several times the person behind the wheel needed to take control: once so the car didn’t become stuck behind a truck, and once to avoid another vehicle making a sudden turn.”
Now, six months later, they still can’t. Recode has obtained a series of internal Uber documents suggesting that a driver is still very much required in the ride-hailing company’s autonomous vehicles. Just last week, for instance, its 43 cars clocked up a grand total of 20,354 autonomous miles, but safety drivers had reason to intervene every 0.8 miles on average—a figure that has actually fallen since January, meaning interventions are becoming more frequent, not less.
That sounds awful. But the figure counts every time a driver took back control of a vehicle, which might be because it started raining heavily, the road markings ran out, or something else mundane happened. A more telling figure is the number of miles between what Uber calls “critical” interventions—situations where, had the driver not stepped in, a person might have been injured or at least $5,000 worth of property damage could have occurred. As of last week, that figure was 196 miles between incidents, up from 50 miles in January—an improvement, but still clearly problematic.
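As a rough back-of-the-envelope sketch (purely illustrative, not taken from Uber’s documents, and assuming the reported averages held evenly across that week’s mileage), those rates imply tens of thousands of ordinary interventions but only on the order of a hundred critical ones:

# Back-of-the-envelope check on the reported rates (illustrative only;
# assumes the averages apply uniformly across the week's mileage).
total_miles = 20_354                  # autonomous miles logged in the week
miles_per_intervention = 0.8          # average miles between any intervention
miles_per_critical = 196              # average miles between "critical" interventions

interventions = total_miles / miles_per_intervention        # roughly 25,000
critical_interventions = total_miles / miles_per_critical   # roughly 100

print(f"Implied interventions that week: ~{interventions:,.0f}")
print(f"Implied critical interventions:  ~{critical_interventions:,.0f}")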
So it seems Uber’s vehicles may not be quite ready to take to California’s streets without a driver. Last year’s fatal Tesla Autopilot crash suggests that its vehicles, too, might not be. The days of forgetting about the wheel will surely come—but for safety’s sake, it pays to be patient.
(Read more: Recode, Guardian, “Fatal Tesla Autopilot Crash Is a Reminder Autonomous Cars Will Sometimes Screw Up,” “What to Know Before You Get In a Self-driving Car,” “My Self-Driving Uber Needed Human Help”)