Data Shows Google’s Robot Cars Are Smoother, Safer Drivers Than You or I
Data gathered from Google’s self-driving Prius and Lexus cars shows that they are safer and smoother when steering themselves than when a human takes the wheel, according to the leader of Google’s autonomous-car project.
Chris Urmson made those claims today at a robotics conference in Santa Clara, California. He presented results from two studies of data from the hundreds of thousands of miles Google’s vehicles have logged on public roads in California and Nevada.
One of those analyses showed that when a human was behind the wheel, Google’s cars accelerated and braked significantly more sharply than they did when piloting themselves. Another showed that the cars’ software was much better at maintaining a safe distance from the vehicle ahead than the human drivers were.
“We’re spending less time in near-collision states,” said Urmson. “Our car is driving more smoothly and more safely than our trained professional drivers.”
In addition to painting a rosy picture of his vehicles’ autonomous capabilities, Urmson showed a new dashboard display that his group has developed to help people understand what an autonomous car is doing and when they might want to take over. “Inside the car we’ve gone out of our way to make the human factors work,” he said.
Although that might suggest the company is thinking about how to translate its research project into something used by real motorists, Urmson dodged a question about how that might happen. “We’re thinking about different ways of bringing it to market,” he said. “I can’t tell you any more right now.”
Urmson did say that he is in regular contact with automakers. Many of those companies are independently working on self-driving cars themselves (see “Driverless Cars Are Further Away Than You Think”).
Google has been testing its cars on public roads since 2010 (see “Look, No Hands”), always with a human in the driver’s seat who can take over if necessary.
Urmson dismissed claims that legal and regulatory problems pose a major barrier to cars that are completely autonomous. He pointed out that California, Nevada, and Florida have already adjusted their laws to allow tests of self-driving cars. And existing product liability laws make it clear that a car’s manufacturer would be at fault if the car caused a crash, he said. He also said that when the inevitable accidents do occur, the data autonomous cars collect in order to navigate will provide a powerful and accurate picture of exactly who was responsible.
Urmson showed data from a Google car that was rear-ended in traffic by another driver. Examining the car’s annotated map of its surroundings clearly showed that the Google vehicle smoothly halted before being struck by the other vehicle.
“We don’t have to rely on eyewitnesses who can’t be trusted as to what happened—we actually have the data,” he said. “The guy behind us wasn’t paying enough attention. The data will set you free.”