Google’s self-driving vehicles have covered almost a million miles of road, but after a spate of recent rear-end accidents, their safety has come under scrutiny.
Chris Urmson, the head of Google’s automated-car program, defended the safety record of the project at an event in Ypsilanti, Michigan, today.
“There’s been a lot of noise recently in the press about the fact that our vehicles have been in collisions,” Urmson said. “We’ve been hit 14 times over the lifetime of the program, and there’s been a bunch of speculation about that.”
Urmson said all of the accidents were due to human driver error, and most were examples of the growing problem of driver distraction. He showed video clips of the sensor data captured by Google’s cars during several accidents, including one in which a driver rear-ended one of Google’s cars without slowing down, most likely because his attention was focused on his smartphone.
“We see [the accidents] as an illustration of the epidemic of distracted driving,” Urmson told attendees during his keynote speech at the Automated Vehicle Symposium. He reaffirmed his belief that self-driving cars would prevent such accidents. “[The car] isn’t distracted. It doesn’t worry about missing its first cup of coffee in the morning. It’s going to be paying attention all the time.”
At the event, held for academic and industry representatives, Urmson showed some of the hardware in Google’s own prototype automated vehicle (see “Why Google’s Self-Driving Bubble Cars Might Catch On”). He showed electronic systems with redundant power supplies, controllers, and actuators for the car’s steering and braking systems, which continue to work even if the primary power and control systems fail.
Automated vehicles can also be programmed to deal with specific situations, and Google’s cars use machine learning to recognize particular obstacles and traffic scenarios. Google’s team is currently training its cars to recognize different hand gestures, for example.
However, vehicles will inevitably still encounter unexpected situations. Google’s team has programmed its cars to recognize an abnormal situation and wait for it to end before continuing.
Urmson showed more videos of such situations, including people jumping from trucks and children riding toy cars on the road ahead. One clip showed a Google car waiting patiently as a woman in a wheelchair pursued a duck in circles in the street ahead.
Google’s cars haven’t always been smarter than human drivers, though. In another clip Urmson played, one of Google’s cars encountered a traffic roundabout for the first time and decided the safest thing to do was to keep going around. “There were a couple of engineers in the car who were giddy going round and round,” he said. “It felt very Chevy Chase-esque.”