MIT Technology Review

In a fatal crash, Uber’s autonomous car detected a pedestrian—but chose to not stop

The company has found the likely cause of its self-driving-car crash in March that killed someone trying to cross the road.

The news: According to a report by The Information, the vehicle’s software did in fact detect the pedestrian, but it chose not to react immediately. The car was programmed to ignore likely false positives—things in the road that wouldn’t interfere with the vehicle, like a plastic bag. But those adjustments were taken too far.
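The kind of tuning described above can be sketched as a simple confidence threshold on obstacle detections: anything the classifier is less sure about gets ignored. This is a minimal, hypothetical illustration—the names, structure, and threshold value are assumptions, not Uber’s actual software.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g. "pedestrian", "plastic_bag"
    confidence: float  # classifier confidence that this is a real obstacle

def should_brake(detections, threshold=0.8):
    """Brake only if some detection clears the obstacle-confidence threshold."""
    return any(d.confidence >= threshold for d in detections)

# Raising the threshold smooths the ride (fewer false alarms), but it can
# also suppress reactions to real hazards that the classifier is unsure about.
scene = [Detection("plastic_bag", 0.3), Detection("pedestrian", 0.7)]
print(should_brake(scene))                 # False at the default threshold
print(should_brake(scene, threshold=0.5))  # True with a more cautious threshold
```

The trade-off is exactly the one in the report: a higher threshold means fewer unnecessary stops, at the cost of dismissing detections that turn out to be real.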

Why? The car may have been running a test aimed at increasing rider comfort. Autonomous cars aren’t known for their smooth rides, and by ignoring things that are probably not a threat, a vehicle can cut down on the stop-and-start jerks riders experience.

What’s next? Uber is conducting a joint investigation with the National Transportation Safety Board, after which more details are expected to be released. In the meantime, the report could inspire other self-driving-vehicle companies to treat potential false positives with more caution.