Ride-Hailing Apps Have a Racism Problem

Travelers with African-American-sounding names suffer longer waits and more cancellations than their white-sounding counterparts.
October 31, 2016

If you want to hail a ride using an app, it pays to have a white-sounding name.

That’s according to a new study that shows that riders with African-American-sounding names are more likely to wait longer to be accepted for a ride or have their trip canceled than people with white-sounding names. The results are contained in a working paper published by the National Bureau of Economic Research.

Researchers from MIT, Stanford University, and the University of Washington studied almost 1,500 rides on defined routes in both Seattle and Boston.

In the Seattle experiments, the researchers found that people with African-American-sounding names typically had to wait 29 seconds for an Uber acceptance and 23 seconds for a Lyft acceptance, compared to just 21 and 19 seconds respectively for riders with white-sounding names.

Results from the Boston experiments show more of a problem for Uber. Here, the researchers found that riders with African-American-sounding names had Uber rides canceled 10.1 percent of the time, versus just 4.9 percent of the time for those with white-sounding names. Results for Lyft journeys actually showed a small skew in the opposite direction.

Ben Edelman, an associate professor at Harvard Business School who studies the economics of online markets but wasn’t involved in this study, told MIT Technology Review that the work uses “a proper and rigorous methodology” and “shows a troubling bottom line.”

To be fair, the researchers also demonstrate that it’s not a problem specific to ride-hailing services. In Seattle, for instance, an experiment in hailing a taxi directly from the curb showed that white travelers had the first taxi that passed them stop 60 percent of the time; black travelers had the first taxi stop just 20 percent of the time.

Adrian Durbin, Lyft’s director of policy communications, said that the company is “extremely proud of the positive impact Lyft has on communities of color,” and added that the company does “not tolerate any form of discrimination.” Uber had not responded to a request for comment at the time of writing.

This is, of course, a complex and entrenched social issue, for which Uber and Lyft don’t bear sole responsibility. But it is clearly troubling that the information provided to drivers ahead of the proposed journey has a tangible impact on the service that travelers receive. The researchers propose that ride-hailing firms could use a variety of approaches to address the problem—like vetting drivers, disincentivizing cancellations, and reducing the use of names before a ride.

Similar approaches have been posited by academics for adoption by Airbnb, which itself struggles with the issue of discrimination, as Edelman has shown in the past. But these kinds of services have been built on the use of information sharing as a means of building trust and creating a more efficient service (not that it always works). Any move away from that model will be a big step for Airbnb, Uber, or Lyft.

Still, it is a step that needs to be taken. “At Uber and Lyft, as at Airbnb in my findings, platform design all but invites service providers to discriminate,” says Edelman. “Consumers should demand more of these platforms—and so should regulators.”

(Read more: Bloomberg, National Bureau of Economic Research, “Airbnb Isn’t Really Confronting Its Racism Problem,” “This Is How Americans Really Feel About Uber and Lyft,” “Does Uber Have a Sexual Assault Problem?”)
