Ride-Hailing Apps Have a Racism Problem

Travelers with African-American-sounding names suffer longer waits and more cancellations than their white-sounding counterparts.
October 31, 2016

If you want to hail a ride using an app, it pays to have a white-sounding name.

That’s according to a new study that shows that riders with African-American-sounding names are more likely to wait longer to be accepted for a ride or have their trip canceled than people with white-sounding names. The results are contained in a working paper published by the National Bureau of Economic Research.

Researchers from MIT, Stanford University, and the University of Washington studied almost 1,500 rides on defined routes in both Seattle and Boston.

In the Seattle experiments, the researchers found that people with African-American-sounding names typically had to wait 29 seconds for an Uber acceptance and 23 seconds for a Lyft acceptance, compared to just 21 and 19 seconds respectively for riders with white-sounding names.

Results from the Boston experiments show more of a problem for Uber. Here, the researchers found that riders with African-American-sounding names had Uber rides cancelled 10.1 percent of the time, versus just 4.9 percent of the time for those with white-sounding names. Results for Lyft journeys actually showed a small skew in the opposite direction.

Ben Edelman, an associate professor at Harvard Business School who studies the economics of online markets but wasn’t involved in this study, told MIT Technology Review that the work uses “a proper and rigorous methodology” and “shows a troubling bottom line.”

To be fair, the researchers also demonstrate that it’s not a problem specific to ride-hailing services. In Seattle, for instance, an experiment in hailing a taxi directly from the curb showed that white travelers had the first taxi that passed them stop 60 percent of the time; black travelers had the first taxi stop just 20 percent of the time.

Adrian Durbin, Lyft’s director of policy communications, said that the company is “extremely proud of the positive impact Lyft has on communities of color,” and added that the company does “not tolerate any form of discrimination.” Uber had not responded to a request for comment at the time of writing.

This is, of course, a complex and entrenched social issue, for which Uber and Lyft don’t bear sole responsibility. But it is clearly troubling that the information provided to drivers ahead of the proposed journey has a tangible impact on the service that travelers receive. The researchers propose that ride-hailing firms could use a variety of approaches to address the problem—like vetting drivers, disincentivizing cancellations, and reducing the use of names before a ride.

Similar approaches have been posited by academics for adoption by Airbnb, which itself struggles with the issue of discrimination, as Edelman has shown in the past. But these kinds of services have been built on the use of information sharing as a means of building trust and creating a more efficient service (not that it always works). Any move away from that model will be a big step for Airbnb, Uber, or Lyft.

Still, it is a step that needs to be taken. “At Uber and Lyft, as at Airbnb in my findings, platform design all but invites service providers to discriminate,” says Edelman. “Consumers should demand more of these platforms—and so should regulators.”

(Read more: Bloomberg, National Bureau of Economic Research, “Airbnb Isn’t Really Confronting Its Racism Problem,” “This Is How Americans Really Feel About Uber and Lyft,” “Does Uber Have a Sexual Assault Problem?”)
