Airbnb has a racism problem—and will have to make dramatic changes to shake it off.
Researchers from Harvard Business School have found that people with names that sound African-American are 16 percent less likely to get a positive response to a room request than people with white-sounding names. When those results were first published last year, Airbnb’s head of diversity, David King, said that the company faced “significant challenges” to overcome the issue.
It’s likely, though unproven by rigorous study, that Uber faces a similar problem. Certainly, when the ride-hailing company refused to add a tipping function to its app earlier this year, it cited its customers’ unconscious racial biases as a reason.
Airbnb has now published a report outlining its plans to tackle discrimination. The plan was written by Laura Murphy, a civil rights lawyer who has worked at the American Civil Liberties Union. Its main message: that the company is establishing “a permanent, full-time team of engineers, data scientists, researchers and designers whose sole purpose is to advance belonging and inclusion and to root out bias.”
In particular, the report suggests that the company will experiment with reducing the prominence of guest photos during the booking process, encourage adoption of its Instant Book service, and find accommodations for anyone who has been discriminated against. At the time of writing, Airbnb had not responded to questions about further measures the team might take.
The question is: will its interventions be enough?
“It is possible to prevent racial discrimination on Airbnb using a technological solution. The key task is picking the right solution,” explains Ben Edelman, the lead author of the Harvard Business School report that challenged the company. “Airbnb’s proposed steps do not seem likely to succeed.”
In fact, Edelman outlined some of the ways that he believes Airbnb could prevent discrimination earlier this year—none of which have made the company’s list.
“The natural approach is to conceal the information about race that is giving rise to discrimination,” Edelman says. As an example, he points to a famous decision made by the Boston Symphony Orchestra in 1952: it began auditioning players blind. It quickly became less segregated by gender, age, and race.
Jamila Jefferson-Jones, who teaches law at the University of Missouri, Kansas City, agrees. “I think that profile pictures should either be eliminated or only shared after the booking is confirmed [and] that names may need to be treated the same way,” she says. At its most extreme, this approach is comparable to eBay’s, and would mean that users relied solely on ratings and reviews to judge whether or not to transact with someone.
That would be a big step for Airbnb. Until now, a core part of its business model has been extensive sharing of information as a way to build trust. Its Instant Book feature is a step in this direction, but it’s underused and still contains loopholes that allow users to cancel bookings after seeing a guest’s profile.
If the radical step of using pseudonyms is too much for the company, Jefferson-Jones suggests it could ask guests to add other, potentially more useful, details—such as interests or a reason for the trip. There would still be the problem of the ratings that are awarded after the guest’s stay, though. “That’s a much harder nut to crack,” she says, because they are usually based on in-person interactions.
It is unfair to ask a company like Airbnb to solve purely social problems. But we should ask of it a genuine commitment to the cause—which may, report or otherwise, be lacking. “My personal account was suspended for about a year in response to the data collection for my article on this subject,” Edelman says. “If fixing discrimination is truly Airbnb’s top priority ... then why ban research about it?”