
Facebook has been charged over housing ads that discriminate on race, color, and religion

March 29, 2019

The US Department of Housing and Urban Development (HUD) says Facebook is breaking the law by allowing housing ads to be targeted in this way.

The details: HUD claims Facebook broke the Fair Housing Act by “encouraging, enabling, and causing” discrimination through its advertising platform. It claims that Facebook allows would-be advertisers to draw a red line around certain neighborhoods where they do not want to advertise. Advertisers can also choose not to advertise to users who have certain interests, such as “Hijab fashion” or “Hispanic culture.” Facebook said it was “surprised” by the decision.

The claims: HUD says Facebook mines data about users, then uses that to determine who sees which ads, based on “protected characteristics,” which include race, color, religion, and sex. HUD says this behavior is “just like an advertiser who intentionally targets or excludes users based on their protected class.” 

Timing: Facebook settled on exactly this issue with the ACLU and two other groups just last week. It has promised to stop anyone running housing, employment, or credit advertisements from targeting by location, age, race, or gender.

Facebook has given itself a generous timeline to do so. It says it will enact the changes by the end of the year, despite the fact that such practices are illegal right now. That lag, and the fact that it isn’t currently complying with the law, leaves it open to more lawsuits like this one from HUD.

Long-standing issues: Facebook has known about this specific problem since 2016. Legal action might finally hasten its response.
