Though the company promised a fix months ago, Facebook’s ad system still allows advertisers to target people in ways that could run afoul of antidiscrimination laws.

The investigative journalism shop ProPublica has been on the case for over a year now. In its initial investigation, reporters who bought ads on the social-media platform were able to block anyone with an “affinity” for African-American, Asian-American, or Hispanic people from seeing them. That potentially put Facebook in violation of the Fair Housing Act, which makes it illegal to discriminate in housing against certain protected groups. In response, Facebook announced an antidiscrimination initiative in February that included an automated system to spot problematic ads.

A new story from ProPublica out this week suggests things haven’t changed much. Its reporters were still able to buy housing ads that blocked them from being shown to “African Americans, mothers of high school kids, people interested in wheelchair ramps, Jews, expats from Argentina and Spanish speakers.” All of these groups fall under classes protected by the Fair Housing Act.

This latest finding adds to what is becoming a litany of problems for Facebook’s ad targeting system. As we well know by now, Russian accounts bought political ads that were shown to millions of Americans as part of an effort to sway the 2016 presidential election. And yet another ProPublica investigation recently showed that people could buy ads that targeted “Jew haters.”

Facebook hasn’t described in detail how the automated antidiscrimination system is supposed to work, beyond saying that it involves a machine-learning algorithm meant to improve over time. If an ad is not approved, advertisers can request a manual review. But the algorithm let every one of ProPublica’s ads through, so whatever technique Facebook is using, it still isn’t up to the task of policing how people use (or misuse) its ad platform.
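Since Facebook has not published its design, we can only guess at the shape of such a system. One plausible architecture combines a learned risk score with hard rules on targeting exclusions, falling back to human review for borderline cases. The sketch below is purely illustrative: the category names, the `PROTECTED_CLASS_PROXIES` list, and the threshold are assumptions, not anything Facebook has confirmed.

```python
# Hypothetical sketch of an automated ad-review pipeline.
# Facebook has not described its actual system; everything here is assumed.

from dataclasses import dataclass, field

# Assumed mapping: targeting categories that act as proxies for
# Fair Housing Act protected classes (race, religion, familial status,
# disability, national origin, etc.).
PROTECTED_CLASS_PROXIES = {
    "ethnic_affinity", "religion", "familial_status",
    "disability_interest", "national_origin", "primary_language",
}

@dataclass
class Ad:
    category: str                          # e.g. "housing", "employment", "retail"
    excluded_audiences: set = field(default_factory=set)

def review_ad(ad: Ad, risk_score: float, threshold: float = 0.5) -> str:
    """Return 'approved', 'rejected', or 'manual_review'.

    risk_score stands in for the output of a learned classifier;
    the threshold and rules below are illustrative assumptions.
    """
    # Rule-based check: housing ads may not exclude audiences that
    # serve as proxies for protected classes.
    if ad.category == "housing" and ad.excluded_audiences & PROTECTED_CLASS_PROXIES:
        return "rejected"
    # Model-based check: flag ads the classifier deems risky, and let
    # the advertiser request a human look (the fallback Facebook describes).
    if risk_score >= threshold:
        return "manual_review"
    return "approved"

# Example: a housing ad that excludes an ethnic-affinity audience is rejected
# outright, regardless of what the classifier scores it.
ad = Ad(category="housing", excluded_audiences={"ethnic_affinity"})
print(review_ad(ad, risk_score=0.2))  # -> "rejected"
```

The point of the hard-coded rule is that a pure classifier can miss proxies for protected classes, which is consistent with what ProPublica found: whatever model Facebook deployed, it approved ads a simple exclusion check should have caught.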