MIT Technology Review

The guy who made a tool to track women in porn videos is sorry

The programmer supposedly used face recognition to match social-media photos with images from porn sites. Collecting that data would have been illegal in some countries but not others.

An anonymous programmer based in Germany caused outrage this week for supposedly using face-recognition technology to “catch” women who had appeared in porn. He says he has since deleted the project and all its data, but not out of altruism: such a project would have violated European privacy law anyway, even if it might have been legal elsewhere.

There is still no proof that the global system—which allegedly matched women’s social-media photos with images from sites like Pornhub—actually worked, or even existed. Still, the technology is possible and would have had awful consequences. “It’s going to kill people,” says Carrie A. Goldberg, an attorney who specializes in sexual privacy violations and author of the forthcoming book Nobody’s Victim: Fighting Psychos, Stalkers, Pervs, and Trolls. “Some of my most viciously harassed clients have been people who did porn, oftentimes one time in their life and sometimes nonconsensually [because] they were duped into it. Their lives have been ruined because there’s this whole culture of incels that for a hobby expose women who’ve done porn and post about them online and dox them.” (Incels, or “involuntary celibates,” are a misogynistic online subculture of men who claim they are denied sex by women.)

The European Union’s GDPR privacy law prohibits exactly this kind of data collection. Though the programmer—who posted about the project on the Chinese social network Weibo—originally insisted everything was fine because he didn’t make the information public, just collecting the data is illegal if the women didn’t consent, according to Börge Seeger, a data protection expert and partner at German law firm Neuwerk. These rules apply to any information about EU residents, so they would have held even if the programmer weren’t living in the EU.

Under GDPR, personal data (and especially sensitive biometric data) needs to be collected for specific and legitimate purposes. Scraping data to figure out if someone once appeared in porn is not that. And if the programmer had charged money to access this information, he could have faced up to three years in prison under German criminal law, adds Seeger.

Women in the US have some protections too. Though there’s no federal privacy law, California has strong privacy legislation that would block this type of data collection, explains Christina Gagnier, a lawyer and adjunct faculty teaching privacy at UC Irvine School of Law. Because California has so many residents and industries, and data travels across state lines, the state ends up setting privacy law for the rest of the nation. It would be illegal for someone in South Dakota, for example, to set up this database while using California data — which would be hard to avoid given that the porn industry is based in Los Angeles County. 

That still leaves people in many other countries vulnerable. And enforcement of these laws is tricky, adds Gagnier. Data protection authorities in individual countries are responsible for compliance, but they need to choose their battles and it can be hard to serve people with lawsuits. Reached last night via Weibo, the programmer (who did not give his real name) insisted that the technology was real, but acknowledged that it raised legal issues. He’s sorry to have caused trouble. But he’s not the only one able to build this technology, or the only one interested in using it for dangerous purposes. Policymakers concerned with global privacy law need to start thinking ahead.

This story was updated with a comment about enforcement from Gagnier.
