Facebook has blocked Cambridge Analytica, which provided data to the Trump election campaign
The news: In a blog post late on March 16, Facebook’s deputy general counsel said Strategic Communications Laboratories (SCL) and its political data arm, Cambridge Analytica, had their access to the platform suspended for violating data use policies.
The details: According to Facebook, Aleksander Kogan, a psychology professor at Cambridge University in the UK, had collected various kinds of personal data using an app on the social network. Billed as a research vehicle for psychologists, the app, downloaded by approximately 270,000 people, asked them to share things such as content they had liked and the city they’d listed on their Facebook profile.
The social network says that Kogan passed the information to SCL/Cambridge Analytica and Christopher Wylie of Eunoia Technologies in violation of its rules. When it discovered this in 2015, it removed the app and asked Kogan, SCL/Cambridge Analytica, and Wylie to certify they had destroyed the data collected. All three parties said they had done so.
But according to its post, Facebook recently received reports that not all the information had been deleted. The social network says it’s looking into these claims and is suspending Kogan, Wylie, and SCL/Cambridge Analytica pending further information.
But wait, there’s more: In a related story on March 17, the New York Times reported that Cambridge Analytica collected private information from more than 50 million Facebook profiles without permission, in what it described as “one of the largest data leaks in the social network’s history.” The story cites documents and former employees, including on-the-record statements from Wylie.
Why this matters: The use of social media to target political messages during the 2016 election has caused plenty of controversy. Facebook’s move will be seen as a sign that it is taking seriously the accusations that its platform was used to manipulate public opinion, but it also raises the question of whether other data has been siphoned off without the network’s knowledge.