There’s a weapon of mass manipulation that Facebook is reportedly struggling to overcome: Photoshop.
Fake pictures: The Wall Street Journal explains that doctored images were a “crucial and deceptively simple technique used by Russian propagandists to spread fabricated information during the 2016 election.”
Pictures are perfect: People love images and share them more frequently on social media than text-only posts. Editing them to inject fake content or misleading messages is a useful way to spread propaganda fast.
Hard to block: AI may be able to identify objects in images, but understanding how and why pictures have been altered is more difficult—not least because it requires an understanding of context. And in any case, Facebook is nervous about censoring content and doesn’t even automate the removal of text posts, which should be an easier task.
Next up: The Journal says the social network is working on systems that could identify when images are used out of context. But expect fake images to circulate on news feeds for a while yet.