Computer vision algorithms aren’t the only forms of “intelligence” that can be tricked by manipulated photos.
Uh-oh: Researchers have used AI to design the first photos that fool both humans and computer vision algorithms. In the example above, an unaltered image of a cat on the left sits next to a version that’s been tweaked to look weirdly like a dog.
For science! Finding human weaknesses in this way could improve our AI systems. In the paper that describes these manipulated photos—coauthored by Ian Goodfellow, the creator of generative adversarial networks (GANs)—the researchers point out that if a class of altered images can fool the human mind as well, it suggests that a “similar mechanism” could be at work in machine-learning systems. Most neural networks are loosely inspired by the human brain, after all.
Why it matters: Now that there are stickers that can be put on physical objects to confuse computer vision systems, so-called adversarial examples are a real-world problem. And if autonomous vehicles don’t have a way to make sure their systems can see every stop sign, they won’t be ready for the road anytime soon.
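To make the idea concrete, here is a minimal sketch of the fast gradient sign method (FGSM), one standard way to craft adversarial examples, applied to a hypothetical toy logistic-regression "classifier" (the model, weights, and inputs are illustrative assumptions, not the setup from the paper):

```python
import numpy as np

# Toy "model": logistic regression over a 4-pixel image (hypothetical weights).
w = np.array([0.5, -1.0, 0.8, 0.3])
b = 0.1

def predict(x):
    """Probability that input x belongs to class 1."""
    return 1.0 / (1.0 + np.exp(-(x @ w + b)))

def fgsm(x, y_true, epsilon=0.1):
    """Nudge each pixel by epsilon in the direction that increases the loss."""
    p = predict(x)
    # Gradient of the cross-entropy loss w.r.t. the input is (p - y) * w.
    grad_x = (p - y_true) * w
    return x + epsilon * np.sign(grad_x)

x = np.array([0.2, 0.4, 0.1, 0.9])       # "clean" image, true class = 1
x_adv = fgsm(x, y_true=1.0, epsilon=0.5)  # adversarially perturbed image
print(predict(x), predict(x_adv))         # confidence in class 1 drops
```

The perturbation is small and structured rather than random noise, which is why adversarial examples can be printed as stickers and still fool a classifier in the physical world.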