Artificial intelligence

Facebook used AI for an eye-opening trick

June 18, 2018

Facebook has demonstrated a neat, and slightly creepy, trick: its AI can now automatically open people’s eyes in photos.

Eye-opening: The technology could help save photos in which someone blinked at the wrong moment. It also shows how much easier it is going to become to manipulate images and video in the coming years, thanks to progress in artificial intelligence.

Dueling networks: Facebook’s researchers used what’s known as a “generative adversarial network,” which involves two dueling neural networks. One network learns from a data set (photos of open and closed eyes) and tries to generate synthetic examples. The other tries to tell fakes from the real thing, thereby pushing the first to create more convincing fakes.
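Facebook's actual model and training data are not public, but the adversarial setup itself can be illustrated on a toy problem. The sketch below, a minimal stand-in rather than Facebook's method, trains a one-parameter "generator" against a tiny logistic "discriminator" on 1-D data: the generator shifts random noise, the discriminator tries to tell shifted noise from real samples, and the pressure from the discriminator pushes the generator's output toward the real distribution. All names and numbers here are invented for the example.

```python
import math
import random

random.seed(0)

def sigmoid(a):
    return 1.0 / (1.0 + math.exp(-a))

# "Real" data: samples from a Gaussian centered at 4 (a toy stand-in
# for real photos of open eyes).
REAL_MEAN = 4.0

# Generator: x_fake = mu + z, a single learnable shift applied to noise z.
mu = 0.0
# Discriminator: D(x) = sigmoid(w*x + b), a tiny logistic classifier
# estimating the probability that x is real.
w, b = 0.0, 0.0

lr, batch, steps = 0.05, 64, 2000
for _ in range(steps):
    gw = gb = gmu = 0.0
    for _ in range(batch):
        xr = random.gauss(REAL_MEAN, 1.0)   # real sample
        xf = mu + random.gauss(0.0, 1.0)    # fake sample from the generator
        sr = sigmoid(w * xr + b)            # D(real)
        sf = sigmoid(w * xf + b)            # D(fake)
        # Discriminator ascends log D(real) + log(1 - D(fake)):
        # reward calling real samples real and fakes fake.
        gw += (1 - sr) * xr - sf * xf
        gb += (1 - sr) - sf
        # Generator ascends log D(fake) (the non-saturating objective):
        # reward fooling the discriminator.
        gmu += (1 - sf) * w
    w += lr * gw / batch
    b += lr * gb / batch
    mu += lr * gmu / batch

print(round(mu, 2))  # mu should end up near REAL_MEAN
```

The generator never sees the real data directly; it improves only through the discriminator's feedback, which is the core of the adversarial trick. Facebook's version does the same dance with deep convolutional networks over pixels instead of a single scalar shift.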

Kinda creepy: In testing, Facebook’s eye-opening software often fooled humans, too. But the results can sometimes look a bit strange—if a person’s closed eyes are partly covered by hair, for example. This just goes to show that the underlying system has no idea what eyes actually are.
