
Google’s New AI Smile Detector Shows How Embracing Race and Gender Can Reduce Bias

December 4, 2017

Computer vision is becoming increasingly good at recognizing different facial expressions, but for certain groups that aren’t adequately represented in training data sets, like racial minorities or women with androgynous features, algorithms can still underperform.

A new paper posted to arXiv by Google researchers improves on state-of-the-art smile-detection algorithms by training racial and gender classifiers alongside the smile detector in the same model. The classifiers were trained on four racial subgroups and two gender subgroups (the researchers don't name the racial groups, but the images appear to include Asian, black, Hispanic, and white people).
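The general idea of training demographic classifiers alongside the main task can be sketched as a combined loss: the smile-detection loss plus down-weighted auxiliary losses for the race and gender heads. This is a minimal illustration, not the paper's actual implementation; the `aux_weight` hyperparameter and the loss structure are assumptions.

```python
import math

def cross_entropy(probs, label):
    """Negative log-likelihood of the true class."""
    return -math.log(probs[label])

def multitask_loss(smile_probs, smile_label,
                   race_probs, race_label,
                   gender_probs, gender_label,
                   aux_weight=0.5):
    """Combine the primary smile loss with auxiliary race and
    gender losses. aux_weight is a hypothetical hyperparameter,
    not a value from the paper."""
    loss = cross_entropy(smile_probs, smile_label)
    loss += aux_weight * cross_entropy(race_probs, race_label)
    loss += aux_weight * cross_entropy(gender_probs, gender_label)
    return loss
```

During training, gradients from the auxiliary heads flow back into the shared features, encouraging representations that work well across demographic subgroups rather than only for the majority group.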

Their method reached nearly 91 percent accuracy at detecting smiles on the Faces of the World (FotW) data set, a collection of 13,000 face images gathered from the Web that is sometimes used as a benchmark for such algorithms. That is an improvement of a little over 1.5 percentage points on the previous mark. Accuracy improved across every subgroup, suggesting that paying attention to race and gender can yield better results than trying to build an algorithm that is "color blind."
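Measuring improvement "across the board" means computing accuracy per demographic subgroup rather than a single overall number. A minimal sketch of that breakdown, with hypothetical subgroup tags (the paper's exact category names are not given):

```python
from collections import defaultdict

def accuracy_by_subgroup(predictions, labels, subgroups):
    """Per-subgroup smile-detection accuracy.

    predictions/labels: binary smile predictions and ground truth.
    subgroups: a demographic tag per example (hypothetical tags;
    the paper does not name its racial categories).
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for pred, label, group in zip(predictions, labels, subgroups):
        total[group] += 1
        if pred == label:
            correct[group] += 1
    return {g: correct[g] / total[g] for g in total}
```

A model can score well overall while underperforming on a minority subgroup, which is exactly the failure mode this kind of per-group audit exposes.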

Many researchers are hesitant to include classifiers like this, on the assumption that it's easier to be guilty of bias (or at least be accused of it) when a system has explicit racial or gender categories. The Google team's results suggest that the effort of training racial or gender classifiers can actually reduce bias. The researchers also used neutral labels like "Gender 1" and "Gender 2" to avoid introducing unconscious, societal bias wherever possible.

Even with the promising results and the care taken to stay alert to bias in all its forms, the researchers included a section in their paper called "Ethical Considerations," in which they take pains to note that their work is not intended to "motivate race and gender identification as an end-goal." They also point out that there is no "gold standard" for breaking down racial categories, and that future work might treat gender as a spectrum rather than a binary state.
