
Microsoft is creating an oracle for catching biased AI algorithms

As more people use artificial intelligence, they will need tools that detect unfairness in the underlying algorithms.

Microsoft is building a tool to automatically identify bias in a range of different AI algorithms. It is the boldest effort yet to automate the detection of unfairness that may creep into machine learning—and it could help businesses make use of AI without inadvertently discriminating against certain people.

Big tech companies are racing to sell off-the-shelf machine-learning technology that can be accessed via the cloud. As more customers make use of these algorithms to automate important judgements and decisions, the issue of bias will become crucial. And since bias can easily creep into machine-learning models, ways to automate the detection of unfairness could become a valuable part of the AI toolkit.

“Things like transparency, intelligibility, and explanation are new enough to the field that few of us have sufficient experience to know everything we should look for and all the ways that bias might lurk in our models,” says Rich Caruana, a senior researcher at Microsoft who is working on the bias-detection dashboard.

Algorithmic bias is a growing concern for many researchers and technology experts (see “Inspecting algorithms for bias”). As algorithms are used to automate important decisions, there is a risk that bias could become automated, deployed at scale, and more difficult for the victims to spot.

Caruana says Microsoft’s bias-catching product will help AI researchers catch more instances of unfairness, although not all. “Of course, we can’t expect perfection—there’s always going to be some bias undetected or that can’t be eliminated—the goal is to do as well as we can,” he says.

“The most important thing companies can do right now is educate their workforce so that they’re aware of the myriad ways in which bias can arise and manifest itself and create tools to make models easier to understand and bias easier to detect,” Caruana adds.

Facebook announced its own tool for detecting bias at its annual developer conference on May 2. Its tool, called Fairness Flow, automatically warns if an algorithm is making an unfair judgement about someone based on his or her race, gender, or age. Facebook says it needed Fairness Flow because more and more people at the company are using AI to make important decisions.
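Neither company's announcement spells out how its tool works internally, but the simplest form of such an automated check is to compare a model's behavior across demographic groups and raise a warning when the gap is large. The Python sketch below is a hypothetical illustration of that idea, not Fairness Flow or Microsoft's dashboard: it measures the difference in positive-prediction rates between groups and flags the model if the gap exceeds a chosen threshold.

```python
# Minimal sketch of an automated bias check (illustrative only; not the
# actual Fairness Flow or Microsoft tool). It compares a model's
# positive-prediction rate across demographic groups and reports the gap.
from collections import defaultdict

def demographic_parity_gap(predictions, groups):
    """Return (largest gap in positive rate between groups, per-group rates).

    predictions: iterable of 0/1 model outputs
    groups: iterable of group labels (e.g. age bracket) for the same rows
    """
    totals = defaultdict(int)
    positives = defaultdict(int)
    for pred, group in zip(predictions, groups):
        totals[group] += 1
        positives[group] += pred
    rates = {g: positives[g] / totals[g] for g in totals}
    return max(rates.values()) - min(rates.values()), rates

# Example: flag the model for human review if the gap exceeds a threshold.
gap, rates = demographic_parity_gap(
    [1, 0, 1, 1, 0, 0], ["a", "a", "a", "b", "b", "b"]
)
if gap > 0.2:  # the threshold is arbitrary and context-dependent
    print(f"Possible bias: positive-prediction rates by group: {rates}")
```

Real tools go further, looking at error rates, calibration, and other measures per group, but the basic pattern of slicing a model's outputs by attributes such as race, gender, or age is the same.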

Bin Yu, a professor at UC Berkeley, says the tools from Facebook and Microsoft seem like a step in the right direction, but may not be enough. She suggests that big companies should have outside experts audit their algorithms in order to prove they are not biased. “Someone else has to investigate Facebook's algorithms—they can't be a secret to everyone,” Yu says.
