
Microsoft’s CEO Calls for Accountable AI, Ignores the Algorithms That Already Rule Our Lives

Satya Nadella warns that the smart software of the future may be capable of discrimination. In fact, biased algorithms are already here.
June 29, 2016

Microsoft CEO Satya Nadella is concerned about the power artificial intelligence will wield over our lives. In a post on Slate yesterday, he advised the computing industry to start thinking now about how to design intelligent software that respects our humanity.

“The tech industry should not dictate the values and virtues of this future,” he wrote.

Nadella called for “algorithmic accountability so that humans can undo unintended harm.” He said that smart software must be designed in ways that let us inspect its workings and prevent it from discriminating against certain people or using private data in unsavory ways.


These are noble and rational concerns—but ones tech leaders should have been talking about some time ago. There is ample evidence that the algorithms and software shaping daily life are already marked by troubling biases.

Studies from the Federal Trade Commission have found signs that racial and economic biases decried in pre-Internet times are now reappearing in the systems powering targeted ads and other online services. In Wisconsin, a fight is under way over whether the workings of a system that tries to predict whether a criminal will reoffend—and is used to determine jail terms—should be kept secret.

Just today, the ACLU filed suit against the U.S. government on behalf of researchers who plan to look for racial discrimination in online job and housing ads. They can't carry out that work because of restrictions in federal hacking law and the way tech firms write their terms and conditions.

It’s clear that some of the problems Nadella says could be created by future artificial intelligence are in fact already here. Microsoft researcher Kate Crawford nicely summarized the root of algorithmic bias in a recent New York Times op-ed, writing that software “may already be exacerbating inequality in the workplace, at home and in our legal and judicial systems.”

Nadella concludes his forward-looking post on artificial intelligence by saying: “The most critical next step in our pursuit of A.I. is to agree on an ethical and empathic framework for its design.” What better way to be ready for the AI-dominated future than to start work now on applying an ethical and empathic framework to the “dumb” software that already surrounds us?

(Read more: Slate, Vice, Ars Technica, New York Times)
