This company audits algorithms to see how biased they are
Mathematician Cathy O’Neil is offering businesses a chance to test their algorithms for fairness.
Opening the black box: As artificial-intelligence systems get more advanced, the logic paths they follow can be difficult or even impossible to understand, creating a so-called “black box.” As these algorithms come to control increasingly important parts of our lives, like whether we get a job or a loan, it is crucial to understand their biases and decisions.
The solution: O’Neil, who wrote the book Weapons of Math Destruction, has started O’Neil Risk Consulting and Algorithmic Auditing to perform third-party audits on algorithms. It examines everything from the people who programmed the software to the training data to the output, flagging any bias in the process.
Why would businesses choose to do this? Companies aren’t knocking down her door yet (she has only six clients). But they should be: not only is it in society’s best interest, it’s also good marketing. Getting your algorithm certified for fairness can prove to your customers that your service is equitable, effective, and trustworthy.
Deep Dive
Policy
Is there anything more fascinating than a hidden world?
Some hidden worlds, whether in space, deep in the ocean, or in the form of waves or microbes, remain stubbornly unseen. Here’s how technology is being used to reveal them.
A brief, weird history of brainwashing
L. Ron Hubbard, Operation Midnight Climax, and stochastic terrorism—the race for mind control changed America forever.
What Luddites can teach us about resisting an automated future
Opposing technology isn’t antithetical to progress.
Africa’s push to regulate AI starts now
AI is expanding across the continent and new policies are taking shape. But poor digital infrastructure and regulatory bottlenecks could slow adoption.