This company audits algorithms to see how biased they are
Mathematician Cathy O’Neil is offering businesses a chance to test their algorithms for fairness.
Opening the black box: As artificial-intelligence systems get more advanced, the logic paths they follow can be difficult or even impossible to understand, creating a so-called “black box.” As these algorithms come to control increasingly important parts of our lives, like whether we get a job or a loan, it is crucial to understand their biases and decisions.
The solution: O’Neil, who wrote the book Weapons of Math Destruction, has started O’Neil Risk Consulting and Algorithmic Auditing to perform third-party audits of algorithms. The firm examines everything from the people who programmed the software to the training data to the output, flagging any bias it finds along the way.
Why would businesses choose to do this? Companies aren’t knocking down her door yet (she has only six clients). But they should be: not only is it in society’s best interest, it’s also good marketing. Getting your algorithm certified for fairness can prove to your customers that your service is equitable, effective, and trustworthy.
Deep Dive
Policy
What happened to the microfinance organization Kiva?
A group of strikers argue that the organization seems more focused on making money than creating change. Are they right?
How one elite university is approaching ChatGPT this school year
Why Yale never considered banning the technology.
Six ways that AI could change politics
A new era of AI-powered domestic politics may be coming. Watch for these milestones to know when it’s arrived.
Cryptography may offer a solution to the massive AI-labeling problem
An internet protocol called C2PA adds a “nutrition label” to images, video, and audio.