This company audits algorithms to see how biased they are
Mathematician Cathy O’Neil is offering businesses a chance to test their algorithms for fairness.
Opening the black box: As artificial-intelligence systems get more advanced, the logic paths they follow can be difficult or even impossible to understand, creating a so-called “black box.” As these algorithms come to control increasingly important parts of our lives, like whether we get a job or a loan, it is crucial to understand their biases and decisions.
The solution: O’Neil, who wrote the book Weapons of Math Destruction, has started O’Neil Risk Consulting and Algorithmic Auditing to perform third-party audits on algorithms. It examines everything from the people who programmed the software to the training data to the output, flagging any bias in the process.
Why would businesses choose to do this? Companies aren’t knocking down her door yet (she has only six clients). But they should be: not only is it in society’s best interest, it’s also good marketing. Getting your algorithm certified for fairness can prove to your customers that your service is equitable, effective, and trustworthy.