
All automated hiring software is prone to bias by default

December 13, 2018

A new report from the nonprofit Upturn analyzed some of the most prominent hiring algorithms on the market and found that, by default, such algorithms are prone to bias.

The hiring steps: Algorithms have been built to automate four primary stages of the hiring process: sourcing, screening, interviewing, and selection. The analysis found that while predictive tools were rarely used to make the final choice of whom to hire, they were commonly used throughout these stages to reject candidates.

When bias creeps in: If past hiring data is used for training, the software inherits whatever biases or unrepresentative patterns that data contains. Simply removing data about gender and race won’t keep bias out: other pieces of information, like distance from the office, can correlate strongly with more sensitive attributes. Amazon ran into exactly this problem with a hiring algorithm it tried to develop. Hiring managers can also be prone to giving too much credence to the recommendations hiring algorithms make.
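
To see how a proxy can smuggle bias back in even after protected attributes are removed, here is a minimal sketch in Python using synthetic data. The commute-distance feature, group labels, and all numbers are illustrative assumptions, not drawn from the report or from any vendor’s product:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 10_000

    # Protected attribute: the model never sees this column.
    group = rng.integers(0, 2, size=n)

    # Proxy feature: commute distance correlates with group membership,
    # for example because of historical housing patterns.
    commute_km = rng.normal(10 + 15 * group, 5.0, size=n)

    # Job-relevant signal, independent of group.
    skill = rng.normal(0.0, 1.0, size=n)

    # Biased historical labels: past decisions penalized long commutes.
    hired = (skill - 0.08 * commute_km + rng.normal(0, 1, n) > -0.5).astype(int)

    # Train only on "neutral" features; gender, race, and group are excluded.
    X = np.column_stack([commute_km, skill])
    model = LogisticRegression().fit(X, hired)
    selected = model.predict(X)

    for g in (0, 1):
        print(f"group {g}: selection rate {selected[group == g].mean():.1%}")
    # The rates diverge sharply even though the protected attribute was
    # removed: commute distance carried the old bias into the new model.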

Regulating AI: The study also found that current laws aren’t equipped to deal with the problem: rules designed to regulate human hiring decisions don’t easily allow for investigation or enforcement when the discrimination comes from a machine.

The positives: On the other hand, predictive tools can help uncover human biases and assumptions that previously went unnoticed, giving companies a chance to improve their practices. “Several vendors are taking steps not only to audit their products for bias, but building equity-promoting features into their products,” says one of the report’s authors, Miranda Bogen. “We hope more vendors will embrace these types of interventions, but there’s still a long way to go before we can be confident that predictive tools aren’t causing more harm than benefit.”

Fixing the problem: To make the use of hiring AI fairer, the report recommends:

     —allowing independent auditing of employer and vendor software (one such check is sketched after this list)
     —having governments update their regulations to cover predictive hiring software
     —scrutinizing ad and job platforms more closely to understand their growing influence on hiring
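
One concrete form an independent audit can take is a disparate-impact check. US employment-discrimination practice often applies the “four-fifths rule”: a tool’s selection rate for any group should be at least 80% of the highest group’s rate. The report doesn’t prescribe this particular test, so the sketch below is an illustrative assumption, with hypothetical group names and counts:

    def adverse_impact_ratios(selections):
        """Return each group's selection rate divided by the highest
        group's rate. Ratios below 0.8 flag potential disparate impact
        under the four-fifths rule.

        `selections` maps group name -> (number selected, number applied).
        """
        rates = {g: sel / applied for g, (sel, applied) in selections.items()}
        top = max(rates.values())
        return {g: rate / top for g, rate in rates.items()}

    # Hypothetical counts taken from a screening tool's decision logs.
    ratios = adverse_impact_ratios({
        "group_a": (120, 400),  # 30% selected
        "group_b": (45, 300),   # 15% selected
    })
    for group, ratio in ratios.items():
        status = "FLAG" if ratio < 0.8 else "ok"
        print(f"{group}: impact ratio {ratio:.2f} [{status}]")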

“Because there are so many different points in that process where biases can emerge, employers should definitely proceed with caution,” says Bogen. “They should be transparent about what predictive tools they are using and take whatever steps they can to proactively detect and address biases that arise—and if they can’t confidently do that, they should pull the plug.”

This article was originally published in our future of work newsletter, Clocking In. You can sign up here.
