The data on which the artificial-intelligence algorithm was trained created a preference for male candidates.
The news: According to a report by Reuters, Amazon began developing an automated system in 2014 to rank job seekers with one to five stars. But last year, the company scrapped the project after seeing it had developed a preference for male candidates in technical roles.
Why? The AI tool was trained on 10 years’ worth of résumés the company had received. Because tech is a male-dominated industry, the majority of those résumés came from men.
The result: The system unintentionally learned to prefer male candidates over female candidates. It reportedly penalized résumés containing the word “women’s” or the names of certain all-women colleges. Although Amazon edited the system to treat these terms neutrally, the company lost confidence that the program was not discriminating along gender lines in other, less obvious ways.
Why it matters: We can’t treat artificial intelligence as inherently unbiased. Training the systems on biased data means the algorithms also become biased. If unfair AI hiring programs like this aren’t uncovered before being implemented, they will perpetuate long-standing diversity issues in business rather than solve them.
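The mechanism is easy to reproduce in miniature. The sketch below is entirely hypothetical — synthetic data and a naive word-scoring model, not Amazon's actual system — but it shows how a scorer trained on skewed historical hiring decisions assigns a negative weight to a word like “women’s” even though gender is never an explicit feature:

```python
# Hypothetical illustration: a naive-Bayes-style résumé scorer learns
# per-word weights from past hiring outcomes. Because the synthetic
# "hired" résumés skew toward one group, words correlated with the
# other group pick up negative weights automatically.
from collections import Counter
import math

# Toy historical data: (résumé tokens, was the candidate hired?)
training = [
    (["java", "chess", "club"], True),
    (["python", "java", "systems"], True),
    (["c++", "kernel", "systems"], True),
    (["python", "women's", "chess", "club"], False),
    (["java", "women's", "coding", "society"], False),
]

hired, rejected = Counter(), Counter()
for tokens, was_hired in training:
    (hired if was_hired else rejected).update(tokens)

def word_weight(word, smoothing=1.0):
    # Smoothed log-odds of the word appearing in a hired vs. rejected résumé.
    p = (hired[word] + smoothing) / (sum(hired.values()) + 2 * smoothing)
    q = (rejected[word] + smoothing) / (sum(rejected.values()) + 2 * smoothing)
    return math.log(p / q)

def score(tokens):
    return sum(word_weight(t) for t in tokens)

# "women's" gets a negative weight purely from the skewed history,
# so an otherwise identical résumé is ranked lower for containing it.
print(word_weight("women's") < 0)
print(score(["python", "systems"]) > score(["python", "systems", "women's"]))
```

Making the flagged terms neutral (zeroing their weights) doesn't solve the problem: any other token that happens to correlate with gender in the training data carries the same bias, which is why Amazon could not be confident the system was neutral overall.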