Is the Gig Economy Rigged?
A new study suggests that racial and gender bias affect the freelancing websites TaskRabbit and Fiverr—and may be baked into underlying algorithms.
Apps and sites that can be used to hire people for individual tasks like picking up groceries or designing a new logo have taken off in recent years, promising a more efficient and fairer marketplace for employment. However, a new study out of Northeastern University in Boston suggests that racial and gender discrimination may be common on two popular “gig economy” platforms.
Researchers led by Christo Wilson, an assistant professor at Northeastern, and Ancsa Hannák, a PhD student, examined TaskRabbit, a platform for hiring people to run errands, and Fiverr, a marketplace for creative services. On both, they found evidence of bias along racial and gender lines.
It’s just one example of how bias creeps into online platforms and services. And it’s troubling because the gig economy promised to be not only more efficient and flexible, but also less biased—since algorithms do the work of connecting people.
On Fiverr, the researchers found evidence that black and Asian workers received lower ratings than white workers. And on TaskRabbit, women received fewer reviews than men, and black workers received lower ratings than white ones. Perhaps most troubling, the researchers also found evidence of such bias in TaskRabbit’s recommendation algorithm. The research will be presented at an academic conference in New York this week.
It’s impossible to say for certain that the correlation identified by Wilson and Hannák is due to racial and gender bias on the part of hirers, as opposed to some unknown confounding factor, but Wilson says the pattern is concerning. “We’re told this is the future of labor,” he says. “If you’re going to roll out an algorithm that’s going to be used by millions of people, you have some kind of responsibility to the public to examine what you’re deploying, evaluate it, and see if it’s going to have any of these negative side effects.”
A spokesperson for Fiverr argues that the study’s methodology was flawed in that it ignores factors such as international boundaries and language differences. She also notes that users do not have to provide any demographic information in order to use the service, which she says makes it possible for workers to avoid discrimination. TaskRabbit did not respond to a request for comment.
There is, however, growing evidence that bias can affect all sorts of digital services. Last month, researchers from MIT, Stanford, and the University of Washington discovered that Uber drivers in Boston canceled trips more often for customers with African-American-sounding names, and that black Uber customers in Seattle faced longer wait times than their white counterparts. In a study published last year, researchers at Carnegie Mellon University found evidence that ads for high-paying jobs were shown more often to men than to women.
In many cases the bias observed online simply reflects what’s found in the real world, such as the conscious and subconscious prejudice employers may bring to hiring decisions. So for recommendation engines or machine-learning systems, the question is how bias might be removed, either from the data sets fed to algorithms or from the algorithms themselves.
“People have this idea that because it’s a computer it’s neutral,” Wilson adds. “If you have data that’s biased, it makes sense that you’re going to train an algorithm that’s biased.”
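The dynamic Wilson describes can be illustrated with a minimal, entirely hypothetical simulation (the numbers and the bias term below are assumptions for illustration, not findings from the study): if reviewers systematically under-rate one group of equally skilled workers, even a simple recommendation rule that ranks workers by average rating will reproduce that bias in who it surfaces.

```python
# Hypothetical sketch: biased ratings data feeding a naive ranking algorithm.
import random

random.seed(0)

# Simulate two equally skilled groups of 100 workers each. Group "A" ratings
# carry a small negative reviewer bias (an assumption for illustration only).
workers = []
for i in range(200):
    group = "A" if i < 100 else "B"
    skill = random.gauss(4.0, 0.3)           # true quality: identical distribution for both groups
    bias = -0.2 if group == "A" else 0.0     # reviewer bias applied against group A
    ratings = [min(5.0, max(1.0, skill + bias + random.gauss(0, 0.5)))
               for _ in range(20)]
    workers.append({"group": group, "avg_rating": sum(ratings) / len(ratings)})

# A naive "recommendation algorithm": rank by average rating, surface the top 20.
top = sorted(workers, key=lambda w: w["avg_rating"], reverse=True)[:20]
share_a = sum(w["group"] == "A" for w in top) / len(top)
print(f"Group A share of top recommendations: {share_a:.0%}")
```

Although both groups are drawn from the same skill distribution, group A ends up underrepresented among the top-ranked workers, because the ranking faithfully learns the bias baked into its input data rather than the workers’ underlying quality.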
Don MacKenzie, an assistant professor at the University of Washington and one of the authors of the recent Uber study, stresses that the study doesn’t prove racial or gender bias is at play. But he says it is important to consider bias in the gig economy and underlying algorithms—adding that the problem should be manageable if companies are careful.
“This is an emerging area, and if there is a set of best practices, I am not aware of it,” MacKenzie says. “From my perspective, companies, developers, and data scientists should be watchful, listen to feedback, and not be afraid to try out different solutions. I think if everyone approaches these issues in good faith, constructively, and with a willingness to try different things, we can get closer to eliminating bias in these systems.”