Building Crowds of Humans into Software
By farming out difficult tasks such as understanding speech or images to crowds of people, software could enable smarter apps.
Enabling software to punt its toughest tasks to humans should result in smarter mobile apps and other programs, say the founders of the newly launched company MobileWorks. The startup makes it possible for programmers to build human intelligence into their software using crowdsourcing—the practice of parceling out relatively small parts of a larger problem to many different people over the Web.
Sites such as Amazon Mechanical Turk already provide a place to post tasks to be solved by a crowd of anonymous workers, paid small amounts for each task they complete. But Anand Kulkarni, one of MobileWorks’s three founders, says that Amazon’s service and others are too inaccurate and slow to be built into software that needs to solve problems with a quick turnaround.
“Crowdsourcing is attractive because computers are much worse than humans at some tasks,” says Kulkarni, “but what is out there today is not giving us the full potential of having a human inside your computer program.” Many of MobileWorks’s ideas originated at the University of California, Berkeley, where Kulkarni used to research crowdsourcing and its potential to solve immediate problems, such as robot navigation, that are challenging for software. “A task like that, where you need an answer in real time, could not be solved by Mechanical Turk because it does not behave like a computer,” says Kulkarni. “It can take days to get an answer back, and it may be wrong.”
MobileWorks can take on such tasks, he says. Existing crowdsourcing services involve a person filling out an online form to specify a task to be completed. By contrast, MobileWorks takes on jobs sent in by software using application programming interfaces (APIs), which allow one piece of software to tap into another. MobileWorks’s software translates the job sent in over its APIs into tasks distributed to the company’s crowd of workers. The results are then collated and sent back to the software that made the request, which behaves as if it got the answer from another piece of software, not a crowd of humans. “It’s a black box for human intelligence,” says Kulkarni. “Software can treat us like another piece of software with the intelligence of a human.”
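The "black box" pattern Kulkarni describes can be sketched in a few lines. This is an illustrative assumption of how such an API might look, not MobileWorks's actual interface: the calling code invokes what appears to be an ordinary function, while behind it the request is serialized, answered by the crowd, and deserialized.

```python
# Hypothetical sketch of the "human intelligence as an API" pattern
# described in the article. The function names, request format, and
# response format are illustrative assumptions, not MobileWorks's real API.

import json

def submit_task(task_type, payload, post):
    """Send a job to a crowdsourcing service and return the crowd's answer.

    `post` is injected so this sketch runs offline; in real use it would
    be an HTTP POST to the service's endpoint, blocking until workers reply.
    """
    request = json.dumps({"type": task_type, "payload": payload})
    response = post(request)
    return json.loads(response)["answer"]

# Offline stand-in for the crowd: pretend a worker transcribed a note.
def fake_crowd(request):
    job = json.loads(request)
    assert job["type"] == "handwriting_transcription"
    return json.dumps({"answer": "Meet at noon on Tuesday"})

# Calling code treats the crowd like any other piece of software.
text = submit_task("handwriting_transcription",
                   {"image_url": "http://example.com/note.jpg"},
                   post=fake_crowd)
print(text)  # -> Meet at noon on Tuesday
```

From the caller's side there is no visible difference between this and a call to, say, an OCR library, which is exactly the point Kulkarni makes.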
MobileWorks has so far created dedicated APIs to extract data from Web pages or transcribe handwriting into text. Kulkarni says the company can also “push the limits” of crowdsourcing and tackle tasks such as speech transcription or image processing in real time. Such requests are flagged as needing rapid answers. MobileWorks’s software pushes those requests ahead of others and will call on extra workers by text message if the current number online is not sufficient.
The company’s workers have been recruited from the developing world, including people from the slums of Mumbai, India, and a group of housewives in Rewadi, near Delhi, India. One-third of those workers contribute using mobile phones. “Our crowd is not anonymous, and we sometimes call and talk with them directly,” says Kulkarni. “You get a workforce that is loyal and motivated that way.” That, along with a commitment to paying reasonable wages, ensures reliable performance, he says, and provides work to motivated people stuck in places with scant other opportunities.
In a test that used both MobileWorks and Mechanical Turk to find e-mail addresses on a Web page, MobileWorks won, its founders claim. Mechanical Turk provided answers in 40 minutes, but only half were correct; MobileWorks returned fully correct answers in under a minute.
“It’s exciting that MobileWorks is taking a trend that’s happening in research and making it more widely available,” says Michael Bernstein, who researches crowdsourcing at MIT’s Computer Science and Artificial Intelligence Laboratory and last year developed Soylent, a word processor that can, for example, shorten a selected sentence by tapping into Mechanical Turk. Although a growing number of academics are working on ways to build crowds into software, the strategy hasn’t been used commercially.
Bernstein adds that the startup’s hands-on attitude to its workforce, and ability to use text messaging to tell them when their help is needed, also sets MobileWorks apart. “The ability to spin up more workers as you need them is very powerful,” says Bernstein. “On Mechanical Turk, your tasks can just stall because not enough people choose to work on them.” Amazon’s crowdsourcing platform has also earned such a reputation for unreliable quality, says Bernstein, that researchers like him typically send each task to Mechanical Turk three to five times to be sure of a good answer.
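The redundancy trick Bernstein mentions, sending the same task out three to five times and keeping the consensus, amounts to a majority vote over worker answers. A minimal sketch, with workers simulated locally rather than dispatched to a real crowdsourcing API:

```python
# Sketch of the redundancy strategy researchers use on Mechanical Turk:
# send one task to several workers and keep the most common answer.
# Worker behavior is simulated here for illustration.

from collections import Counter

def majority_answer(task, workers):
    """Ask every worker the same question; return the top reply and its agreement."""
    answers = [worker(task) for worker in workers]
    answer, count = Counter(answers).most_common(1)[0]
    return answer, count / len(answers)

# Three simulated workers; one makes a mistake.
workers = [
    lambda t: "alice@example.com",
    lambda t: "alice@example.com",
    lambda t: "alice@exampel.com",  # typo from a careless worker
]

best, agreement = majority_answer("Find the e-mail address on this page", workers)
print(best)  # the majority answer survives one bad worker
```

The cost of this approach is paying for every task several times over, which is part of why a more reliable crowd is attractive.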