
The company’s workers have been recruited from the developing world, including people from the slums of Mumbai and a group of housewives in Rewadi, near Delhi, India. One-third of those workers contribute using mobile phones. “Our crowd is not anonymous, and we sometimes call and talk with them directly,” says Kulkarni. “You get a workforce that is loyal and motivated that way.” That, along with a commitment to paying reasonable wages, ensures reliable performance, he says, and provides work to motivated people stuck in places with scant other opportunities.

In a test that used both MobileWorks and Mechanical Turk to find e-mail addresses on a Web page, MobileWorks won, its founders claim. Mechanical Turk returned answers in 40 minutes, and only half of them were correct; MobileWorks returned fully correct answers in under a minute.

“It’s exciting that MobileWorks is taking a trend that’s happening in research and making it more widely available,” says Michael Bernstein, who researches crowdsourcing at MIT’s Computer Science and Artificial Intelligence Laboratory and last year developed Soylent, a word processor that can do things like shorten a selected sentence by tapping into Mechanical Turk. Although growing numbers of academics are working on ways to build crowds into software, the strategy hasn’t been used commercially.

Bernstein adds that the startup’s hands-on attitude to its workforce, and ability to use text messaging to tell them when their help is needed, also sets MobileWorks apart. “The ability to spin up more workers as you need them is very powerful,” says Bernstein. “On Mechanical Turk, your tasks can just stall because not enough people choose to work on them.” Amazon’s crowdsourcing platform has also earned such a reputation for unreliable quality, says Bernstein, that researchers like him typically send each task to Mechanical Turk three to five times to be sure of a good answer.
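The redundancy workaround Bernstein describes, sending the same task out several times and trusting the answer most workers agree on, can be sketched in a few lines. This is an illustrative sketch only; the function name and the three-copy threshold are assumptions, not code from MobileWorks or Amazon.

```python
from collections import Counter

def aggregate_redundant_answers(answers, min_copies=3):
    """Majority vote over redundant submissions of one crowdsourced task.

    answers: the list of worker answers collected for a single task
             (e.g., the 3-5 copies a researcher posts to Mechanical Turk).
    Returns the most common answer, or None if too few copies came back
    (the "stalled task" case, where not enough workers picked it up).
    """
    if len(answers) < min_copies:
        return None
    winner, _votes = Counter(answers).most_common(1)[0]
    return winner

# Five workers transcribe the same e-mail address; one makes a typo.
result = aggregate_redundant_answers(
    ["a@example.com", "a@example.com", "a@example.co",
     "a@example.com", "a@example.com"]
)
print(result)  # the majority answer wins: a@example.com
```

The vote absorbs occasional bad answers, but at the cost Bernstein implies: every task is paid for three to five times over.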
