
How Bets Among Employees Can Guide a Company's Future

Internal prediction markets enable colleagues to wager on the fate of crucial projects and the success of products in the pipeline.
December 9, 2010

The need to predict the future, as exciting as it sounds, crops up in corporate life in terribly mundane ways. Case in point: large video-game companies need to know where to put their marketing dollars many months before they complete their games. Inevitably, some games will be stinkers, hardly worth the investment of an ad campaign. But how do you know which ones?

Crowdcasting: In an internal prediction market, employees with inside knowledge of projects bet play money anonymously. In this example, employees wager on whether they believe a team will reach a delivery milestone for a software project.

Here’s how one very large video-game company used to guess the answer: its marketing people would predict the score their games in progress would garner on the website Metacritic, which aggregates game reviews. But why would the marketing people know more than the game’s developers?

Three years ago, a startup called Crowdcast suggested a different tactic. Why not take hundreds of your lowliest employees, the ones in the trenches who are actually making and testing these games, and ask them what they think the Metacritic scores will be? Better yet, why not give them each $10,000 in play money and ask them to bet on the outcome? Let them accumulate a pot of pretend wealth if they’re right. Turn game marketing prediction into, well, a game.

To the executives’ delight, the employees’ Metacritic predictions turned out to be 32 percent more accurate than the marketing department’s. More disturbingly, accuracy was inversely proportional to a player’s place in the hierarchy. The closer you got to the C-suite, the less of a clue you had—and the lower your pretend wealth in Crowdcast’s game.

That embarrassing factoid might explain why this particular video-game company, like many Crowdcast customers, wants such stories to remain anonymous. “It’s kind of experimental,” explains Mat Fogarty, Crowdcast’s sardonic British CEO, “and it may undermine the credibility of their awesome management.”

Indeed, anonymity and uncomfortable revelations in the boardroom are Crowdcast’s stock in trade. The San Francisco startup already boasts clients and partners as diverse as Hallmark, Hershey’s, and Harvard Business School. It is built on the back of years of research into how internal prediction markets work. In such a market, managers ask employees questions about the future of their product and let them bet on the answers, without knowing who bet what. The results can be scarily accurate.
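
How might such a market work under the hood? The article doesn’t describe Crowdcast’s mechanics, but the standard engine for play-money markets of this kind is Robin Hanson’s logarithmic market scoring rule (LMSR), in which an automated market maker quotes prices that double as probability estimates. Here is a minimal sketch in Python (the names and parameters are illustrative, not Crowdcast’s code):

    import math

    class LMSRMarket:
        """Toy play-money market using Hanson's logarithmic market
        scoring rule (LMSR); one market per question, such as
        'Will the team hit the March delivery milestone?'"""

        def __init__(self, outcomes, liquidity=100.0):
            self.b = liquidity                        # higher b = prices move more slowly
            self.shares = {o: 0.0 for o in outcomes}  # net shares sold per outcome

        def _cost(self, shares):
            # Market maker's cost function: C(q) = b * ln(sum_i exp(q_i / b))
            return self.b * math.log(
                sum(math.exp(q / self.b) for q in shares.values()))

        def price(self, outcome):
            # The instantaneous price doubles as the crowd's probability estimate.
            total = sum(math.exp(q / self.b) for q in self.shares.values())
            return math.exp(self.shares[outcome] / self.b) / total

        def buy(self, outcome, quantity):
            # A trader pays C(after) - C(before) in play money for shares
            # that each pay out one play-dollar if the outcome comes true.
            before = self._cost(self.shares)
            self.shares[outcome] += quantity
            return self._cost(self.shares) - before

    market = LMSRMarket(["yes", "no"])
    paid = market.buy("yes", 50)  # a tester backs the milestone
    print(f"paid {paid:.1f}; market now says P(yes) = {market.price('yes'):.2f}")

The quoted price for “yes” is the crowd’s implied probability that the milestone will be hit; every trade nudges it, and a trader profits only if the eventual outcome vindicates the bet.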

In September, Crowdcast ran a prediction market for a large American car company, one that normally runs its designs for new autos through a car clinic—a lengthy and expensive kind of focus group of buyers. Crowdcast’s project involved asking engineers and factory supervisors what they thought the outcome of the car clinic would be.

Forty questions were in front of these ground-floor experts at any given time during the market. For example: What percentage of buyers will list this car’s dashboard as its most important feature? The trial market was so accurate that the car company will be trying another in January. The auto giant now has a new way to cut costs: use these prediction markets instead of expensive car clinics at least a third of the time.
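
A question like the dashboard one has no yes/no answer. One common encoding (an assumption here; the article doesn’t say how Crowdcast structures its questions) is to split the percentage range into buckets and run a multi-outcome market over them, which the LMSR sketch above already handles:

    buckets = ["0-10%", "10-20%", "20-30%", "30-40%", "over 40%"]
    dashboard = LMSRMarket(buckets, liquidity=50.0)

    # A factory supervisor who expects a mid-range answer backs that bucket.
    dashboard.buy("20-30%", 30)

    # The bucket prices now form a probability distribution over
    # possible car-clinic outcomes.
    print({b: round(dashboard.price(b), 2) for b in buckets})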

Crowdcast calls its field “social business intelligence” rather than crowdsourcing. “Within your organization, there are people who know true future outcomes and metrics,” says Leslie Fine, Crowdcast’s chief scientist. “When is your product going to ship? How well will it do? Normally, crowdsourcing asks for creative content. We’re asking for quantitative opinions.”

Put that way, it sounds a lot more respectable than “get your employees to play a kind of fantasy football with sales and shipping dates.” But make no mistake—that’s actually what Crowdcast does. That used to be a hard sell, Fogarty admits: “It seemed weird to be talking about playing games at work and using Monopoly money.”

But the gaming approach has gotten easier as more executives have heard about large-scale prediction markets like InTrade, which accurately forecast the results of the 2008 and 2010 elections, and the Hollywood Stock Exchange, which predicts the success or failure of major movies. The current craze for “game dynamics” in apps like Foursquare and Scvngr, which let users rack up points for various tasks, also helped drive the idea home. Crowdcast differs from other prediction markets, however, because users don’t get to bet on any outcome they can name. Instead, the company runs closed markets where top executives get to pose the questions.

Part of what’s fascinating about Crowdcast’s approach is how wildly inequitable it is—much like capitalism itself. The democratic, politically correct thing to do would have been to hand those auto employees survey forms, and count all their voices equally. But that would not have given the more prescient ones a louder voice. “We’re trying to create a meritocracy of information,” says Fine, who spent more than a decade studying prediction markets at HP Labs and holds several patents in the field. In theory, if you make bad bets, you go bankrupt. (In practice, these virtual bankruptcies happen rarely, and Fine can provide a back-end bailout—say, by giving every employee an extra $10,000.)
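
Under the same illustrative assumptions as the sketches above, the bankruptcy-and-bailout bookkeeping is simple: each account starts with the $10,000 stake, losing bets drain it, and the back-end bailout is a top-up for everyone:

    class Account:
        """Anonymous play-money account with the $10,000 starting stake."""
        def __init__(self, stake=10_000.0):
            self.balance = stake

        def bet(self, market, outcome, quantity):
            cost = market.buy(outcome, quantity)
            if cost > self.balance:             # would-be bankruptcy
                market.buy(outcome, -quantity)  # unwind the trade
                raise ValueError("insufficient play money")
            self.balance -= cost
            return cost

    def bailout(accounts, top_up=10_000.0):
        # "Giving every employee an extra $10,000," as Fine describes it.
        for account in accounts:
            account.balance += top_up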

What crowdcasting proves is that even play money talks—and losers walk. Studies show there’s no practical difference between using real money and using play money—both represent a person’s intention. It turns out you put your money where your mouth is, even if it’s Monopoly money. And as much as the best players like racking up millions of fake dollars, here’s the answer participants most frequently give as the reason they enjoy the game: “I believe management is listening to me through this tool.”

Management, however, doesn’t always like what it’s hearing. The biggest blow of Crowdcast’s young life came when it ran a trial market for a consumer goods company, one that makes a popular household lubricant. The market was asked about sales figures, new customer acquisition, and the price of oil (vital for lubricants) at the end of the month. In every metric, the market was more accurate than the company’s official forecast. “We nailed it,” says Fine. After presenting their results, “we were high-fiving each other.”

But Crowdcast didn’t win the contract—because it had failed to connect with the head of sales, a 20-year veteran of the company who simply ignored the evidence. As much as it believes math should win any argument, the startup is learning the importance of the personal touch. If the marketing department says a product will ship on time, but engineering is more bearish, some boardrooms may prefer to cling to the marketing fantasy. Crowdcast’s next task, therefore, is figuring out how to make an eminently disruptive tool look less threatening to “awesome management.”
