The trend of applying elements of computer games to nongame situations relies on two assumptions. One is that people are more likely to do something—shop for something, let’s say, or go to a bar, or share information with people at work—if they enjoy it. The other is that they’ll be more likely to enjoy anything if it’s part of a game.
But just because games can be fun doesn’t mean they are necessarily a good way to motivate someone to do something, says Judd Antin, an anthropologist and social psychologist who studies online experiences at Yahoo Research (and was one of this year’s TR35 winners). Gamification, especially in offices, can actually discourage some people, he says; understanding what works in different settings will be crucial if gamelike systems are to have any staying power. Antin spoke with Technology Review’s deputy editor, Brian Bergstein.
TR: Do you see a lot of evidence of companies that are installing gamelike programs in hopes of encouraging their workforces to do certain things?
Antin: There are lots of examples of this. I think it’s all driven by hype to a degree, and also by the idea that “what harm could it possibly do?” When I give talks about this, I have a slide with tiny print: “Why not gamify? Well, here are 50 reasons.” I can barely fit them on the slide.
What are some of the reasons?
From research in social psychology and behavioral economics, we know that the most likely outcome is that you'll motivate some people, you'll demotivate others, and for a third group there'll be no effect at all. And we won't really know who we're motivating more or less, or why. And we won't know what we're encouraging them to do. It could be that we're motivating shallow people who are interested in the quick dopamine reward of topping a leaderboard on a given day to [for example] post a thousand comments.
Why do some incentives actually demotivate some people?
There’s this really well-known phenomenon in behavioral economics and social psychology called crowding out. It is about the interaction between intrinsic and extrinsic rewards—if you love to do something and then I pay you to do it, it can become about the money, and not about the love, and ultimately your motivation is lower.
Think about Google Knol [an attempt to create an expert-written encyclopedia to rival Wikipedia]. That thing was destined for failure from the beginning, because they thought you could apply the market to anything. You want people to be motivated by tangible rewards, by monetary rewards, so you give them a cut of the advertising. But what you lose is all of the intrinsic motivation. You find that when you're in it for the money, you're not in it for the community as much.
Are you saying there are more reasons not to install some game program in an office than there are reasons to do it?
I don’t want to come across as a wet blanket. That’s an academic shtick, right? Take something that everybody thinks is cool and then talk about why it’s not cool. That’s not me. I believe this is a promising way of motivating people to collaborate more, and what we need is to have a more nuanced view of it. We can look more carefully and say, “Well, it’s not just that we want to motivate contributions; we want to motivate specific types of people who might respond to specific types of things to do specific behaviors. And then we want to reward them appropriately for the context they’re in.”
What’s an example of someone doing that correctly?
Wikipedia. I don’t know if you’re familiar with Wikipedia barnstars. Barnstars are badges that are given from one Wikipedian to another for doing work the community values. So they have this social context. The point is, if you get a badge, you should be proud of having gotten a badge. And because they have this social context, I believe that’s more motivational.
Another advantage of the barnstars would seem to be that you can't game the system. They're awarded for quality work, not quantity.
Exactly—you can’t fake it. The barnstar example is great for another reason, which is that if I want to be a member of the Wikipedia community and I don’t know how, one of the things I might do is look at the list of barnstars. It tells me something about what this community values.
So you’re not down on adding a “game layer” necessarily—you just want to see it done with more sophistication?
That’s right. For example, people have different dispositions, and we can measure them. Some people are more pro-social, which means they care about my rewards and your rewards. Some people are selfish—they care only about their own rewards, not mine. Some people are more competitive, meaning “it doesn’t matter about mine and yours—I want to maximize the difference.”
That’s what I mean about nuance. It’s not just this simple behaviorist-psychology idea, which is that you give people a reward just like you give a rat a piece of cheese. But what is the reward? Is it status? Is it reputation? Is it group identification, is it goal-seeking, or skill development? If you can identify those things, you can message people differently and cater to their being a pro-social or self-interested or competitive person, and maybe even get more involved and thoughtful participation out of people.