On your way to this article, you probably took part in several experiments. You may have helped a search engine test a new way of displaying its results or an online retailer fine-tune an algorithm for recommending products. You may even have helped a news website decide which of two headlines readers are most likely to click on.
In other words, whether you realize it or not, the Web is already a gigantic, nonstop user-testing laboratory. Experimentation offers companies a powerful way to understand what customers want and how they are likely to behave, but it also seems that few people realize quite how much of it is going on.
This became clear in June, when Facebook experienced a backlash after publishing a study on the way negative emotions can spread across its network. The study, conducted by a team of internal researchers and academics, involved showing some people more negative posts than they would otherwise have seen, and then measuring how this affected their behavior. Those users tended to post more negative content themselves, revealing a kind of “emotional contagion” (see “Facebook’s Emotion Study Is Just Its Latest Effort to Prod Users”).
Businesses have performed market research and other small experiments for years, but the practice has reached new levels of sophistication and complexity, largely because it is so easy to control the user experience on the Web, and then track how people’s behavior changes (see “What Facebook Knows”).
So companies with large numbers of users routinely tweak the information some of them see, and measure the resulting effect on their behavior—a practice known in the industry as A/B testing. Next time you see a credit card offer, for example, you might be one of a small group of users selected at random to see a new design. Or when you log onto Gmail, you may be one of a chosen subset that gets to use a new feature developed by Google’s engineers (see “Seeking Edge, Websites Turn to Experiments”).
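The random selection described above is typically deterministic under the hood: a site hashes each user’s ID into a bucket so the same person always sees the same variant on repeat visits. Here is a minimal sketch of that idea in Python — the function name, bucket count, and 50/50 split are illustrative assumptions, not any particular company’s system:

```python
import hashlib

def assign_variant(user_id: str, experiment: str, treatment_fraction: float = 0.5) -> str:
    """Deterministically assign a user to 'A' (control) or 'B' (treatment).

    Hashing the user ID together with the experiment name gives each user
    a stable bucket per experiment, so repeat visits show the same variant
    and different experiments split users independently.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 10000  # 10,000 buckets allow fine-grained rollouts
    return "B" if bucket < treatment_fraction * 10000 else "A"

# The same user always lands in the same group for a given experiment:
assert assign_variant("user42", "new_card_design") == assign_variant("user42", "new_card_design")
```

Because assignment depends only on the user ID and experiment name, no per-user state needs to be stored, and the company can later compare metrics (clicks, sign-ups, revenue) between the two groups.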
“When doing things online, there’s a very large probability you’re going to be involved in multiple experiments every day,” Sinan Aral, a professor at MIT’s Sloan School of Management, said during a break at a conference for practitioners of large-scale user experiments last weekend in Cambridge, Massachusetts. “Look at Google, Amazon, eBay, Airbnb, Facebook—all of these businesses run hundreds of experiments, and they also account for a large proportion of Web traffic.”
At the Sloan conference, Ron Kohavi, general manager of the analysis and experimentation team at Microsoft, said each time someone uses the company’s search engine, Bing, he or she is probably involved in around 300 experiments. The insights that designers, engineers, and product managers can glean from these experiments can be worth millions of dollars in advertising revenue, Kohavi said.
Kohavi’s group has developed a platform to allow other parts of the company to perform their own user experiments. The company’s flagship productivity software, Office, would likely benefit from more user experimentation, he said.
Facebook’s emotion study, published in the Proceedings of the National Academy of Sciences, went further than most everyday Web experiments, which influence people’s behavior in subtle ways and measure only minuscule differences. But MIT’s Aral notes that eliciting an emotional response does not make an experiment unethical. “I was very surprised to see that people were upset about that,” he said, pointing out that many television ads and newspaper headlines are arguably just as emotionally manipulative.
Yet, perhaps just as importantly, the Facebook study may also have revealed how few people realize they are being prodded and probed at all.
“I found the backlash quite paradoxical,” said Alessandro Acquisti, a professor at Carnegie Mellon University who studies attitudes towards privacy and security. Although he thinks user experiments need to be conducted with care and oversight, he felt the Facebook study was actually remarkably transparent. “If you’re really worried about experimentation, you should look at how it’s being used opaquely every day you go online,” Acquisti said.
Some practitioners say experimenters need to think carefully about how they present their work to users. Duncan Watts, a principal researcher at Microsoft (and previously a professor of sociology at Columbia University), said this was a problem with the Facebook study. “When people hear the word ‘experiment’ and they hear the word ‘manipulation’ they think of lab rats,” he said. “Inside this community we have a very different interpretation. We think of a systematic test of a hypothesis by randomizing assignment to different treatment conditions.”
To assuage concerns among its users, Facebook said this month that it would introduce a new process for reviewing potentially sensitive research, although it did not specify what that process would involve.
But even if experiments continue to have only subtle effects, some may find their scope, scale, and growing sophistication unsettling. “What’s happened in the last few years—and this to me is crucial—is that it’s becoming very specific to a person based on their personal information,” said Acquisti. “It’s becoming ubiquitous, and it’s becoming much more measurable.”