Algorithm Awareness
Increasingly, it is algorithms that choose which products to recommend to us and algorithms that decide whether we should receive a new credit card. But these algorithms operate outside our perception. How does one begin to make sense of these mysterious hidden forces?

The question gained resonance recently when Facebook revealed a scientific study on “emotion contagion” that had been conducted by means of its news feed. The study showed that displaying fewer positive updates in people’s feeds causes them to post fewer positive and more negative messages of their own. This result is interesting but disturbing, revealing the full power of Facebook’s algorithmic influence as well as its willingness to use it.
To explore the issue of algorithmic awareness, in 2013 three colleagues and I built a tool that helps people understand how their Facebook news feed works.
Using Facebook’s own programming interface, our tool displayed a list of stories that appeared on one’s news feed on the left half of the screen. On the right, users saw a list of stories posted by their entire friend network—that is, they saw the unadulterated feed with no algorithmic curation or manipulation.
A third panel showed which friends’ posts were predominantly hidden and which friends’ posts appeared most often. Finally, the tool allowed users to manually choose which posts they desired to see and which posts they wanted to discard.
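The core mechanic behind that third panel is a comparison: take the unfiltered stream of friends' posts, subtract the posts the curated feed actually showed, and tally what each friend had hidden. A minimal sketch of that idea, using made-up data structures (the function name, the `(post_id, friend)` tuples, and the sample posts are illustrative assumptions, not Facebook's actual API):

```python
from collections import Counter

def hidden_posts_by_friend(full_feed, curated_feed):
    """Count, per friend, how many of their posts the curation hid.

    Both feeds are lists of (post_id, friend) tuples -- an
    illustrative shape, not what Facebook's interface returns.
    """
    shown_ids = {post_id for post_id, _ in curated_feed}
    # A post is "hidden" if it is in the full feed but never shown.
    return Counter(
        friend for post_id, friend in full_feed
        if post_id not in shown_ids
    )

# Example: Bob posted twice but only one post was shown.
full = [(1, "Ann"), (2, "Bob"), (3, "Bob")]
curated = [(1, "Ann"), (2, "Bob")]
print(hidden_posts_by_friend(full, curated))  # Counter({'Bob': 1})
```

Ranking friends by that count is what lets a user see at a glance whose voices the algorithm amplifies and whose it mutes.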
We recruited 40 people—a small sample but one closely representative of the demographics of the U.S.—to participate in a study to see how they made sense of their news feed. Some were shocked to learn that their feed was manipulated at all. But by the end of our study, as participants chose what posts they wanted to see, they found value in the feed they curated.
When we followed up months later, many said they felt empowered. Some had changed their Facebook settings so they could manipulate the feed themselves. Of the 40 participants, one person quit using Facebook altogether because the curation violated their expectations of how a feed should work.
The public outcry over Facebook’s emotion study showed that few people truly grasp the way algorithms shape the world we experience. And our research shows the importance of empowering people to take control of that experience.
We deserve to understand the power that algorithms hold over us, for better or worse.
Karrie Karahalios is an associate professor of computer science at the University of Illinois.