Algorithm Awareness

How the news feed on Facebook decides what you get to see.

Increasingly, it is algorithms that choose which products to recommend to us and algorithms that decide whether we qualify for a new credit card. But these algorithms operate outside our perception. How does one begin to make sense of such hidden forces?


The question gained resonance recently when Facebook revealed a scientific study on “emotion contagion” that had been conducted through its news feed. The study showed that displaying fewer positive updates in people’s feeds led them to post fewer positive and more negative messages of their own. This result is interesting but disturbing, revealing the full power of Facebook’s algorithmic influence as well as its willingness to use it.

To explore the issue of algorithmic awareness, in 2013 three colleagues and I built a tool that helps people understand how their Facebook news feed works.

Using Facebook’s own programming interface, our tool displayed, on the left half of the screen, the stories that appeared in one’s news feed. On the right, users saw every story posted by their entire friend network—the unadulterated feed, free of algorithmic curation or manipulation.
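To make the two panels concrete, fetching both views amounts to a handful of calls against the Graph API of that era. The sketch below is a minimal illustration in Python, not the tool’s actual code: the endpoint names (`me/home` for the curated feed, per-friend `/posts` for the raw one), the `read_stream` permission, and the access token are assumptions based on the since-retired 2013-era API.

```python
import requests

GRAPH = "https://graph.facebook.com"  # Graph API v1.x (2013 era, since retired)
TOKEN = "USER_ACCESS_TOKEN"           # hypothetical token with read_stream permission


def fetch_posts(path):
    """Fetch one page of posts from a Graph API edge."""
    resp = requests.get(f"{GRAPH}/{path}",
                        params={"access_token": TOKEN, "limit": 100})
    resp.raise_for_status()
    return resp.json()["data"]


# Left panel: the curated feed -- the stories the algorithm chose to show.
shown = fetch_posts("me/home")

# Right panel: the unadulterated feed -- every post by every friend.
friends = fetch_posts("me/friends")
raw = [p for f in friends for p in fetch_posts(f"{f['id']}/posts")]
```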

A third panel showed which friends’ posts were predominantly hidden and which appeared most often. Finally, the tool let users choose by hand which posts to keep and which to discard.
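That third panel then reduces to simple set arithmetic on post IDs: any post present in the raw feed but absent from the curated one was hidden. Continuing the sketch above (again an illustration of the idea, not the study’s implementation):

```python
from collections import Counter

# A post in the raw feed that never made the curated feed was filtered out.
shown_ids = {p["id"] for p in shown}
hidden = [p for p in raw if p["id"] not in shown_ids]

# Tally per friend to see whose voices the algorithm surfaces or mutes.
hidden_by_friend = Counter(p["from"]["name"] for p in hidden)
shown_by_friend = Counter(p["from"]["name"] for p in shown)

print("Most often hidden:", hidden_by_friend.most_common(5))
print("Most often shown:", shown_by_friend.most_common(5))
```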

We recruited 40 people—a small sample but one closely representative of the demographics of the U.S.—to participate in a study to see how they made sense of their news feed. Some were shocked to learn that their feed was manipulated at all. But by the end of our study, as participants chose what posts they wanted to see, they found value in the feed they curated.

When we followed up months later, many said they felt empowered. Some had changed their Facebook settings so they could shape the feed themselves. Of the 40 participants, one quit Facebook altogether because the curation violated their expectations of how a feed should work.

The public outcry over Facebook’s emotion study showed that few people truly grasp the way algorithms shape the world we experience. And our research shows the importance of empowering people to take control of that experience.

We deserve to understand the power that algorithms hold over us, for better or worse.

Karrie Karahalios is an associate professor of computer science at the University of Illinois.
