Facebook Tweaks Its News Feed Algorithm, and the Winner Is: Facebook

The company says its News Feed is meant to “inform,” but that can sometimes be at odds with what users find most engaging.

Facebook announced today that it’s giving the power back to the people. Well, sort of.

In a post called “Building a Better News Feed for You,” Facebook said it is tweaking the algorithm that controls what posts you see, weighting it in favor of things your friends and family are sharing, as opposed to things that publishers like BuzzFeed or the New York Times (or Technology Review!) are sharing on their Facebook pages. The post then goes on to lay out Facebook’s “News Feed Values,” making a big deal of the idea that the company is interested in “informing” people and “being inclusive of all perspectives.”

But there are some fundamental contradictions that Facebook leaves unresolved, and they speak volumes about the company’s motivations.

First, there’s this sentence:

Our integrity depends on being inclusive of all perspectives and viewpoints, and using ranking to connect people with the stories and sources they find the most meaningful and engaging.

That sounds good, but what happens when users themselves aren’t inclusive of all perspectives and viewpoints? In that case, what they find “meaningful and engaging” will, by definition, conflict with “inclusive of all perspectives.” This tension has been raised before—indeed, the so-called “echo chamber” effect has been well studied by Facebook’s own employees.

But when faced with the choice between “inclusive” and “engaging,” the company will pick the latter every time. This isn’t a nefarious choice; it’s just business. Facebook generates value based on access to people’s attention. If people pay more attention to their feeds, Facebook can charge more for ads.

The thing is, the company has become so large that its News Feed is, like it or not, among the most influential media outlets in history. It has a profound effect on how its 1.6 billion users get information—and indeed, the company lists one of its News Feed Values as “Your feed should inform.” But here’s its explanation of what that means:

... this could be a post about a current event, a story about your favorite celebrity, a piece of local news, or a recipe. We’re always working to better understand what is interesting and informative to you personally, so those stories appear higher up in your feed.

In other words, “informing” means “telling people whatever they will pay attention to.” This makes sense: Facebook is in the business of maximizing attention paid to the site. It wants a sticky experience. And as the number of original posts has gone down of late and other attention hogs like Snapchat have come on the scene, Facebook has found itself with stiff competition.

So it appears Facebook’s thinking is that people pay more attention to their feeds when they are served posts from friends and family than when they get stories from publishers. This may be bad news for publishers (The Verge told readers “Bye!” in its story about the algorithm tweak today). They already took a sharp hit in traffic when Facebook performed a similar algorithmic tweak earlier this year.

But Facebook doesn’t care much about publishers (even those to whom it pays millions of dollars for content), and why should it? Facebook controls the audience. People will come to Facebook, no matter what.

The question, then, is: what is Facebook’s responsibility as a media powerhouse? The company’s statements today suggest that even though it wields enough power to, for example, influence voter turnout in an election, it has very little interest in thinking of itself as a news organization.

Once again, that’s fine insofar as Facebook is a place for people to go and check in on their friends, their family, and a bunch of random people they once met at a party. But the company has become perhaps the largest single distributor of information on the planet, and it is opting to show people what they like, rather than tackling the much tougher question of what, perhaps, they should see.

(Read more: PressThink, New York Times, Recode, Financial Times, “What Facebook Knows”)
