Facebook Says You Filter News More Than Its Algorithm Does
Ever wonder how much news you disagree with politically gets sorted out of your News Feed by Facebook's algorithm? Not much, the social network says.
Facebook studied millions of its most political users and determined that while its algorithm tweaks what you see most prominently in your feed, you’re the one really limiting how much news and opinion you take in from people of different political viewpoints.
In an effort to explore how people consume news shared by friends of different ideological leanings, Facebook’s researchers pored over millions of URLs shared by its U.S.-based users who identify themselves in their profiles as politically liberal or conservative. The work, which sheds more light on how we glean information from our ever-growing, technologically enhanced tangles of social connections, was published in a paper in Science on Thursday.
Eytan Bakshy, a research scientist on Facebook’s data science team and coauthor of the paper, says the group found that Facebook’s News Feed algorithm only slightly decreases users’ exposure to news shared by those with opposing viewpoints.
“In the end, we find individual choices, both in terms of who they choose to be friends with and what they select, matters more than the effect of algorithmic sorting,” he says.
The work comes more than three years after Bakshy and other researchers concluded that while you’re more likely to look at and share information with your closest connections, most of the information you get on Facebook stems from the web of people you’re weakly connected to—refuting the idea that online social networks create “filter bubbles” limiting what we see to what we want to see (see “What Facebook Knows”).
However, Bakshy says, the previous research, published in 2012, didn’t directly measure the extent to which you’re exposed to information from people whose ideological viewpoints are opposite from yours.
In an effort to sort that out, researchers looked at anonymized data for 10.1 million Facebook users who define themselves as liberal or conservative, along with seven million URLs for news stories shared on Facebook from July 7, 2014, to January 7, 2015. After using software to identify URLs for "hard" news stories (pieces focused on topics like national news and politics) that were shared by at least 20 users with a listed political affiliation, the researchers labeled each story as liberal, neutral, or conservative, depending on the average political leaning of those who shared it.
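The labeling step described above can be sketched roughly as follows. This is a hypothetical illustration, not the paper's actual code: the numeric scale (-1 for liberal, +1 for conservative) and the cutoff thresholds are assumptions made for clarity.

```python
# Hypothetical sketch: label a story by the average self-reported leaning
# of the users who shared it. Scale: -1 = liberal, 0 = neutral,
# +1 = conservative (assumed, not from the paper).

def label_story(sharer_leanings, min_sharers=20):
    """Return a story's alignment label, or None if too few affiliated sharers."""
    if len(sharer_leanings) < min_sharers:
        return None  # the study required at least 20 affiliated sharers
    avg = sum(sharer_leanings) / len(sharer_leanings)
    if avg < -0.25:  # thresholds here are illustrative only
        return "liberal"
    if avg > 0.25:
        return "conservative"
    return "neutral"
```

For example, a story shared by 15 liberals and 5 neutrals averages -0.75 and would be labeled liberal, while one shared by only 10 users would be left unclassified.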
Researchers found that 24 percent of the "hard" stories shared by liberal Facebook users' friends were conservatively aligned, while 35 percent of the "hard" stories shared by conservative Facebook users' friends were liberally aligned. Overall, that averages out to 29.5 percent exposure to content from the other side of the political spectrum.
The researchers also looked at the impact of Facebook’s News Feed ranking algorithm on the kind of news you see. Bakshy says that overall, the algorithm reduces users’ exposure to content from friends who have opposing viewpoints by less than 1 percentage point—from 29.5 percent to 28.9 percent.
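As a back-of-envelope check on the figures above, the 29.5 percent baseline is simply the average of the two groups' exposure rates, and the algorithm's effect is the small gap between that and the post-ranking figure:

```python
# Reproduce the arithmetic behind the reported exposure numbers.
exposure_from_friends = (24 + 35) / 2  # liberals' 24% and conservatives' 35%, averaged
after_ranking = 28.9                   # exposure after News Feed ranking, per the study

reduction = exposure_from_friends - after_ranking
print(exposure_from_friends)      # 29.5
print(round(reduction, 1))        # 0.6 percentage points
```

Note this averages the two groups equally for illustration; the paper's aggregate figure is reported directly, not derived here.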
And when it came down to what users ended up actually reading, researchers report that conservatives were 17 percent less likely to click on liberally aligned articles than other “hard” stories in their news feeds, while liberals were 6 percent less likely to click on conservatively aligned articles presented to them.
Sharad Goel, an assistant professor at Stanford who has studied filter bubbles, says people in the field have talked about this issue for several years but Facebook alone was in a position to explore it. He says one thing worth keeping in mind is that people may get their news from many sources, which can dwarf the impact of what they see on Facebook.
“I do agree with one of their main messages—that the algorithm itself is not driving a lot of polarization,” he says.