Facebook announced today that it’s giving the power back to the people. Well, sort of.
In a post called “Building a Better News Feed for You,” Facebook said it is tweaking the algorithm that controls what posts you see, weighting it in favor of things your friends and family are sharing, as opposed to things that publishers like BuzzFeed or the New York Times (or Technology Review!) are sharing on their Facebook pages. The post then goes on to lay out Facebook’s “News Feed Values,” making a big deal of the idea that the company is interested in “informing” people and “being inclusive of all perspectives.”
But there are some fundamental contradictions that Facebook leaves unresolved, and they speak volumes about the company’s motivations.
First, there’s this sentence:
Our integrity depends on being inclusive of all perspectives and viewpoints, and using ranking to connect people with the stories and sources they find the most meaningful and engaging.
That sounds good, but what happens when the users themselves aren’t inclusive of all perspectives and viewpoints? “Meaningful and engaging” would by definition contradict “inclusive of all perspectives.” This is a tension that has been raised before—the so-called “echo chamber” effect has been well-studied by Facebook employees, in fact.
But when faced with the choice between “inclusive” and “engaging,” the company will pick the latter every time. This isn’t a nefarious choice; it’s just business. Facebook generates value based on access to people’s attention. If people pay more attention to their feeds, Facebook can charge more for ads.
The thing is, the company has become so large that its News Feed is, like it or not, among the most influential media outlets in history. It has a profound effect on how its 1.6 billion users get information—and indeed, the company lists one of its News Feed Values as “Your feed should inform.” But here’s its explanation of what that means:
... this could be a post about a current event, a story about your favorite celebrity, a piece of local news, or a recipe. We’re always working to better understand what is interesting and informative to you personally, so those stories appear higher up in your feed.
In other words, “informing” means “telling people whatever they will pay attention to.” This makes sense: Facebook is in the business of maximizing attention paid to the site. It wants a sticky experience. And as the number of original posts has gone down of late and other attention hogs like Snapchat have come on the scene, Facebook has found itself with stiff competition.
So it appears Facebook’s thinking is that people pay more attention to their feeds when they are served posts from friends and family than when they get stories from publishers. This is bad news for publishers (The Verge told readers “Bye!” in its story about today’s algorithm tweak), who already took a sharp hit in traffic when Facebook made a similar change earlier this year.
But Facebook doesn’t care much about publishers (even those to whom it pays millions of dollars for content), and why should it? Facebook controls the audience. People will come to Facebook, no matter what.
The question, then, is: what is Facebook’s responsibility as a media powerhouse? The company’s statements today suggest that even though it wields enough power to, for example, influence voter turnout in an election, it has little interest in thinking of itself as a news organization.
Once again, that’s fine insofar as Facebook is a place for people to go and check in on their friends, their family, and a bunch of random people they once met at a party. But the company has become perhaps the largest single distributor of information on the planet, and it is opting to show people what they like, rather than tackling the much tougher question of what, perhaps, they should see.