News Feed Filtering: A Threat to Online Political Expression?


In recent years, it has become conventional wisdom that social media sites like Facebook and Twitter serve as democratic platforms for citizens to express their political views and opinions. The key assumption underlying this idea is that these sites operate on a peer-to-peer sharing structure that allows anyone to broadcast their posts to a public audience—provided, of course, that these audience members are connected to them as “Friends,” “Followers,” etc. On this assumption, a social media user’s reach and influence (including political influence) is simply a matter of how many connections they have in their networks.

However, as Facebook has moved more and more towards a model of complex algorithmic news feed filtering, the notion that one’s posts will actually be seen by one’s peers is increasingly being called into question. This development, which is likely to become industry standard, may have important consequences for the use of social media as a platform of political expression.

At the moment, these sorts of concerns about Facebook are largely centered on the diminishing “organic reach” of organizational Facebook Pages, such as those of non-profit activist groups (in addition to businesses and brands). What is “organic reach,” you ask? As B. Traven explains in a great piece for ValleyWag,

Put simply, “organic reach” is the number of people who potentially could see any given Facebook post in their newsfeed. Long gone are the days when Facebook would simply show you everything that happened in your network in strict chronological order. Instead, algorithms filter the flood of updates, posts, photos, and stories down to the few that they calculate you would be most interested in. (Many people would agree that these algorithms are not very good, which is why Facebook is putting so much effort into refining them.) This means that even if I have, say, 400 friends, only a dozen or so might actually see any given thing I post. One way to measure your reach, then, is as the percentage of your total followers who (potentially) see each of your posts. This is the ratio that Facebook has more-or-less publicly admitted it is ramping down to a target range of 1-2% for Pages. In other words, even if an organization’s Page has 10,000 followers, any given item they post might only reach 100-200 of them.
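Traven’s arithmetic is easy to verify. As a quick illustration (in Python; the function and the 1-2% figures simply restate the ratio he describes, not anything Facebook publishes):

```python
def organic_reach(followers: int, reach_rate: float) -> int:
    """Estimate how many followers might actually see a given post."""
    return round(followers * reach_rate)

# Traven's example: a Page with 10,000 followers at Facebook's
# reported 1-2% target range reaches only 100-200 people per post.
print(organic_reach(10_000, 0.01))  # 100
print(organic_reach(10_000, 0.02))  # 200
```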

As Traven argues, this lowered capacity for political organizations to send messages to their own followers significantly compromises Facebook’s role as an open marketplace of political expression. The only way that organizations can truly bypass these filters is to pay Facebook to promote their posts, and while major commercial brands (and perhaps major party candidates) can afford the price, small non-profits and advocacy groups largely cannot. Thus, Facebook is starting to look less like a level playing field for political communication and more like television, where big-ticket ad buys dominate and the resource-rich get a much louder megaphone than the resource-poor.


However, I can’t help but think that this is just the tip of the iceberg. The impact of news feed filtering will likely extend far beyond the realm of organizational Facebook Pages, which are already experiencing the effects of a diminished “organic reach.” Now that we know that Facebook is making behind-the-scenes decisions about which posts you will see from your connections and which posts will be hidden from you, how else could this model end up skewing the flow of online political discourse?

To explore this question, it is helpful to understand exactly how Facebook currently filters its News Feeds. However, this is not as easy as it sounds. According to TechCrunch, the algorithm that sorts Facebook News Feeds uses over 100,000 different factors to determine the relevance of posts. Among the most important factors are the following, as told to TechCrunch by Facebook News Feed Director of Product Management Will Cathcart (a toy sketch of how such factors might combine into a single score follows the list):

• How popular (Liked, commented on, shared, clicked) are the post creator’s past posts with everyone
• How popular is this post with everyone who has already seen it
• How popular have the post creator’s past posts been with the viewer
• Does the type of post (status update, photo, video, link) match what types have been popular with the viewer in the past
• How recently was the post published
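Facebook has never published its actual formula, so any concrete rendering is guesswork; still, the list above implies something like a weighted scoring function. Here is a minimal Python sketch, where the feature names, the linear form, and especially the weights are my own illustrative assumptions, not the real algorithm:

```python
from dataclasses import dataclass

@dataclass
class PostFeatures:
    creator_global_engagement: float  # past popularity of creator's posts with everyone
    post_engagement_so_far: float     # popularity of this post with those who've seen it
    creator_viewer_affinity: float    # how this viewer engaged with this creator before
    type_match: float                 # does the post type match the viewer's preferences
    recency: float                    # freshness, e.g. decayed by hours since posting

def relevance_score(f: PostFeatures) -> float:
    """Illustrative linear combination of the five factors above.
    The weights are invented for demonstration only."""
    weights = (0.2, 0.3, 0.3, 0.1, 0.1)
    features = (f.creator_global_engagement, f.post_engagement_so_far,
                f.creator_viewer_affinity, f.type_match, f.recency)
    return sum(w * x for w, x in zip(weights, features))

post = PostFeatures(0.6, 0.8, 0.9, 1.0, 0.7)
print(relevance_score(post))  # ~0.80
```

In reality, the reported 100,000+ factors and machine-learned weights would replace this hand-tuned toy, but the basic shape (score every candidate story, then show the highest scorers) is the same.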

In addition, Facebook recently announced that it would push to prioritize “high-quality” content in users’ feeds and reduce the amount of spam posts. What constitutes “high quality” in Facebook’s eyes? Here is how the company defines it on its website:

While the goal of News Feed is to show high quality posts to people, we wanted to better understand what high quality means. To do this we decided to develop a new algorithm to factor into News Feed. To develop it, we first surveyed thousands of people to understand what factors make posts from Pages high quality. Some of the questions we asked included:
• Is this timely and relevant content?
• Is this content from a source you would trust?
• Would you share it with friends or recommend it to others?
• Is the content genuinely interesting to you or is it trying to game News Feed distribution? (e.g., asking for people to like the content)
• Would you call this a low quality post or meme?
• Would you complain about seeing this content in your News Feed?
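Facebook doesn’t say how those survey answers become a ranking signal, but the passage describes a standard supervised-learning setup: collect human quality labels, then train a model to predict them from post features. Here is a minimal sketch using scikit-learn, where the features, the data, and the choice of logistic regression are entirely my assumptions rather than anything Facebook has disclosed:

```python
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: each row is a post, with made-up features
# [source_trust, timeliness, like_bait_signals], and the label is whether
# survey respondents rated the post "high quality" (1) or not (0).
X_train = [
    [0.9, 0.8, 0.0],
    [0.2, 0.1, 0.9],
    [0.7, 0.9, 0.1],
    [0.1, 0.3, 0.8],
]
y_train = [1, 0, 1, 0]

model = LogisticRegression()
model.fit(X_train, y_train)

# Score a new post: estimated probability that it is "high quality".
print(model.predict_proba([[0.8, 0.7, 0.2]])[0][1])
```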

In other words, Facebook is ostensibly adopting a model of ‘giving people what they want,’ rather than making editorial decisions about quality based on their professional judgment. Indeed, in a piece for Fortune, Matthew Ingram writes that “Facebook’s director of news partnerships, Andy Mitchell, rejected the idea that the social platform is some kind of gatekeeper when it comes to the discovery of news, saying Facebook doesn’t control the news-feed — users control it by telling Facebook what they are interested in. In other words, Facebook sees itself as merely reflecting the desires of its users.” Critics, on the other hand, have pointed out that the process is still susceptible to censorship based on various national laws as well as the site’s own community standards.

However, an even bigger potential issue here is how news feed filtering may pave the way for Facebook (and social media sites more generally) to manipulate the flow of news and political expression along ideological lines. Surely, the company would vehemently deny doing this. Yet I couldn’t help but think of what the recent rainbow profile picture campaign celebrating gay rights suggests about Facebook’s capacity to elevate posts that display a particular political point of view. Did Facebook actually favor these rainbow picture posts in its News Feed algorithm, above and beyond the criteria outlined above? Honestly, I have no idea. Yet the very fact that the company itself orchestrated this particular meme on its own platform indicates that it is not entirely above the fray of partisan political battles and, furthermore, appears willing to leverage its prominent position in viral media culture to advance certain political ideas.

Since I strongly support the sentiment behind the rainbow profile pictures, it’s hardly a sticking point for me. However, the campaign seems to complicate Facebook’s insistence on being a fully neutral channel for political discourse with no vested interests beyond serving the most relevant content to its users. Could it be possible that the site would play with its algorithms to filter down the reach of posts that it deems inimical to its worldview, or boost the flow of posts that support issues favored by its leadership? This may sound like conspiracy theory territory, but Facebook has opened Pandora’s Box in a very real sense by moving to a model of news feed manipulation.

What, then, could be a solution to this looming threat? It seems unreasonable to argue that Facebook should simply go back to its original model of showing every piece of content from Friends in the order in which it was posted. As the company often points out, there are now so many people and organizations on Facebook that an average user has approximately 1,500 posts eligible to show up in their News Feed every day—i.e., far too many to wade through. Twitter, by contrast, has had a long-standing commitment to showing every tweet from a user’s connections in chronological order, but recently signaled, in comments reported by the New York Times, that it is “question[ing] our reverse chronological timeline” in order to stay competitive and profitable.

At a time of increasing news feed manipulation, what I’d like to see is not only more transparency about how these feeds are actually filtered (which could alleviate fears of ideological bias), but also more user choice about how the criteria are applied. In other words, Facebook users should be able to decide how much of their own connections’ content actually reaches them. Now, it is true that Facebook already allows users to control their feeds to some extent by hiding posts from specific “Friends.” However, these options could be greatly expanded. Want to see more posts from the political organization Pages you follow than just the default 1-2%? You should be able to choose that option with a simple click of a button. Want to see more political posts that diverge from your views, rather than just those the algorithm determines you’d be most likely to “Like”? You should be able to make that choice in your settings. Putting users in greater control of their own news feeds would go a long way towards mitigating concerns that social media sites like Facebook are becoming the new political media gatekeepers.
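To make that proposal concrete, here is one way such controls could work mechanically: user-chosen multipliers layered on top of whatever relevance score the platform computes, as in the toy scoring sketch earlier. Every name and number below is hypothetical; this illustrates the proposal, not any real Facebook setting:

```python
def adjusted_score(base_score: float, source: str,
                   user_boosts: dict[str, float]) -> float:
    """Apply a user-chosen multiplier to the platform's relevance score.
    A boost of 1.0 leaves the ranking alone; above 1.0 surfaces more
    posts from that source; below 1.0 surfaces fewer."""
    return base_score * user_boosts.get(source, 1.0)

# A user who wants far more than the default reach from the Pages
# they follow, and more viewpoint-diverse political posts:
my_boosts = {
    "followed_pages": 5.0,       # override the ~1-2% organic reach default
    "cross_ideology_news": 2.0,  # see more posts that diverge from my views
}
print(adjusted_score(0.4, "followed_pages", my_boosts))  # 2.0
```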
