
What is the problem?

The problem that our campaign, Publicize the Truth, is determined to address is how Facebook's invisible filtering process contributes to the polarization of users' views, and how this polarization is degrading the level of civil discourse on the networking site.

 

Like most other social media sites, Facebook uses an algorithm to decide which stories are displayed in users' News Feeds. It does this in order to present users with posts they would find personally interesting. In all, the algorithm takes 57 different components into account. The most influential determinants are the amount of virtual interaction you have with "friends," the posts you have clicked on in the recent past, the time you spend reading them, and the medium they are in (photo, video, or article).
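Facebook's actual formula is proprietary, but the factors described above can be sketched as a toy scoring function. All of the weights, feature names, and numbers below are illustrative assumptions, not Facebook's real components:

```python
# Toy sketch of an interaction-weighted feed ranking.
# Every weight and feature here is an illustrative assumption;
# Facebook's real 57-component formula is not public.

MEDIUM_WEIGHT = {"photo": 1.2, "video": 1.5, "article": 1.0}

def score_post(friend_interactions, recent_clicks_on_friend,
               avg_read_seconds, medium):
    """Higher score -> shown earlier in the feed."""
    affinity = 2.0 * friend_interactions           # how often you engage with this friend
    click_history = 1.5 * recent_clicks_on_friend  # their posts you clicked on recently
    dwell = 0.1 * avg_read_seconds                 # time spent reading similar posts
    return (affinity + click_history + dwell) * MEDIUM_WEIGHT[medium]

posts = [
    ("close friend's video", score_post(40, 12, 30, "video")),
    ("acquaintance's article", score_post(3, 0, 5, "article")),
]
feed = sorted(posts, key=lambda p: p[1], reverse=True)
print([name for name, _ in feed])  # close friend's video ranks first
```

The point of the sketch is the shape, not the numbers: posts from people you already interact with get multiplied advantages at every step, so they crowd everyone else out.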

 

Facebook is not hiding the fact that it filters users' News Feeds. On its Help Center page, the corporation clearly states: "The stories that show in your News Feed are influenced by your connections and activity on Facebook. This helps you to see more stories that interest you from friends you interact with the most." However, according to a study published by the University of Illinois at Urbana-Champaign, 62% of users do not know that their News Feeds are being manipulated. It's scary to think that well over half of Facebook users are completely unaware of Facebook's filtering process.

 

But what's even more troubling is how this filtering process is affecting our views on important issues. An ideal, unfiltered News Feed would contain a mix of posts from users' liberal and conservative friends. In reality, most users' feeds lean one way or the other: they predominantly display either liberal stories or conservative stories. In a liberal user's case, the News Feed is often dominated by liberal friends' posts, while conservative friends have all but disappeared — often to the point where the user has completely forgotten those "friends" because they never appear in the feed. This is a perfect example of an online filter bubble, because the user is being presented with only one-sided information and opinions. As a result, users develop highly polarized views and are unlikely to interact with people who have different perspectives.
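The disappearance of opposing voices isn't a one-time event; it's a feedback loop. A short toy model (purely illustrative — the engagement rates and update rule are assumptions, not measurements of Facebook's system) shows how a small initial lean compounds when the feed shows posts in proportion to past engagement:

```python
# Toy feedback-loop model of a filter bubble. The feed shows posts
# in proportion to past engagement, and the user engages mostly with
# same-leaning posts, so a slight lean snowballs into a one-sided feed.
# All rates here are illustrative assumptions.

def simulate_feed(initial_liberal_share, rounds=10, feed_size=100):
    share = initial_liberal_share
    history = [share]
    for _ in range(rounds):
        liberal_shown = share * feed_size
        conservative_shown = feed_size - liberal_shown
        # Assumption: the user engages with 90% of same-leaning posts
        # but only 10% of opposing ones.
        liberal_engaged = 0.9 * liberal_shown
        conservative_engaged = 0.1 * conservative_shown
        # Next round's feed mirrors this round's engagement.
        share = liberal_engaged / (liberal_engaged + conservative_engaged)
        history.append(share)
    return history

history = simulate_feed(0.55)
print(f"start: {history[0]:.0%}, after 10 rounds: {history[-1]:.0%}")
```

Starting from a modest 55% lean, the simulated feed becomes essentially all one-sided within a handful of rounds — the "friends" on the other side vanish without anyone choosing to unfriend them.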

 

The existence of these filter bubbles is troubling. Although the liberal user from the example above may not agree with his or her conservative friends' political views, it's still important that he or she is exposed to their posts. The presence of varied posts allows users to understand different rationales, engage in debate, exchange ideas, and ultimately become well-informed. Filter bubbles prevent this from happening; they keep users "isolated in a web of one".

 

