Filter bubbles are created by algorithms that search engines, news aggregators, and social networking platforms use to personalize and customize information for each user. If one takes information and opinion diversity as a democratic ideal, filter bubbles could therefore represent a threat to democracies. This leads to two questions: (i) How can people identify whether they are already inside a filter bubble? (ii) What can people do to escape filter bubbles?
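The basic mechanism can be illustrated with a toy sketch (the data and the scoring rule below are invented for illustration; real recommender systems are vastly more sophisticated): items whose topics overlap with a user's click history are ranked higher, so the feed gradually narrows toward what the user already engages with.

```python
# Toy illustration of personalized filtering. The articles, topics, and
# overlap-based score are invented; real ranking systems use far richer signals.

def personalize(articles, click_history, top_n=2):
    """Rank articles by topic overlap with the user's click history."""
    def score(article):
        return len(set(article["topics"]) & set(click_history))
    ranked = sorted(articles, key=score, reverse=True)
    return [a["title"] for a in ranked[:top_n]]

articles = [
    {"title": "Party A wins debate",  "topics": ["party_a", "politics"]},
    {"title": "Party B policy plan",  "topics": ["party_b", "politics"]},
    {"title": "Local sports results", "topics": ["sports"]},
]

# A user who only ever clicks on Party A content keeps seeing Party A content.
feed = personalize(articles, click_history=["party_a", "politics"])
print(feed)
```

Even this crude rule shows the feedback loop: each click reinforces the topics already in the history, and dissenting sources slide out of the visible feed.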

Politically relevant debates can sometimes become unpleasant, especially when two sides believe in the exclusive correctness of their views. There is a human tendency to classify media coverage that does not support one’s opinion as biased or incorrect (Giner-Sorolla & Chaiken, 1994; Shapiro & Bloch-Elkon, 2008). Shannon Fisher, who studied political leadership at the University of Virginia, made a general and important statement on this topic on Quora, an online question-and-answer platform open to everyone. She pointed out that “it is important for everyone to independently verify information gathered through social media and many news sources with a known political persuasion before presenting it to others as fact. Unfortunately, few people do this research” (Fisher, 2018). In his book Republic.com 2.0, Sunstein described the importance of unplanned, unanticipated encounters with material (news, articles) that people would not choose in advance; coming across topics and views they had not selected themselves can protect a good democratic system against fragmentation and extremism (Sunstein, 2009, pp. 5–6).


Figure: Presentation of additional information by the German web browser extension “Kontext”.

Sunstein’s depiction would be an ideal scenario, but sometimes people may be, or feel, already “trapped in a filter bubble”: they receive their news from a single group of sources that reflect their own ideologies. However, there are already software tools that investigate the distribution of news in detail and point back to the original sources. One of these tools is the German web browser extension “Kontext,” developed by Arne Semsrott, Moritz Klack, and Andy Lindemann. The extension presents additional information about the reactionary views of politicians and refers to the original sources. It might be useful for people who wish to have more contextual information (e.g., on a politician’s previous statements). The figure above shows an example of the extension. There are also several other digital tools that help you avoid being caught in a filter bubble, such as “Read Across The Aisle,” which lets you read news from a variety of sources and check the political orientation of each source, and another extension that helps you find out how polarizing the content in your news feed is compared to the feeds of your friends as a whole.

Bozdag and van den Hoven (2015) evaluated these online tools against different models of democracy in their paper “Breaking the Filter Bubble: Democracy and Design.” They demonstrated that “not all relevant models of democracy are represented in the overview of the instruments available for promoting diversity” (Bozdag & van den Hoven, 2015): the majority of the tools they studied were designed in line with the standards of liberal or deliberative models of democracy.

In the end, it is up to every user to actively work against filter bubbles if they do not wish to get trapped in them. Consider the following points:

  1. It surely makes sense to always scrutinize one’s point of view on political issues and cross-check it with facts.
  2. It might be worthwhile to give it a try and actively look for political information outside one’s interpersonal and informational network.
  3. It helps to develop a better understanding of technologies such as Google, Facebook, and YouTube, whose algorithms can indirectly promote filter bubbles.
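Tools that measure how polarizing a feed is (point 2 above) have to quantify diversity somehow. As a loose illustration of one possible approach (the outlet names and the choice of Shannon entropy as a measure are my own assumptions, not how any specific tool works), one could score the mix of sources in a feed:

```python
# Sketch: measuring the source diversity of a news feed with Shannon entropy.
# Outlet names are invented; real polarization tools use other metrics.
import math
from collections import Counter

def source_entropy(feed_sources):
    """Shannon entropy (in bits) of the source distribution of a feed.
    Higher values indicate a more diverse mix of sources."""
    counts = Counter(feed_sources)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

narrow_feed = ["outlet_a"] * 9 + ["outlet_b"]            # 90% from one outlet
mixed_feed = ["outlet_a", "outlet_b", "outlet_c", "outlet_d"] * 3

print(round(source_entropy(narrow_feed), 2))
print(round(source_entropy(mixed_feed), 2))
```

A feed dominated by a single outlet scores near zero, while an even mix of four outlets reaches the maximum of two bits; a user could track such a score over time as a rough self-check.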

To conclude this article, Pariser’s words are fitting: “If we want to know what the world really looks like, we have to understand how filters shape and skew our view of it” (Pariser, 2011, p. 83). Developing this understanding and making it accessible to each and every user seems to be a crucial challenge for technology researchers and media educators. If you are further interested in the idea of “filter bubbles,” take a look at Pariser’s TED talk.



Bozdag, E., & van den Hoven, J. (2015). Breaking the filter bubble: Democracy and design. Ethics and Information Technology, 17(4), 249–265.

Fisher, S. (2018). In what ways, if any, have you noticed a “feedback loop” in the media you consume (social network, video, and podcast), and how do you expose yourself to opposing viewpoints? [Online comment]. Retrieved from

Giner-Sorolla, R., & Chaiken, S. (1994). The causes of hostile media judgments. Journal of Experimental Social Psychology, 30(2), 165–180.

Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. Penguin UK.

Semsrott, A., Klack, M., & Lindemann, A. (2018). Kontext [Web browser extension]. Retrieved from

Shapiro, R. Y., & Bloch-Elkon, Y. (2008). Do the facts speak for themselves? Partisan disagreement as a challenge to democratic competence. Critical Review, 20(1–2), 115–139.

Sunstein, C. R. (2009). Republic.com 2.0. Princeton, NJ: Princeton University Press.

Myth 4: Algorithms leave users helplessly exposed to filter bubbles.