#NoFilter: Is the Google algorithm leading to a filter bubble? Findings of the German project #Datenspende

So-called filter bubbles are a frequently discussed subject. The term refers to homogeneous virtual spaces created by algorithms, in which people are confronted predominantly with information that matches their preferences, interests, and opinions (“Addressing Myths on Echo Chambers and Filter Bubbles in Online Networks”). Concerns about personalized media and their effects were expressed even before the filter bubble concept emerged (Sunstein, 2001). When Eli Pariser published his book “The Filter Bubble: What the Internet Is Hiding from You” in 2011, he coined the corresponding term and provoked a debate about whether personalized results distort users’ access to information and, thus, their perception of social reality. In a self-experiment, he asked two friends to type the search term “BP” into Google. One received results referring to environmental destruction, whereas the other received investment tips regarding the oil industry. His conclusion: algorithms try to provide the search results a user presumably wants to receive. Such distorted information flows, in turn, could split societies into sub-groups that are exposed only to certain types of information (Pariser, 2011; Thiel, 2012).

Do algorithms deliver personalized results, so that users end up living in filter bubbles? The crowdsourcing project #Datenspende in Germany investigated precisely this question. The partners involved were TU Kaiserslautern, AlgorithmWatch, and six German state media authorities. The research group focused on the Google algorithm and collected search result data from 4,400 donors during the five weeks leading up to the German parliamentary elections in 2017. To this end, the donors installed a browser plug-in that allowed the researchers to send queries to Google from the donors' computers up to six times a day. The queries covered the names of politicians and parties, and each query was carried out with the browser settings of the respective donor. The research team additionally recorded the approximate location, the time of the query, and whether the user was logged in with Google. No private search queries or personal data were stored (Datenspende: BTW17, 2017).
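
To make the collected fields more concrete, here is a minimal Python sketch of what one donated query record might have looked like. All field names and the excerpt of search terms are illustrative assumptions, not the project's actual schema or code.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional

@dataclass
class DonatedQuery:
    """Hypothetical record for one automatically fired search query."""
    donor_id: str                    # pseudonymous identifier of the donor's browser
    search_term: str                 # e.g. a politician's or a party's name
    timestamp: datetime              # time at which the plug-in sent the query
    approx_location: Optional[str]   # coarse location only, e.g. a region code
    logged_in: bool                  # whether the donor was logged in with Google
    result_urls: List[str]           # organic result URLs, in ranked order

# The plug-in queried a fixed list of politicians and parties up to six times a day.
SEARCH_TERMS = ["Angela Merkel", "Martin Schulz", "CDU", "SPD"]  # illustrative excerpt
```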

On average, when searching for politicians, every user received the same seven to eight search results out of nine. When searching for political parties, users received an average of five to six identical results. The results differed mainly with regard to regionally or locally relevant information. Whether the donor was logged in to Google did not significantly influence the results. However, the analysts also found small clusters of users who received results entirely different from those in most other result lists; no notable patterns have been identified within these clusters so far. Based on these findings, the #Datenspende research team concludes that the Google algorithm leaves only little room for personalization and thus does not necessarily support the assumption of a technically generated filter bubble (Zweig, 2017a, 2017b).
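
The headline figure of seven to eight identical results out of nine can be read as an average pairwise overlap between donors' result lists for the same query. The following Python sketch shows one simple way such an overlap could be computed; the function names and toy data are assumptions for illustration, not the project's actual analysis code.

```python
from itertools import combinations
from typing import Dict, List

def shared_results(list_a: List[str], list_b: List[str]) -> int:
    """Count how many result URLs two donors have in common for the same query
    (rank order is ignored; only the set overlap of the top results is counted)."""
    return len(set(list_a) & set(list_b))

def average_overlap(results_by_donor: Dict[str, List[str]]) -> float:
    """Average pairwise overlap across all donors for one search term."""
    pairs = list(combinations(results_by_donor.values(), 2))
    if not pairs:
        return 0.0
    return sum(shared_results(a, b) for a, b in pairs) / len(pairs)

# Toy example with three donors and nine results each: an average close to 9
# would indicate near-identical result lists, i.e. little personalization.
example = {
    "donor_1": [f"url_{i}" for i in range(9)],
    "donor_2": [f"url_{i}" for i in range(9)],
    "donor_3": [f"url_{i}" for i in [0, 1, 2, 3, 4, 5, 6, 7, 9]],  # one regional result differs
}
print(average_overlap(example))  # -> 8.33...
```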

Whether six identical results out of nine is reassuring is left to the reader's judgment. These results, though, should not be taken as empirical evidence that specifically “supports” or “rejects” assumptions about the existence of filter bubbles; they can be seen as preliminary hints.


Literature

Datenspende: BTW17. (2017). Datenspende. AlgorithmWatch. Retrieved from https://datenspende.algorithmwatch.org/

Pariser, E. (2011). The Filter Bubble: What the Internet Is Hiding from You. New York: Penguin Press.

Sunstein, C. R. (2001). Republic.com. Princeton: Princeton University Press.

Thiel, T. (2012, March 7). Im Netz wartet schon der übermächtige Doppelgänger. Frankfurter Allgemeine. Retrieved from http://www.faz.net/aktuell/feuilleton/buecher/rezensionen/sachbuch/eli-pariser-filter-bubble-im-netz-wartet-schon-der-uebermaechtige-doppelgaenger-11675351.html

Zweig, K. A. (2017a). Personalisierung bei der Google-Suche geringer als gedacht – hauptsächlich regionale Effekte. AlgorithmWatch. Retrieved from https://algorithmwatch.org/de/bei-der-google-suche-personalisierung-geringer-als-gedacht-hauptsaechlich-regionale-effekte/

Zweig, K. A. (2017b). Filterblase geplatzt? Kaum Raum für Personalisierung bei Google-Suchen zur Bundestagswahl 2017. AlgorithmWatch. Retrieved from https://algorithmwatch.org/de/filterblase-geplatzt-kaum-raum-fuer-personalisierung-bei-google-suchen-zur-bundestagswahl-2017/
