Abstract

This paper examines the negative consequences of the filter bubble phenomenon described by Eli Pariser in his book "The Filter Bubble: What the Internet Is Hiding from You" – the limitation of access to certain information caused by the excessive and constant personalization of content on the Web. Particular attention is given to the special role that search engines such as Google, Yahoo or Bing play in the creation of the filter bubble. The general aim of this paper is to raise awareness of the drawbacks of the current situation, to highlight the possible threats posed to deliberative democracy, and to briefly examine possible solutions. Although arguable, the problem we may be facing with the personalization of the Web and of search results is that it poses a number of threats to democratic society and to the values embodied in it, e.g. freedom of speech, access to information and pluralism of opinions. The first chapter examines the filter bubble phenomenon in the context of how search engines contribute, or might contribute, to its existence. In the author's view, although search engines are only one of the personalized services creating the filter bubble, the role they play as information intermediaries in the information-based society makes them one of the most important factors in the creation or destruction of the filter bubble. This part of the paper discusses the interplay between search engines and the information cocoons in which we apparently live. The second chapter examines the threats posed by the filter bubble, and thus by personalized Web search services. Following Sunstein's thought, this paper assumes that deliberative democracy requires not only that citizens have freedom of speech and the right to a pluralism of thought, but also that they are effectively exposed to others' views and opinions. Being closed in the filter bubble does not allow that to a significant degree; therefore, even though search engines do not block access to certain information but merely 'hide' it from users, this may be sufficient to threaten the effective exchange of opinions and thought, and thus deliberative democracy in general. Should this issue then be regulated? How? This question is discussed in the final chapter of the paper, which suggests that, while legal regulation is possible and needed, the field where change could happen most easily and quickly is the social norms governing the use of search engines. On the one hand, in principle, the public enjoys personalized Web content, as well as enhanced search results that it finds more accurate and helpful (the individual interest). On the other hand, as indicated above, the public interest might be endangered. Any discussion of possible regulation needs to acknowledge these two opposing points of view; the question of regulation therefore remains not easily solvable. Following Lessig's Code, the author examines whether a solution to the identified problem might be found through changes in architecture, social norms, law, or market forces. Ultimately, the author suggests that the issue of the filter bubble and search engines appears to be a moral question of civic responsibility, and that the first response to it could therefore be found in the proper shaping of social norms.
The ultimate response to the phenomenon of the filter bubble depends on the choice of values made by the members of society: a little comfort, or a small act of social responsibility in pursuit of protecting democratic values? Individualism or collectivism?
