Abstract
While multiple scholarly disciplines have scrutinised algorithms and their social power, this article examines algorithmic recommender systems in terms of their potential to reduce exposure diversity online. The paper positions the debate on algorithmic content curation within the theoretical framework of media diversity studies, emphasising the normative ideal of media diversity as a social good and examining the impact of algorithmic behaviour on personal autonomy, with a specific focus on exposure diversity in online environments. It then introduces the shrinkage funnel as a concept complementary to Pariser's (2011) notion of the “filter bubble,” discussing it in terms of online exposure diversity, personal choice sovereignty, and its potential to trigger the illusory truth effect. The paper reviews selected examples of content-restrictive algorithmic behaviour to demonstrate how shrinkage funnels can undermine personal autonomy and contribute to detrimental societal or political outcomes. By distinguishing between filter bubbles and shrinkage funnels, this study provides a foundation for future empirical investigations and potential policy interventions aimed at promoting exposure diversity in algorithmically curated online environments.