Abstract

In recent years, a growing number of journalistic and scholarly publications have paid particular attention to the broad societal impact of YouTube’s video recommendation system. Chief among the concerns are the algorithm’s alleged contributions to the formation of echo chambers, filter bubbles, polarization, radicalization, disinformation, and the malicious use of information. Beyond these issues, potential biases of the recommendation system in favor of a small number of videos, content producers, or channels would further exacerbate the problem, especially while a systematic understanding of the algorithm’s inherent nature and characteristics is lacking. In this study, we investigate the structure of recommendation networks and the probability distributions of the node-centric influence of recommended videos. Adopting a stochastic approach, we examine PageRank distributions over a diverse set of recommendation graphs that we collected and built from eight different real-world scenarios. In total, we analyzed 803,210 recommendations made by YouTube’s recommendation algorithm, based on specific search queries and seed datasets from previous studies. We demonstrate a structural, systemic, and inherent tendency of YouTube’s video recommendation system to favor a tiny fraction of videos in each scenario. We believe this work sets the stage for further research on predictive modeling techniques that reduce bias in video recommendation systems and make their algorithms fairer, reducing potential harmful social and ethical impacts and increasing public trust in these systems.
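The stochastic approach described above, treating recommendations as edges of a directed graph and measuring each video’s node-centric influence with PageRank, can be sketched as follows. This is a minimal illustrative example, not the paper’s actual pipeline: the toy graph, damping factor, and function names are assumptions introduced here for exposition.

```python
def pagerank(graph, damping=0.85, iters=100):
    """Power-iteration PageRank.

    graph: dict mapping each video to the list of videos it recommends
    (outgoing edges). Returns a dict of influence scores summing to 1.
    """
    nodes = list(graph)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1.0 - damping) / n for v in nodes}
        for v, outs in graph.items():
            if outs:
                share = damping * rank[v] / len(outs)
                for w in outs:
                    new[w] += share
            else:
                # Dangling node: redistribute its mass uniformly.
                for w in nodes:
                    new[w] += damping * rank[v] / n
        rank = new
    return rank

# Hypothetical recommendation graph in which most videos point to "A";
# PageRank then concentrates the influence mass on that one video,
# the kind of skew the study measures at scale.
graph = {
    "A": ["B"],
    "B": ["A"],
    "C": ["A"],
    "D": ["A"],
    "E": ["A"],
}
ranks = pagerank(graph)
```

In this toy network, `max(ranks, key=ranks.get)` is `"A"`: a handful of heavily recommended videos capture most of the probability mass, which is the node-centric bias pattern the study reports across its eight scenarios.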
