Abstract

A growing body of interdisciplinary literature suggests that algorithmic bias in recommender systems can have severe consequences for sociotechnical systems. Audits of multiple social media platforms have documented implicit bias in the content, products, information, or connections recommended to users, sometimes along lines of gender, sociocultural background, or political affiliation. In this context, YouTube’s video recommendation algorithm has been the subject of public debate over its broad societal impact and alleged contribution to the spread of harmful content. However, the literature still lacks a comprehensive understanding of, and broad agreement on, whether video recommendations on YouTube are biased in favor of a small set of items under generalizable conditions. Moreover, characterizing potential recommendation bias is a non-trivial problem given the black-box nature of the system. This study addresses both problems by adopting a graphical probabilistic approach and assessing the structural properties of video recommendation networks. We take a stochastic approach and examine PageRank distributions over a diverse set of recommendation graphs (256,725 videos, 803,210 recommendations) built from 8 distinct datasets categorized by language, topic, and system entry point. We find structural and systemic bias in video recommendations, although the level and behavior of biased recommendations vary across experiments. This study sets the stage for further research on comprehensive evaluation methodologies that address the severe effects of bias on fairness, diversity, coverage, exposure, security, and utility.
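To make the method described above concrete, the sketch below computes PageRank by power iteration over a toy directed "recommendation graph" (edges point from a video to a recommended video) and inspects the resulting score distribution. This is a minimal illustration, not the paper's actual pipeline: the graph, node names, and parameters are hypothetical, and the paper's 8 datasets are far larger.

```python
# Hedged sketch: power-iteration PageRank over a toy recommendation graph.
# All data here is invented for illustration.

def pagerank(edges, damping=0.85, iters=100):
    """Compute PageRank scores for a directed graph given as (source, target) pairs."""
    nodes = sorted({n for e in edges for n in e})
    out = {v: [] for v in nodes}
    for s, t in edges:
        out[s].append(t)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        # Teleportation term distributes (1 - damping) uniformly.
        new = {v: (1.0 - damping) / n for v in nodes}
        for v in nodes:
            if out[v]:
                share = damping * rank[v] / len(out[v])
                for t in out[v]:
                    new[t] += share
            else:
                # Dangling node: spread its mass uniformly over all nodes.
                for t in nodes:
                    new[t] += damping * rank[v] / n
        rank = new
    return rank

# A small "rich get richer" structure: most videos recommend video "A".
edges = [("B", "A"), ("C", "A"), ("D", "A"), ("A", "B")]
scores = pagerank(edges)
print(max(scores, key=scores.get))  # prints "A": the hub dominates the distribution
```

A heavily skewed PageRank distribution of this kind, where probability mass concentrates on a small set of nodes, is the structural signal of bias that the study examines.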
