Abstract
With over two billion monthly active users, YouTube currently shapes the landscape of online political video consumption, with 25% of adults in the United States regularly consuming political content via the platform. Considering that nearly three-quarters of the videos watched on YouTube are delivered via its recommendation algorithm, the propensity of this algorithm to create echo chambers and deliver extremist content has been an active area of research. However, it remains unclear whether the algorithm exhibits political leanings toward either the Left or the Right. To fill this gap, we constructed archetypal users across six personas in the US political context, ranging from Far Left to Far Right. Using these personas, we performed a controlled experiment in which they consumed over eight months' worth of videos and were recommended over 120,000 unique videos. We find that while the algorithm pulls users away from political extremes, this pull is asymmetric: users are pulled away from Far Right content more strongly than from Far Left content. Furthermore, we show that the algorithm's recommendations skew left even when a user has no watch history. Our results raise questions about whether the recommendation algorithms of social media platforms in general, and YouTube in particular, should exhibit political biases, and about the wide-reaching societal and political implications such biases could entail.