Abstract

What people see and read in the media is no longer determined only by journalists but increasingly by algorithms that select, sort, and prioritize our information. Automated processes such as YouTube’s recommendation algorithm shape our views of the world. An important democratic question is whether YouTube’s recommendation algorithm incidentally exposes audiences to political information after they watch entertainment content, and whether it creates a filter bubble by primarily recommending content with a similar political perspective. During the 2019 Danish parliamentary elections, the recommendation algorithm was more likely to lead viewers away from news and public affairs than towards political content. After a video posted by the political parties Venstre or Stram Kurs, the algorithm primarily recommended videos from the same party, which could strengthen confirmation bias and reinforce political beliefs. For other parties, this was less the case. Little evidence was found that the recommendation algorithm leads viewers from mainstream content to extreme-right content.
