Abstract

In today’s digital world, understanding how YouTube’s recommendation systems guide what we watch is crucial. This study examines these systems, revealing how they shape the content we see over time. We found that YouTube’s algorithms tend to steer recommendations toward particular kinds of content, affecting the variety and type of videos presented to viewers. To uncover these patterns, we used a mixed-methods approach to analyze videos recommended by YouTube: we looked at the emotions conveyed in videos, the moral messages they might carry, and whether they contained harmful content. Our research also involved statistical analysis to detect biases in how these videos are recommended and network analysis to see how certain videos become more influential than others. Our findings show that YouTube’s algorithms can narrow the content landscape, limiting the diversity of what gets recommended. This has important implications for how information is spread and consumed online and suggests a need for more transparency and fairness in how these algorithms work. In summary, this paper highlights the need for a more inclusive approach to how digital platforms recommend content. By better understanding the impact of YouTube’s algorithms, we can work towards a digital space that offers a wider range of perspectives and voices, affords fairness, and enriches everyone’s online experience.
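To make the network-analysis step concrete, below is a minimal sketch, not the authors' code, of how influence in a recommendation graph might be estimated: each crawled pair (source video, recommended video) becomes a directed edge, and a centrality measure such as PageRank scores which videos the algorithm funnels viewers toward. The networkx library, the example video IDs, and the choice of PageRank are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (illustrative only): estimating which videos become
# "influential" in a recommendation graph, assuming crawled
# (source_video, recommended_video) pairs are available.
import networkx as nx

# Hypothetical crawl output: each tuple means recommended_id appeared in the
# "up next" list while source_id was playing.
recommendation_pairs = [
    ("vid_A", "vid_B"),
    ("vid_A", "vid_C"),
    ("vid_B", "vid_C"),
    ("vid_D", "vid_C"),
    ("vid_C", "vid_B"),
]

# Directed graph: an edge source -> target means "target was recommended from source".
graph = nx.DiGraph()
graph.add_edges_from(recommendation_pairs)

# PageRank as one possible influence measure: videos recommended from many
# (themselves well-recommended) videos score higher.
influence = nx.pagerank(graph, alpha=0.85)

for video, score in sorted(influence.items(), key=lambda kv: -kv[1]):
    print(f"{video}: {score:.3f}")
```

A heavily skewed score distribution in such a graph would be one symptom of the narrowing effect the abstract describes, with a few videos absorbing most recommendation traffic.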
