Abstract

In January 2019, YouTube announced its platform would exclude potentially harmful content from video recommendations while allowing such videos to remain on the platform. While this action is intended to reduce YouTube's role in propagating such content, the continued availability of these videos via hyperlinks in other online spaces leaves open the question of whether such actions actually impact sharing of these videos in the broader information space. This question is particularly important as other online platforms deploy similar suppressive actions that stop short of deletion despite limited understanding of such actions' impacts. To assess this impact, we apply interrupted time series models to measure whether sharing of potentially harmful YouTube videos on Twitter and Reddit changed significantly in the eight months around YouTube's announcement. We evaluate video sharing across three curated sets of anti-social content: a set of conspiracy videos that have been shown to experience reduced recommendations on YouTube, a larger set of videos posted by conspiracy-oriented channels, and a set of videos posted by alternative influence network (AIN) channels. As a control, we also evaluate these effects on a dataset of videos from mainstream news channels. Results show that conspiracy-labeled and AIN videos with evidence of YouTube's de-recommendation do experience a significant decreasing trend in sharing on both Twitter and Reddit. At the same time, however, videos from conspiracy-oriented channels actually experience a significant increase in sharing on Reddit following YouTube's intervention, suggesting these actions may have the unintended consequence of pushing less overtly harmful conspiratorial content. Mainstream news sharing likewise shows an increasing trend on both platforms, suggesting YouTube's suppression of particular content types has a targeted effect. In summary, while this work finds evidence that reducing exposure to anti-social videos within YouTube potentially reduces sharing on other platforms, increases in the level of conspiracy-channel sharing raise concerns about how producers -- and consumers -- of harmful content are responding to YouTube's changes. Transparency from YouTube and other platforms implementing similar strategies is needed to evaluate these effects further.
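
The interrupted time series analysis mentioned above can be illustrated with a standard segmented-regression setup. The sketch below fits a daily sharing series with a baseline trend, a level shift, and a slope change at the intervention; the synthetic data, column names, announcement date, and OLS specification are assumptions for illustration only, not the paper's actual models or data.

```python
# A minimal sketch of an interrupted time series (segmented regression)
# model of daily video-sharing counts. The data are synthetic and the
# intervention date (2019-01-25) is assumed for illustration; this is
# not the paper's actual implementation.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical daily counts of posts linking to a video set, spanning
# roughly four months on either side of the announcement.
rng = np.random.default_rng(0)
dates = pd.date_range("2018-09-25", "2019-05-25", freq="D")
shares = rng.poisson(lam=50, size=len(dates))
df = pd.DataFrame({"date": dates, "shares": shares})

# Segmented-regression design: baseline linear trend, a level shift at
# the intervention, and a post-intervention change in slope.
intervention = pd.Timestamp("2019-01-25")          # assumed announcement date
df["time"] = np.arange(len(df))                    # days since series start
df["post"] = (df["date"] >= intervention).astype(int)
t0 = int(df.loc[df["post"] == 1, "time"].min())    # first post-intervention day
df["time_post"] = (df["time"] - t0).clip(lower=0)  # days since intervention

# shares_t = b0 + b1*time + b2*post + b3*time_post + error_t
model = smf.ols("shares ~ time + post + time_post", data=df).fit()
print(model.summary())
```

In a specification like this, the coefficient on `post` estimates an immediate level change and the coefficient on `time_post` estimates a change in slope after the announcement, corresponding to the kinds of level and trend changes the abstract describes.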
