Abstract
Vaccinations play a critical role in mitigating the impact of COVID-19 and other diseases. Past research has linked misinformation to increased hesitancy and lower vaccination rates. Gaps remain in our knowledge about the main drivers of vaccine misinformation on social media and effective ways to intervene. Our longitudinal study had two primary objectives: (1) to investigate the patterns of prevalence and contagion of COVID-19 vaccine misinformation on Twitter in 2021, and (2) to identify the main spreaders of vaccine misinformation. Given our initial results, we further considered the likely drivers of misinformation and its spread, providing insights for potential interventions. We collected almost 300 million English-language tweets related to COVID-19 vaccines using a list of over 80 relevant keywords over a period of 12 months. We then extracted and labeled news articles at the source level based on third-party lists of low-credibility and mainstream news sources, and measured the prevalence of different kinds of information. We also considered suspicious YouTube videos shared on Twitter. We focused our analysis of vaccine misinformation spreaders on verified and automated Twitter accounts. Our findings showed a relatively low prevalence of low-credibility information compared to the entirety of mainstream news. However, the most popular low-credibility sources had reshare volumes comparable to those of many mainstream sources, and larger than those of authoritative sources such as the US Centers for Disease Control and Prevention and the World Health Organization. Throughout the year, we observed an increasing trend in the prevalence of low-credibility news about vaccines. We also observed a considerable number of suspicious YouTube videos shared on Twitter. Tweets by a small group of approximately 800 Twitter-verified "superspreaders" accounted for approximately 35% of all reshares of misinformation on an average day, with the top superspreader (@RobertKennedyJr) responsible for over 13% of those retweets. Finally, low-credibility news and suspicious YouTube videos were more likely to be shared by automated accounts. The wide spread of misinformation around COVID-19 vaccines on Twitter during 2021 shows that there was an audience for this type of content. Our findings are also consistent with the hypothesis that superspreaders are driven by financial incentives that allow them to profit from health misinformation. Despite high-profile cases of deplatformed misinformation superspreaders, our results show that, in 2021, a few individuals still played an outsized role in the spread of low-credibility vaccine content. As a result, social media moderation efforts would be better served by focusing on reducing the online visibility of repeat spreaders of harmful content, especially during public health crises.
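To make the measurement approach concrete, the following is a minimal sketch, not the authors' code, of how source-level labeling and the "superspreader" share described above could be computed. The domain list, file name, and column names (tweet_id, retweeted_user_id, url, date) are illustrative assumptions rather than the study's actual data or schema.

```python
# Hedged sketch: label tweets by the credibility of the linked news domain,
# then estimate what fraction of low-credibility reshares trace back to the
# most-retweeted accounts. All identifiers below are hypothetical.
from urllib.parse import urlparse
import pandas as pd

# Hypothetical stand-in for a third-party list of low-credibility domains.
LOW_CREDIBILITY_DOMAINS = {"examplefakenews.com", "dubioushealthsite.net"}

def extract_domain(url: str) -> str:
    """Return the domain of a URL, stripping a leading 'www.'."""
    netloc = urlparse(url).netloc.lower()
    return netloc[4:] if netloc.startswith("www.") else netloc

# Assumed schema: one row per tweet or retweet containing a link.
tweets = pd.read_csv("vaccine_tweets_with_links.csv")  # hypothetical file

# Source-level labeling: a tweet is low-credibility if the linked
# article's domain appears on the low-credibility list.
tweets["domain"] = tweets["url"].map(extract_domain)
tweets["low_credibility"] = tweets["domain"].isin(LOW_CREDIBILITY_DOMAINS)

# Daily prevalence of low-credibility links among all vaccine-related links.
daily_prevalence = tweets.groupby("date")["low_credibility"].mean()

# Superspreader share: fraction of low-credibility reshares attributable
# to the accounts whose posts are retweeted most often.
reshares = tweets[tweets["low_credibility"] & tweets["retweeted_user_id"].notna()]
per_account = reshares["retweeted_user_id"].value_counts()
top_accounts = per_account.head(800)  # e.g., a fixed-size group of top accounts
superspreader_share = top_accounts.sum() / per_account.sum()
print(f"Share of misinformation reshares from top accounts: {superspreader_share:.1%}")
```

In this sketch, prevalence is the share of low-credibility links among all collected vaccine-related links per day, and the superspreader share is the proportion of misinformation retweets whose original author falls in the most-retweeted group; the cutoff of 800 accounts mirrors the group size reported above but is otherwise an arbitrary parameter.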