Abstract

Misinformation spreads rapidly online. The standard approach to countering it relies on expert fact-checkers who follow forensic processes to assess the veracity of statements. Unfortunately, this approach does not scale well. Crowdsourcing has therefore been explored as a way to complement the work of trained journalists. In this article, we study the effect of presenting crowd workers with evidence from others while they judge the veracity of statements. We implement variants of the judgment task design to understand whether and how the presented evidence affects the way crowd workers judge truthfulness, and how it affects their performance. Our results show that, in certain cases, the presented evidence, and the way in which it is presented, may mislead crowd workers who would otherwise be more accurate judging independently. Workers who make appropriate use of the provided evidence, however, benefit from it and produce better judgments.
