Abstract

We provide a model to investigate the tension between information aggregation and the spread of misinformation in large societies (conceptualized as networks of agents communicating with each other). Each individual holds a belief represented by a scalar. Individuals meet pairwise and exchange information, which is modeled as both individuals adopting the average of their pre-meeting beliefs. When all individuals engage in this type of information exchange, the society is able to effectively aggregate the initial information held by all individuals. There is also the possibility of misinformation, however, because some of the individuals are forceful, meaning that they influence the beliefs of (some of) the other individuals they meet, but do not change their own opinion. The paper characterizes how the presence of forceful agents interferes with information aggregation. Under the assumption that even forceful agents obtain some information (however infrequently) from some others (and additional weak regularity conditions), we first show that beliefs in this class of societies converge to a consensus among all individuals. This consensus value is a random variable, however, and we characterize its behavior. Our main results quantify the extent of misinformation in the society by providing bounds or exact results (in some special cases) on how far the consensus value can be from the benchmark without forceful agents (where there is efficient information aggregation). The worst outcomes obtain when there are several forceful agents and forceful agents themselves update their beliefs only on the basis of information they obtain from individuals most likely to have received their own information previously.
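
To make the belief dynamics described above concrete, the sketch below simulates a simplified version of this kind of pairwise exchange. It is not the paper's exact model: the uniform random meetings, the influence rule in which an influenced agent adopts the simple average, the probability `eps` with which a forceful agent also updates, and the function name `simulate` are all illustrative assumptions.

```python
import random

def simulate(beliefs, forceful, steps=100_000, eps=0.05, seed=0):
    """Toy simulation of pairwise belief exchange with forceful agents.

    beliefs  : list of initial scalar beliefs (one per agent)
    forceful : set of indices of forceful agents
    eps      : assumed small probability that a forceful agent also updates,
               reflecting that even forceful agents obtain some information,
               however infrequently
    """
    rng = random.Random(seed)
    b = list(beliefs)
    n = len(b)
    for _ in range(steps):
        i, j = rng.sample(range(n), 2)      # a random pairwise meeting
        avg = (b[i] + b[j]) / 2
        if i in forceful:
            # forceful agent i influences j but usually keeps its own belief;
            # with probability eps it also absorbs information from j
            b[j] = avg
            if rng.random() < eps:
                b[i] = avg
        elif j in forceful:
            b[i] = avg
            if rng.random() < eps:
                b[j] = avg
        else:
            # regular exchange: both individuals adopt the average
            b[i] = b[j] = avg
    return b

if __name__ == "__main__":
    init = [float(k) for k in range(10)]    # efficient-aggregation benchmark: 4.5
    final = simulate(init, forceful={0})
    print("approximate consensus value:", sum(final) / len(final))
```

Running the example with one forceful agent typically yields a consensus value pulled toward that agent's initial belief and away from the benchmark average, which is the kind of gap the paper's bounds quantify.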
