Abstract

We study learning via shared news. Each period agents receive the same quantity and quality of firsthand information and can share it with friends. Some friends (possibly few) share selectively, generating heterogeneous news diets across agents. Agents are aware of selective sharing and update beliefs by Bayes's rule. Contrary to standard learning results, we show that beliefs can diverge in this environment, leading to polarization. This requires that (i) agents hold misperceptions (even minor) about friends' sharing and (ii) information quality is sufficiently low. Polarization can worsen when agents' friend networks expand. When the quantity of firsthand information becomes large, agents can hold opposite extreme beliefs, resulting in severe polarization. We find that news aggregators can curb polarization caused by news sharing. Our results hold without media bias or fake news, so eliminating these is not sufficient to reduce polarization. When fake news is included, it can lead to polarization but only through misperceived selective sharing. We apply our theory to shed light on the polarization of public opinion about climate change in the United States.

