Abstract

Social media platforms have been found to be the primary gateway through which individuals are exposed to fake news. The algorithmic filter bubbles and echo chambers that these platforms have popularized may also increase exposure to fake news. Because of this, scholars have suggested disrupting the stream of congruent information that filter bubbles and echo chambers produce, as this may reduce the impact and circulation of misinformation. To test this, a survey experiment was conducted via Amazon MTurk. Participants read 10 short stories that were either all fake or half real and half fake. These treatment conditions were made up of stories agreeable to the perspective of Democrats, Republicans, or a mix of both. The results show that participants assigned to conditions agreeable to their political worldview found fake stories more believable than participants who received a heterogeneous mix of news stories complementary to both worldviews. However, this "break up" effect appears confined to Democratic participants; findings indicate that Republicans assigned to filter bubble treatment conditions believed fake news stories at approximately the same rate as their fellow partisans receiving a heterogeneous mix of news items. This suggests that a potential "break up" may only influence more progressive users.
