Abstract

Global debate over the role of algorithms and search in shaping political opinion has intensified in the aftermath of controversial election results in Europe and the US. Powerful images of the Internet enabling access to a global treasure trove of information have given way to worries that those who use social media and online tools such as search engines are being fed inaccurate, fake, or politically targeted information that could distort public opinion and political change. Serious questions have been raised over the political implications of any biases embedded in the algorithms that drive search engines and social media. Do biases in digital media shape access to information and, in turn, public opinion? To address these issues, we conducted an online survey of stratified random samples of Internet users in seven nations: Britain, France, Germany, Italy, Poland, Spain, and the US. We asked respondents how they use search, social media, and other media for political information, and what difference it makes to them. The findings cast doubt on technologically deterministic perspectives on search, such as filter bubbles. For example, our findings show that search is one of an array of media consulted by those interested in politics; Internet users are not trapped in a bubble on a single platform. Another deterministic narrative concerns echo chambers, in which social media enable users to cocoon themselves with like-minded people and viewpoints. However, most of those interested in politics search for and double-check problematic political information and expose themselves to a variety of viewpoints. Thus, prevailing views on search and politics not only overestimate technical determinants but also underestimate the social shaping of the Internet, social media, and search. National media cultures and systems play an important role in shaping search practices, along with individual differences in political and Internet orientations. The findings suggest that there are disproportionate levels of concern, often approaching panic, over the bias of search and social media, and that targeted interventions could help reduce the risks associated with fake news, filter bubbles, and echo chambers.
