Abstract

Personalisation of media content is not a new phenomenon. Now, however, by configuring our search results and data feeds, algorithms that ‘learn’ from our digital footprint are determining what we see and hear. Pariser calls this the ‘Filter Bubble Effect’. Yet, despite concerns that this effect is a threat to deliberative democracy, we are told there is relatively little evidence to substantiate its existence. This article draws on a case study to argue that this is because the existing research looks for technical effects while neglecting our social lives. If we follow Foucault’s reasoning that systems of thought are also technologies, then we can see that material technologies (or what Foucault called ‘technologies of production’) and immaterial technologies (ideas formed in discourse) can co-constitute filter bubbles. Borrowing language from computing and from science and technology studies, the article redefines filter bubbles as socio-technical recursion. The case study illustrates just one potential combination of such material and immaterial technologies (namely, search engines and the ideas an individual encounters and forms during their social life within their culture and class) that can create socio-technical recursion. The article concludes by arguing that the advantage of conceptualising filter bubbles in this way is that it offers a theoretical foundation for breaking out of this recursion by simultaneously challenging the mediums and messages that sustain them.
