Abstract

Following the release of large language models in the late 2010s, the backers of this new type of artificial intelligence (AI) publicly affirmed that the technology is controversial and harmful to society. This situation sets contemporary AI apart from 20th-century controversies about technoscience, such as nuclear power and genetically modified (GM) foods, and disrupts established assumptions concerning public controversies as occasions for technological democracy. In particular, it challenges the idea that such controversies enable inclusion and collective processes of problem definition (‘problematisation’) across societal domains. In this paper, we show how social research can contribute to addressing this challenge of AI controversies by adopting a distinctive methodology of controversy analysis: controversy elicitation. This approach actively selects, qualifies and evaluates controversies in terms of their capacity to problematise AI across the science and non-science binary. We describe our implementation of this approach in a participatory study of recent AI controversies, conducted through consultation with UK experts in AI and society. Combining an online questionnaire, social media analysis and a participatory workshop, our study suggests that civil society actors have developed distinctive strategies of problematisation that counter the strategic affirmation of AI’s controversiality by its proponents and which centre on the public mobilisation of AI-related incidents: demonstrations of bias, accidents and walkouts. Crucially, this emphasis on ‘AI frictions’ does not result in the fragmentation of AI controversies, but rather enables the articulation of AI as a ‘super-controversy’: the explication of connections between technical propositions, situated troubles and structural problems in society (discrimination, inequalities and corporate power).
