Abstract

Digital technologies have provided governments across the world with new tools of political and social control. Particularly alarming is the development of algorithmic governance in China, where plans have been released to develop a digital Social Credit System (SCS). Still in an exploratory stage, the SCS, as a collection of national and local pilots, is framed officially as an all-encompassing project aimed at building trust in society through the regulation of both economic and social behaviors. Grounded in the case of China’s SCS, this article interrogates the application of algorithmic rating to expanding areas of everyday life through the lens of the Frankfurt School’s critique of instrumental reason. It explores how the SCS reduces the moral and relational dimension of trust in social interactions, and how algorithmic technologies, thriving on a moral economy characterized by impersonality, impede the formation of trust and trustworthiness as moral virtues. The algorithmic rationality underlying the SCS undermines the ontology of relational trust, forecloses its transformative power, and disrupts social and civic interactions that are non-instrumental in nature. Re-reading and extending the Frankfurt School’s theorization of reason and the technological society, especially the works of Horkheimer, Marcuse, and Habermas, this article reflects on the limitations of algorithmic technologies in social governance. A Critical Theory perspective awakens us to the importance of human reflexivity in the use and circumscription of algorithmic rating systems.

Highlights

  • The development of big data and algorithmic technologies has enabled governments across the world to fashion new modes of political and social control

  • I discuss in the rest of the article how the algorithmic rationality (Lowrie, 2017) or algorithmic governmentality (Rouvroy, 2013) animating the Social Credit System (SCS) contradicts the ontology of relational trust and undermines its formation in social/civic interactions

  • I argue that algorithmic scoring of trustworthiness defeats the purpose of moral engineering as it undermines the ontology of relational trust by authorizing a moral economy antithetical to the workings of trust in social and civic relationships

Summary

Introduction

The development of big data and algorithmic technologies has enabled governments across the world to fashion new modes of political and social control. An epitome of this emerging trend of algorithmic governance is China’s plan to build a Social Credit System (SCS), which has evoked fear internationally of an Orwellian technodystopia. While a growing literature has empirically investigated the mechanics of the SCS in China, fundamental theoretical questions remain unanswered. This article addresses this void and interrogates, more broadly, the increasing embrace of algorithmic rationality in social governance. In the rest of the article, I draw on the Frankfurt theorists’ critique of formalized or instrumental reason, particularly the works of Horkheimer, Marcuse, and Habermas, to elaborate on the ways in which the SCS—as an epitome of the quantification of the social—disenchants and flattens moral values such as trust and trustworthiness.

Beyond Surveillance
Trust as a Moral Concept
Trust and Moral Autonomy
From Instrumental Reason to Algorithmic Rationality
Disenchanting Trust
Findings
Conclusions
