Abstract
The article explores algorithmic governance through the lens of Foucault's work on governmentality. Algorithms are understood as “technologies of power” that “subjectify” the individuals upon whom they act. Our main focus is on Syrian refugees in two national contexts - Estonia and Turkey - and we consider four types of algorithms to which refugees are subjected: relocation, police risk scoring, recommendation algorithms and online advertisements. Through a series of interviews with 19 refugees and 24 data experts, we explore the “algorithmic imaginaries” of these technologies held by both refugees and Estonian experts who work with migration data. Our findings show that, while relocation and police risk-scoring algorithms are perceived as technologies of power that produce macro-differences without paying sufficient attention to individual needs, recommendation and ad algorithms are seen as less threatening, i.e. as “technologies of the self”. On this basis, we suggest reconsidering algorithmic governance as an iterative practice that could eventually transform the datafied “knowledge of the self”, suitable for algorithms, into a true “care of the self”.