Abstract

Data science, big data, artificial intelligence, and other vocabulary related to data technologies dominate today’s popular and academic discourse. These buzzwords undoubtedly denote a contemporary dynamic in science, business, and government. At the same time, the terminology sometimes tends to obscure that these technologies belong to a broader historical development of measurement and classification, as well as to a more established scientific discipline, statistics. Philosophical, historical, and sociological treatments of measurement, quantification, and statistics should therefore not be regarded as merely adjacent but as integral to research on the implications of data technologies. In this paper, I will exemplify how findings from the philosophy of measurement can be transferred to the current debate on data ethics. In doing so, I will draw on a recent epistemic turn in the understanding of data and measurement: the epistemology of measurement and the relational account of data. I will focus on how the standardization of measurement and data relates to our systems of knowledge. Based on this investigation, I propose an understanding of data technologies as potential arbiters of an unprecedentedly standardized reality. Due to their unparalleled expansion, statistical data technologies have the potential to spread stable, standardized conceptions of phenomena into ever more aspects of reality. Ultimately, this understanding of modern statistical data technologies warrants critical reflection on the interpretation of the phenomenon that underlies any adopted metric: an “ontological responsibility” of the data practitioner.
