Abstract

The twenty-first-century rise of big data marks a significant break with statistical notions of what is of interest or concern. The vast expansion of digital data has been closely intertwined with the development of advanced analytical algorithms with which to make sense of the data. The advent of techniques of knowledge discovery affords some capacity for the analytics to derive the object or subject of interest from clusters and patterns in large volumes of data, otherwise imperceptible to human reading. Thus, the scale of the big in big data is of less significance to contemporary forms of knowing and governing than what we will call the little analytics. Following Henri Bergson's analysis of forms of perception which ‘cut out’ a series of figures detached from the whole, we propose that analytical algorithms are instruments of perception without which the extensity of big data would not be comprehensible. The technologies of analytics focus human attention and decision on particular persons and things of interest, whilst annulling or discarding much of the material context from which they are extracted. Following the algorithmic processes of ingestion, partitioning and memory, we illuminate how the use of analytics engines has transformed the nature of analysis and knowledge and, thus, the nature of the governing of economic, social and political life.
