Abstract

The people who make algorithmic recommender systems want apparently incompatible things: they pride themselves on the scale at which their software works, but they also want to treat their materials and users with care. Care and scale are commonly understood as contradictory goals: to be careful is to work at small scale, while working at large scale requires abandoning the small concerns of care. Drawing together anthropological work on care and scale, this article analyzes how people who make music recommender systems try to reconcile these values, reimagining what care and scale mean and how they relate to each other in the process. It describes decorrelation, an ethical technique that metaphorically borrows from the mathematics of machine learning, which practitioners use to reimagine how values might relate to each other. This “decorrelative ethics” facilitates new arrangements of care and scale, which challenge conventional anthropological theorizing.
