Abstract
When US President Barack Obama publicly addressed the data mining and analysis activities of the National Security Agency (NSA), he appealed to a familiar weighing of the countervailing forces of security and privacy. “The people at the NSA don't have an interest in doing anything other than making sure that where we can prevent a terrorist attack, where we can get information ahead of time, we can carry out that critical task,” he stated. “Others may have different ideas,” he suggested, about the balance between “the information we can get” and the “encroachments on privacy” that might be incurred (Obama 2013). Conventional calculations of security weigh the probability of a future threat on the basis of information gathered on a distribution of past events. Obama's trading-off of security and privacy shares this logic: a calculation of how much data on past events can tolerably be gathered in order to prevent threats in the future. In fact, though, the very NSA programs he addresses precisely confound the weighing of probable threat, and the conventions of security and privacy that adhere to strict probabilistic reasoning. The contemporary mining and analysis of data for security purposes invites novel forms of inferential reasoning such that even the least probable elements can be incorporated and acted upon. I have elsewhere described these elements of possible associations, links, and threats as “data derivatives” (Amoore 2011) that are decoupled from underlying values and do not meaningfully belong to an identifiable subject. The analysis of data derivatives for security poses significant difficulties for the idea of a data subject with a recognizable body of rights to privacy, to liberty, and to justice.