Abstract

Police departments large and small have begun to use data mining techniques to predict the where, when, and who of crime before it occurs. But data mining systems can have a disproportionately adverse impact on vulnerable communities, and predictive policing is no different. Reviewing the technical process of predictive policing, the Article begins by illustrating how use of predictive policing technology will often result in a disparate impact on communities of color. After evaluating the possibilities for Fourth Amendment regulation and finding them wanting, the Article turns toward a new regulatory proposal. The Article proposes the use of a rulemaking procedure centered on “discrimination impact assessments.” Predictive policing, like many data mining solutions, is sold in part as a “neutral” method to counteract unconscious biases. At the moment, however, police departments adopting the technology are not evaluating its potential for discriminatory impact, which might reproduce or exacerbate the very unconscious biases its proponents claim it will cure. Modeled on the environmental impact statements of the National Environmental Policy Act, discrimination impact assessments would require police departments to evaluate the potential discriminatory effects of competing alternative algorithms and to publicly consider mitigation procedures. This regulation balances the need for police expertise in the adoption of new crime control technologies with transparency and public input regarding the potential for harm. Such a public process will also serve to increase trust between police departments and the communities they serve.

