When a crime is captured on video, law enforcement agencies have increasingly used facial recognition technology (FRT) to generate suspects to investigate. However, a growing number of people have been wrongfully arrested because of inaccurate results returned by these artificial intelligence-assisted searches of facial databases, despite the very low error rates reported for these systems. We discuss the reliability of the evidence provided by a match returned by FRT, propose a framework for identifying potential problems with the use of FRT in criminal investigations, and review research on the trauma that generally accompanies justice system involvement, trauma that is compounded by wrongful arrest and conviction. We also provide an analysis of how database size affects the evidentiary value of the matches returned by FRT. Variables such as facial database size, race of the culprit, and quality of the probe photo can increase the likelihood that FRT systems will return false positive matches. The use of FRT for developing suspects in criminal investigations is likely to exacerbate the already profound racial disparities in the outcomes produced by the criminal legal system and to increase the trauma experienced by those who are wrongfully arrested or convicted. We recommend extreme caution surrounding its use. In addition, we call for more research on the trauma associated with wrongful arrest, which is likely to occur with the current use of FRT.
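The base-rate logic behind the database-size effect can be sketched as follows; this is a minimal illustration assuming independent comparisons, and the particular false-match rate and database size used below are hypothetical, not figures from the article. If each one-to-one comparison has a false-match rate $p$, then a one-to-many search against a database of $N$ faces returns at least one false positive with probability

$$P(\text{false positive}) \;=\; 1 - (1 - p)^{N} \;\approx\; Np \quad \text{for } Np \ll 1.$$

Even a seemingly negligible per-comparison rate of $p = 10^{-5}$ yields an expected $Np = 100$ false matches when searching $N = 10^{7}$ enrolled faces, so a system that is highly accurate per comparison can still make it quite likely that the top-ranked candidate from a large-database search is innocent.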