Abstract

Judges, attorneys, and academics commonly use case law surveys to ascertain the law and to predict or make decisions. In some contexts, however, certain legal outcomes may be more likely to be published (and thus observed) than others, potentially distorting the impressions such surveys create. In this paper, I propose a method for detecting and correcting legal publication bias based on ideas from multiple systems estimation (MSE), a technique traditionally used to estimate the size of hidden populations. I apply the method first to a simulated dataset of admissibility decisions to confirm its efficacy, then to a newly collected dataset on false confession experts, where the model estimates that the true admissibility rate may be closer to 28% than the observed 16%. The article thus draws attention to the potential for legal publication bias and offers a practical statistical tool for detecting and correcting it.
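To make the mechanics concrete, the following is a minimal sketch of the two-list capture-recapture (Lincoln-Petersen) estimator on which MSE builds, applied separately to each outcome category. The database framing, counts, and function below are hypothetical illustrations of the general technique, not the paper's actual model or data; the numbers are made up only to mimic the direction of the effect described in the abstract.

```python
# Two-list multiple systems estimation (MSE) sketch: if decisions can be
# "captured" by two independent sources (e.g., two legal databases), the
# overlap between the sources reveals how many decisions both missed.
# All counts below are hypothetical.

def lincoln_petersen(n1: int, n2: int, overlap: int) -> float:
    """Estimate a hidden population size from two overlapping capture lists.

    n1, n2  -- decisions found in each source
    overlap -- decisions appearing in both sources
    """
    if overlap == 0:
        raise ValueError("Lists must overlap for the estimator to be defined.")
    return n1 * n2 / overlap

# Hypothetical counts, tallied by outcome: decisions admitting the expert are
# assumed to be published (captured) less often than decisions excluding one.
admitted_total = lincoln_petersen(n1=20, n2=18, overlap=9)    # ~40 estimated
excluded_total = lincoln_petersen(n1=90, n2=85, overlap=72)   # ~106 estimated

observed_rate = 20 / (20 + 90)                         # rate among captured cases
corrected_rate = admitted_total / (admitted_total + excluded_total)
print(f"observed admissibility rate:  {observed_rate:.0%}")   # ~18%
print(f"corrected admissibility rate: {corrected_rate:.0%}")  # ~27%
```

Stratifying the estimator by outcome is what allows a differential capture probability between admissions and exclusions to surface: if one outcome's lists overlap heavily while the other's barely do, the lightly overlapping category is the one the published record is undercounting.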
