Abstract

Scientifically trained and untrained judges read descriptions of an expert's research in which the peer review status and internal validity were manipulated. Seventeen percent of the judges said they would admit the expert evidence, irrespective of its internal validity. Publication in a peer-reviewed journal also had no effect on judges' decisions. Training interacted with the internal validity manipulation: scientifically trained judges rated valid evidence more positively than did untrained judges, whereas untrained judges rated a study with a confound more positively than did trained judges. Training did not affect judges' evaluations of studies with a missing control group or potential experimenter bias. Admissibility decisions were correlated with judges' perceptions of the study's validity, of jurors' ability to evaluate scientific evidence, and of the effectiveness of cross-examination and opposing experts in highlighting flaws in scientific methodology.
