Abstract

Experts are expected to make well‐calibrated judgments within their field, yet a voluminous literature demonstrates miscalibration in human judgment. Calibration training aimed at improving subsequent calibration performance offers a potential solution. We tested the effect of commercial calibration training on a group of 70 intelligence analysts by comparing the miscalibration and bias of their judgments before and after a commercial training course meant to improve calibration across interval estimation and binary choice tasks. Training significantly improved calibration and bias overall, but this effect was contingent on the task. For interval estimation, analysts were overconfident before training and became better calibrated after training. For the binary choice task, however, analysts were initially underconfident, and bias increased in this same direction post‐training. Improvement on the two tasks was also uncorrelated. Taken together, these results indicate that the training shifted analyst bias toward less confidence rather than improving metacognitive monitoring ability.
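To make the two bias measures concrete, here is a minimal sketch (not the paper's code; all data below are hypothetical) of how over- and underconfidence are typically quantified for the two task types the abstract describes: hit rate versus nominal coverage for interval estimates, and mean confidence minus accuracy for binary choices.

```python
# Hedged sketch: standard calibration measures for the two task types.
# All numbers are hypothetical illustrations, not the study's data.

def interval_hit_rate(intervals, truths):
    """Fraction of true values falling inside the stated intervals."""
    hits = sum(lo <= t <= hi for (lo, hi), t in zip(intervals, truths))
    return hits / len(truths)

def binary_bias(confidences, correct):
    """Mean confidence minus proportion correct.
    Positive -> overconfidence; negative -> underconfidence."""
    mean_conf = sum(confidences) / len(confidences)
    accuracy = sum(correct) / len(correct)
    return mean_conf - accuracy

# Nominal 90% intervals: a hit rate below 0.90 indicates overconfidence.
intervals = [(10, 20), (5, 8), (100, 200), (0, 1)]
truths = [25, 6, 150, 2]                      # only 2 of 4 fall inside
print(interval_hit_rate(intervals, truths))   # 0.5 < 0.90 -> overconfident

# Binary choice: confidence between 0.5 and 1.0 on two-alternative items.
confs = [0.6, 0.55, 0.7, 0.65]
correct = [1, 1, 1, 0]
print(binary_bias(confs, correct))            # -0.125 -> underconfident
```

On these measures, the pattern reported above corresponds to a pre-training hit rate below nominal coverage on intervals (overconfidence) and a negative bias score on binary choices (underconfidence) that grew more negative after training.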


