Abstract

In two studies, we investigated the effectiveness of an automated form of calibration training via individualized feedback as a means to improve calibration in forecasts. In Experiment 1, this training procedure was tested in a realistic forecasting situation, namely, predicting the outcome of baseball games. Experiment 2 was similar but used a more controlled forecasting task, predicting whether competitors would bust in a modified version of blackjack. In comparison to a control group without training, participants provided with calibration training had reduced confidence levels, which translated into reduced overconfidence and better overall calibration in Experiment 2. The results across both studies suggest that an automated form of individualized performance feedback can reduce the confidence of initially overconfident forecasters.
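The abstract does not spell out how calibration and overconfidence were scored. A common operationalization in this literature (a minimal sketch under assumed standard definitions, not necessarily the authors' exact measures) treats overconfidence as mean stated confidence minus overall hit rate, and a calibration index as the hit-count-weighted squared gap between each confidence category and its observed accuracy:

```python
import numpy as np

def calibration_scores(confidence, correct, bins=np.arange(0.5, 1.01, 0.1)):
    """Standard calibration metrics for two-choice probabilistic forecasts.

    confidence: stated probabilities in [0.5, 1.0]
    correct:    0/1 outcomes (1 = forecast was right)
    Returns (over_under, calibration_index); both definitions are the
    conventional ones, assumed here rather than taken from the paper.
    """
    confidence = np.asarray(confidence, dtype=float)
    correct = np.asarray(correct, dtype=float)

    # Over/underconfidence: mean confidence minus hit rate
    # (positive values indicate overconfidence).
    over_under = confidence.mean() - correct.mean()

    # Calibration index: weighted mean squared deviation between each
    # confidence bin's mean confidence and its observed hit rate
    # (0 = perfectly calibrated).
    idx = np.digitize(confidence, bins) - 1  # assign forecasts to bins
    total = 0.0
    for b in np.unique(idx):
        mask = idx == b
        total += mask.sum() * (confidence[mask].mean() - correct[mask].mean()) ** 2
    calibration_index = total / len(confidence)

    return over_under, calibration_index

# Hypothetical demo: a forecaster whose accuracy lags stated confidence.
rng = np.random.default_rng(0)
conf = rng.uniform(0.6, 1.0, 200)
hit = rng.random(200) < (conf - 0.1)
print(calibration_scores(conf, hit))  # positive first value -> overconfident
```

Under these definitions, the reported effect of training corresponds to the first value shrinking toward zero and the calibration index decreasing.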
