Abstract

We describe an exploratory study examining the effectiveness of an interactive app and a novel training process for improving calibration and reducing overconfidence in probabilistic judgments. We evaluated the training used in the app by conducting an American college football forecasting tournament in which 153 business school students made 52 forecasts over 11 weeks. A coarsened exact matching analysis found statistical evidence that, in under 30 minutes, the more challenging training modestly reduced overconfidence, improved calibration, and improved the accuracy of probabilistic judgments (measured by the Brier score). The experimental results also suggest that the generic training can generalize across domains and that effective calibration training is possible without expert facilitators or pedagogical training materials. Because no previous studies have reported similar results, and given the modest effect size, we conclude that these results should be interpreted only as a proof of concept and that further evaluation and validation of the mechanisms of the app's effect are necessary.
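For reference, the Brier score mentioned above measures the accuracy of probabilistic forecasts as the mean squared difference between the predicted probabilities and the binary outcomes (lower is better, 0 is perfect). A minimal illustrative sketch, not code or data from the study:

```python
def brier_score(probs, outcomes):
    """Mean squared error between forecast probabilities (0..1)
    and binary outcomes (0 or 1). Lower is better; 0 is perfect."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

# Hypothetical forecasts of "team A wins" against the actual results.
score = brier_score([0.8, 0.5], [1, 0])
print(score)
```

A forecaster who always predicted 0.5 would score 0.25, so values well below 0.25 indicate informative forecasts.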

