Abstract

Human forecasts and other probabilistic judgments can be improved by elicitation and aggregation methods. Recent work on elicitation shows that deriving probability estimates from relative judgments (the ratio method) is advantageous, whereas other recent work on aggregation shows that it is beneficial to transform probabilities into coherent sets (coherentization) and to weight judges' assessments by their degree of coherence. We report an experiment that links these areas by examining how coherentization and multiple forms of coherence weighting, applied to judgments obtained by direct and ratio elicitation, affect the accuracy of probability judgments (both forecasts and events with known distributions). We found that coherentization invariably improves accuracy. Moreover, judges' levels of probabilistic coherence are related to their judgment accuracy. Therefore, coherence weighting can improve judgment accuracy, although the strength of the effect varies across elicitation and weighting methods. In addition, the benefit of coherence weighting is stronger for the "calibration" items used to establish the weights than for unrelated "test" items. Finally, echoing earlier research, we found overconfidence in judgment, and the degree of overconfidence was comparable between the two elicitation methods.
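
The sketch below is not the authors' implementation; it is a minimal illustration of the two ideas the abstract names, under the simplest assumed event structure: a judge reports probabilities for an event A and its complement that need not sum to 1. Coherentization is shown as the minimal Euclidean projection onto the coherent set, and coherence weighting as one simple down-weighting of judges by their distance from that set (the experiment compares several such schemes).

```python
import numpy as np

def coherentize(p_a: float, p_not_a: float) -> tuple[float, float]:
    """Project (P(A), P(not A)) onto the coherent set {(x, y): x + y = 1, 0 <= x, y <= 1}
    by the minimal Euclidean adjustment: split the excess (or deficit) evenly."""
    excess = (p_a + p_not_a) - 1.0
    q_a = float(np.clip(p_a - excess / 2.0, 0.0, 1.0))
    return q_a, 1.0 - q_a

def incoherence(p_a: float, p_not_a: float) -> float:
    """Euclidean distance from a judge's reported pair to its coherent projection."""
    q_a, q_not_a = coherentize(p_a, p_not_a)
    return float(np.hypot(p_a - q_a, p_not_a - q_not_a))

def coherence_weighted_aggregate(judgments: list[tuple[float, float]]) -> float:
    """Aggregate P(A) across judges, weighting each judge by 1 / (1 + incoherence).
    This is one hypothetical weighting rule, chosen only for illustration."""
    weights, probs = [], []
    for p_a, p_not_a in judgments:
        q_a, _ = coherentize(p_a, p_not_a)
        weights.append(1.0 / (1.0 + incoherence(p_a, p_not_a)))
        probs.append(q_a)
    return float(np.average(probs, weights=weights))

# Example: three judges, two of whom give incoherent pairs to differing degrees.
judges = [(0.70, 0.30), (0.80, 0.35), (0.55, 0.60)]
print(coherence_weighted_aggregate(judges))
```

In this toy version the coherent judge receives full weight while the more incoherent judges are discounted, which mirrors the abstract's claim that coherence is related to accuracy and can therefore serve as a weighting signal.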


