Abstract
In this article, we address the apparent discrepancy between causal Bayes net theories of cognition, which posit that judgments of uncertainty are generated from causal beliefs in a way that respects the norms of probability, and evidence that probability judgments based on causal beliefs are systematically in error. One purported source of bias is the ease of reasoning forward from cause to effect (predictive reasoning) versus backward from effect to cause (diagnostic reasoning). Using causal Bayes nets, we developed a normative formulation of how predictive and diagnostic probability judgments should vary with the strength of alternative causes, causal power, and prior probability. This model was tested through two experiments that elicited predictive and diagnostic judgments as well as judgments of the causal parameters for a variety of scenarios that were designed to differ in strength of alternatives. Model predictions fit the diagnostic judgments closely, but predictive judgments displayed systematic neglect of alternative causes, yielding a relatively poor fit. Three additional experiments provided more evidence of the neglect of alternative causes in predictive reasoning and ruled out pragmatic explanations. We conclude that people use causal structure to generate probability judgments in a sophisticated but not entirely veridical way.
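For concreteness, the following is a minimal sketch of how predictive and diagnostic probabilities can be derived from such a causal Bayes net under a standard noisy-OR (causal power) parameterization with one focal cause and alternatives aggregated into a single effective strength. The function names, the parameter w_a_eff, and the example values are illustrative assumptions, not the authors' exact model or stimuli.

def predictive(w_c, w_a_eff):
    """P(effect | cause present): noisy-OR combination of the focal cause's
    causal power w_c and the effective strength of alternative causes
    w_a_eff (base rate of alternatives times their power)."""
    return w_c + w_a_eff - w_c * w_a_eff

def diagnostic(p_c, w_c, w_a_eff):
    """P(cause | effect present): Bayes' rule applied to the same
    parameters, with prior probability of the cause p_c."""
    p_e_given_c = w_c + w_a_eff - w_c * w_a_eff      # effect prob. if cause present
    p_e = p_c * p_e_given_c + (1 - p_c) * w_a_eff    # marginal prob. of effect
    return p_c * p_e_given_c / p_e

# Illustrative values: probable cause (.7), strong power (.8), moderate alternatives (.3)
print(predictive(0.8, 0.3))        # 0.86
print(diagnostic(0.7, 0.8, 0.3))   # ~0.87

On this formulation, stronger alternatives raise the predictive probability (another route to the effect) while lowering the diagnostic probability (the effect is less informative about the focal cause), which is the normative pattern the abstract says predictive judgments failed to track.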