Abstract

Belief and Delusion as Palliative Responses to Uncertainty

Philip R. Corlett

In December 1954, the Chicago Tribune reported that Dr. Charles Laughead of Michigan foresaw the end of the world via tidal wave and volcano. He was speaking on behalf of Dorothy Martin, who was supposedly relaying a prophecy from extraterrestrials. The prophecy did not manifest. Martin was placed in psychiatric care to avoid legal charges for creating disturbances and scaring children with her prophecies. However, on leaving that care, she traveled to the Peruvian Andes and to Mount Shasta in California, and ultimately settled in Sedona, Arizona, where she lived until she was 92, continuing to proselytize about aliens and their ministrations on earth while essentially evading interaction with psychiatric services.

Did Laughead, Martin, and their followers have delusions? Their beliefs were certainly bizarre and firm. At times, being a follower and sharing those beliefs was distressing (although that distress usually arose when the beliefs were challenged, rather than when adherents considered the consequences of the beliefs—that the world would end). The beliefs were definitely outside the doxastic norms of the culture. However, something seems different about these followers compared to the clinical cases with which we are more familiar.

One way to explore the overlap and distinctions between belief and delusion is to consider their function. I believe that healthy and unhealthy beliefs are responses to uncertainty or ambiguity. By explaining away the inexplicable, they permit continued engagement with the world. I define uncertainty and ambiguity in terms of decision‐making and probability distributions. When we are uncertain about a particular event, we can assign it some subjective probability based on what we know and believe. That probability (based on knowledge and beliefs) may differ considerably between individuals.
Ambiguous situations are so uncertain that we do not have enough information to be sure that our particular belief—our specific prediction—is the correct one. In perception, an uncertain situation would involve listening to a friend speak at a noisy party (you can resolve the uncertainty by making predictions based on what you know about your friend). Listening to someone you have just met at the same noisy party, someone about whom you have no prior beliefs, would engender ambiguity. Both are at best frustrating and at worst distressing. We respond to both uncertainty and ambiguity by relying on prior beliefs. And we can respond strongly, and sometimes counterintuitively, when those priors themselves are challenged.

Unbeknownst to Martin, some of her followers were imposters: social psychologists, led by Leon Festinger. The academics infiltrated the group as the end‐times loomed. The result was a book, “When Prophecy Fails: A Social Psychological Study of a Modern Group That Predicted the Destruction of the World” (Festinger et al.). They developed the theory of cognitive dissonance, the internal discord felt from holding conflicting beliefs simultaneously (Festinger)—in this case, between the prophecy and real-world events. People in the cult responded in a variety of ways to reduce their dissonance. Many relinquished their beliefs. In some cases, however, a dissonant experience actually increased conviction. For example, failed predictions were recontextualized as actually having come to fruition (“the aliens did come for us, but they were scared off by the crowds of press”). These deft sleights of mind (McKay et al.) will be familiar to those who have spoken to patients with delusions (Garety).

One major challenge for humans is to form and maintain a set of beliefs about the world that are sufficiently accurate and strong to guide decision‐making, yet flexible enough to withstand changes in the underlying contingencies.
One way this might be achieved involves Bayesian learning: we sustain a set of prior beliefs based on past experience, and we combine them with new data. If those new data are highly precise and compelling, they drive updating of the prior. If they are not, those data can be discarded. However, sometimes people do not update their beliefs in this manner. For example, when confronted with evidence that challenges a deeply cherished belief, such evidence may backfire and strengthen people's belief. Is such behavior contrary to the Bayesian model? Furthermore, in the face of uncertainty and ambiguity, people...
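The precision-weighted updating described above can be illustrated with a minimal sketch. This is not the author's model, just a standard Gaussian (conjugate-normal) update; the function name and the numerical values are illustrative assumptions. Precise new data shift the belief substantially, while imprecise data barely move it.

```python
# Minimal sketch of precision-weighted Bayesian belief updating.
# Assumes Gaussian prior and Gaussian-likelihood data; all names and
# values are illustrative, not taken from the paper.

def update_belief(prior_mean, prior_precision, data_mean, data_precision):
    """Combine a Gaussian prior with new Gaussian data.

    Precision is the inverse variance: high-precision data are
    'compelling' and pull the posterior toward them; low-precision
    data are effectively discounted.
    """
    posterior_precision = prior_precision + data_precision
    posterior_mean = (
        prior_precision * prior_mean + data_precision * data_mean
    ) / posterior_precision
    return posterior_mean, posterior_precision

# Strong prior, imprecise (noisy) data: the belief barely updates.
m_noisy, _ = update_belief(prior_mean=0.0, prior_precision=10.0,
                           data_mean=5.0, data_precision=0.1)

# Strong prior, highly precise data: the belief shifts substantially.
m_precise, _ = update_belief(prior_mean=0.0, prior_precision=10.0,
                             data_mean=5.0, data_precision=10.0)
```

Under this scheme the posterior mean is always a compromise between prior and data, weighted by their precisions, which is why genuinely Bayesian belief revision cannot move a belief *away* from the evidence; the "backfire" pattern discussed next is the puzzle.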
