Abstract

Cognitive biases, or systematic errors in cognition, are important contributors to diagnostic error in medicine. In our review, we explore the psychological underpinnings of cognitive bias and highlight several common biases using clinical cases. We conclude by reviewing strategies to improve diagnostic accuracy and by discussing controversies and future research directions.

Research in the field of behavioural psychology and its application to medicine has been ongoing for several decades in an effort to better understand clinical decision making.1 Cognitive biases (systematic errors in cognition) are increasingly recognized in behavioural economics2 and have more recently been shown to affect medical decision making.3 Over 100 such cognitive biases have been identified, and several dozen are postulated to play a major role in diagnostic error.4 Cognitive errors can take many forms and, in one study, contributed to as many as 74% of diagnostic errors made by internists.5 Most of these errors were due to "faulty synthesis" of information, including premature diagnostic closure and failed use of heuristics.5 Inadequate medical knowledge, by contrast, was rarely the cause and was mostly identified in cases involving rare conditions.5 Professional organizations such as the Royal College of Physicians and Surgeons of Canada and the Canadian Medical Protective Association have since been working to raise awareness of cognitive bias in clinical practice.6

In our review, we explore the role of cognitive bias in diagnostic error through the use of clinical cases. We also review the literature on de-biasing strategies and comment on limitations and future directions of research.

The Dual Process Theory

A prevailing theory to explain the existence of cognitive bias is the dual process theory, which asserts that two cognitive systems are used in decision making, herein called System 1 and System 2 (Table 1).2,7

System 1 can be thought of as our intuitive mode of thinking. It generates hypotheses rapidly, operates beneath our perceptible consciousness and makes judgments that are highly dependent on contextual cues. System 1 is characterized by heuristics (shortcuts, or "rules of thumb") and is an important component of clinical judgment and expertise. In contrast, System 2 is slow, deliberate, analytical and more demanding on cognition. It applies rules that are acquired through learning, and it can play a "monitoring role" over System 1, overriding heuristics when their use is inappropriate. The dual process theory implies that errors result when inappropriate judgments generated by System 1 fail to be recognized and corrected by System 2. Maintaining constant vigilance over System 1 would be both impractical and time consuming for routine decisions and would diminish the value of intuition. It follows that a more practical way of improving reasoning is to identify the most common biases of System 1 and to recognize situations in which mistakes are most likely to occur.2
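To make the monitoring relationship concrete, the toy sketch below models a System 1 that proposes a fast pattern-matched diagnosis and a System 2 that audits the proposal against findings the pattern does not explain. All names, rules and thresholds here are hypothetical illustrations of the framework, not clinical logic.

```python
# Toy illustration of dual process theory (hypothetical rules, not clinical logic).
# System 1: fast pattern matching; System 2: slow audit that can override it.

def system1_hypothesis(findings: set) -> str:
    """Fast, heuristic guess: fire on the most salient feature."""
    if "chest pain" in findings:
        return "myocardial infarction"  # anchored to the classic pattern
    return "undifferentiated illness"

def system2_audit(hypothesis: str, findings: set, explained_by: dict) -> str:
    """Slow check: does the hypothesis account for every finding?"""
    unexplained = findings - explained_by.get(hypothesis, set())
    if unexplained:
        # Heuristic output rejected; force a broader differential.
        return f"reconsider: {hypothesis!r} leaves {sorted(unexplained)} unexplained"
    return hypothesis

# Hypothetical knowledge base: which findings each diagnosis explains.
EXPLAINED_BY = {"myocardial infarction": {"chest pain", "elevated troponin"}}

findings = {"chest pain", "elevated troponin", "epigastric tenderness", "leukocytosis"}
guess = system1_hypothesis(findings)
print(system2_audit(guess, findings, EXPLAINED_BY))
```

The point of the sketch is structural rather than clinical: error arises not because intuition is used, but because the audit step is skipped.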
Alternative Theories of Cognition

Variations of dual process theory have further refined our understanding of medical decision making. Fuzzy trace theory, for example, proposes that individuals process information through parallel gist and verbatim representations.8 The "gist" is analogous to System 1 and represents the bottom-line "meaning" of information. This representation is shaped by an individual's worldview, emotions and experiences. In contrast, verbatim representations are precise, literal and analogous to System 2. Fuzzy trace theory is particularly useful in explaining how patients might interpret health information. Proponents of this theory contend that for information to lead to meaningful behavioural change, physicians must appeal to both gist and verbatim representations when communicating with patients.8 Other models, such as dynamic graded continuum theory, do away with the dichotomy of System 1 and System 2 and instead represent implicit, automatic and explicit cognitive processes on a continuous scale.9 These single-system models are useful points of comparison, but they have not replaced dual process theory as a well-established framework for understanding and mitigating cognitive bias in clinical decision making.7

Case 1: A 55-Year-Old Male with Retrosternal Chest Pain

A 55-year-old non-smoking male was assessed in a busy Emergency Department (ED) for retrosternal chest pain. His past medical history is significant for osteoarthritis, for which he takes naproxen. On review of his history, the patient has had multiple visits for retrosternal chest pain in the previous two months. At each encounter, he was discharged home after a negative cardiac workup.

Vital signs in the ED were within normal limits except for sinus tachycardia at 112 beats per minute. On exam, the patient was visibly distressed. Cardiac and respiratory exams were normal. There was mild tenderness in the epigastrium. Basic blood work revealed leukocytosis (16.0 × 10⁹/L), a mildly elevated high-sensitivity cardiac troponin, and no other abnormalities. An ECG revealed T wave flattening in leads V3-V4.

The patient was referred to the internal medicine service with a diagnosis of non-ST-elevation myocardial infarction and treated with aspirin, clopidogrel, and fondaparinux. Several hours later, the patient became more agitated and complained of worsening retrosternal and epigastric pain. On re-examination, his heart rate had increased to 139 beats per minute, his blood pressure had dropped to 77/60 and he had a rigid abdomen. Abdominal radiography revealed free air under the right hemi-diaphragm, and the patient was rushed to the operating room, where a perforated gastric ulcer was detected and repaired.

The case above illustrates numerous cognitive biases, including:

1. Premature diagnostic closure: the tendency to accept a diagnosis before it is fully verified.4
2. Anchoring: the tendency to over-emphasize features in the patient's initial presentation and to fail to adjust the clinical impression after learning new information.4
3. Confirmation bias: the tendency to look for confirming evidence to support a diagnosis, rather than to look for (or explain) evidence that puts the diagnosis in question.4

In this case, the physician based the diagnosis of myocardial infarction primarily on symptoms of chest pain and an elevated cardiac troponin. However, several other objective findings were present that, taken together, suggested a diagnosis other than myocardial infarction. These included a tender epigastrium, leukocytosis, and resting sinus tachycardia. These symptoms and signs were not explicitly explained or investigated before a treatment decision was made. Premature diagnostic closure is one of the most common cognitive biases underlying medical error5 and affects clinicians at all levels of training.10 It is multifactorial in origin5 and is especially common in the presence of other cognitive biases such as anchoring and confirmation bias.

The physician in this case "anchored" to a diagnosis of cardiac chest pain given the patient's previous ED visit history and his/her best intentions of ruling out a "worst case scenario." Anchoring can be especially powerful when abnormal screening investigations have been reviewed even before the physician has acquired a history or performed a physical examination. If the physician had reviewed the screening investigations before seeing the patient, he/she might have narrowed the differential diagnosis prematurely, failed to gather all the relevant information and failed to adjust the clinical impression based on new information. In quantitative terms, anchoring amounts to under-updating; the worked example below shows how much a single contradictory finding should move the probability of the anchored diagnosis.
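As a hedged illustration (the numbers are hypothetical, chosen only to show the mechanics of Bayesian updating): suppose the probability of myocardial infarction after the troponin result is 60%, and suppose marked epigastric tenderness with leukocytosis is five times more likely under an abdominal catastrophe than under myocardial infarction, giving a likelihood ratio (LR) of 0.2 for MI. Then:

```latex
% Hypothetical numbers for illustration only.
\[
\text{posterior odds} \;=\; \text{prior odds} \times \mathrm{LR}
\]
\[
\frac{0.60}{0.40} \times 0.2 \;=\; 0.3
\qquad\Longrightarrow\qquad
P(\text{MI} \mid \text{finding}) \;=\; \frac{0.3}{1.3} \;\approx\; 0.23
\]
```

An anchored clinician who leaves the estimate near 60% is, in effect, assigning the contradictory finding a likelihood ratio of 1, that is, treating it as carrying no information at all.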
The physician demonstrated confirmation bias by failing to explain the abnormalities that put the diagnosis of myocardial infarction in question (e.g., tender epigastrium, leukocytosis). Confirmation bias arises from an attempt to avoid cognitive dissonance, the distressing psychological conflict that occurs when inconsistent beliefs or theories are held simultaneously.11 In one study evaluating clinical decision making amongst 75 psychiatrists and 75 medical students,12 13% of psychiatrists and 25% of medical students demonstrated confirmation bias when searching for information after having made a preliminary diagnosis. In this study, confirmation bias resulted in more frequent diagnostic errors and predictably affected subsequent treatment decisions.

An appropriate consideration of all diagnostic possibilities is the first step in avoiding diagnostic error. While acquiring information, physicians should step back and reconcile new data with the working diagnosis, as failure to do so can result in confirmation bias.13 All abnormal findings and tests, especially those considered clinically relevant, should be explained by the most probable diagnosis. An alternate diagnosis, or the possibility of more than one diagnosis, should be considered when an abnormal finding or test cannot reasonably be explained by the working diagnosis.

Tschen et al observed a team of physicians working through a simulated scenario with diagnostic ambiguity.14 Two approaches were found to be effective in reducing the effect of confirmation bias: explicit reasoning and talking to the room. Explicit reasoning involves making causal inferences when interpreting and communicating information. Talking to the room is a process whereby diagnostic reasoning is explained in an unstructured way to a team member or colleague in the room. This gives the clinician the opportunity to elaborate on their thoughts, and gives observers the opportunity to point out errors or suggest alternate diagnoses within a shared mental model.

Case 2: A 30-Year-Old Male with Confusion and Seizures

A 30-year-old homeless male is found confused on the street by paramedics and brought to the ED for assessment. Empty bottles of alcohol were noted at the scene. The CIWA (Clinical Institute Withdrawal Assessment for Alcohol) protocol is initiated and he is given several doses of lorazepam to minimal effect. Several hours after the patient is admitted, a resident on call is paged for elevated CIWA scores on the basis of diaphoresis and agitation. Several additional doses of lorazepam are ordered, which fail to completely resolve the symptoms. Gradually, the patient becomes more obtunded. The on-call resident orders a capillary blood glucose, which measures 1.1 mmol/L. Intravenous D50W is promptly administered, the blood glucose normalizes and the patient's level of consciousness improves.

The case above illustrates the following biases:

1. Availability bias: the tendency to weigh a diagnosis as being more likely if it comes to mind more readily.4
2. Diagnostic momentum: the tendency for labels to "stick" to patients and become more definite with time.4

Although the symptoms of diaphoresis and agitation are not specific to alcohol withdrawal, this diagnosis was deemed most likely based on how readily it came to mind, the empty alcohol bottles at the scene, and potentially the patient's demographics. The unproven diagnosis of alcohol withdrawal "stuck" with the patient despite minimal improvement after a therapeutic trial of benzodiazepines. Protocolized orders can reinforce this momentum, as the sketch below illustrates.
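The following toy sketch is not a clinical algorithm; the score threshold and the trigger for broadening the differential are hypothetical. It illustrates how a score-driven loop can entrench a label, and how a simple response check can break the momentum.

```python
# Toy illustration of diagnostic momentum in a protocolized loop.
# Threshold and rules are hypothetical, for illustration only.

TREATMENT_THRESHOLD = 10  # hypothetical CIWA-style score cutoff

def next_action(score: int, doses_given: int, improving: bool) -> str:
    if score < TREATMENT_THRESHOLD:
        return "observe"
    if doses_given >= 3 and not improving:
        # A failed therapeutic trial is evidence AGAINST the working
        # diagnosis; the loop should prompt reassessment (e.g., the
        # bedside glucose that ultimately made the diagnosis in Case 2).
        return "stop and broaden the differential"
    return "treat per protocol and reassess"

# Without the response check, the loop keeps treating indefinitely.
for doses, improving in [(1, False), (2, False), (3, False)]:
    print(doses, next_action(score=14, doses_given=doses, improving=improving))
```

The design point mirrors the case: each dose of lorazepam was a test of the alcohol withdrawal hypothesis, and a run of negative tests should have lowered, not preserved, confidence in the label.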
Availability bias has been shown to affect internal medicine residents. In one single-centre study,15 18 first-year and 18 second-year residents were exposed to case descriptions with associated diagnoses as part of an exercise. They were then asked to diagnose a series of new cases, some of which appeared similar to those they had previously encountered but contained pertinent differences that made an alternate diagnosis more likely. Second-year residents had lower diagnostic accuracy on these similar-appearing cases, a result consistent with availability bias. First-year residents were less prone to this bias because of their limited clinical experience. Most importantly, subsequent reflective diagnostic reasoning countered the bias and improved accuracy.

General Strategies to Avoid Cognitive Bias

Interventions aimed at mitigating diagnostic error due to cognitive bias take several approaches:

1. Improving clinical reasoning
2. Reducing cognitive burden
3. Improving knowledge and experience

Despite the large number of proposed interventions, there is a lack of empirical evidence supporting the efficacy of many de-biasing strategies.16 What follows is a brief review of the current evidence.

Improving Clinical Reasoning

Several "de-biasing" strategies have been proposed to improve clinical reasoning. De-biasing strategies assume that System 1 processes are more prone to bias because of their heavy reliance on heuristics, and that the solution is therefore to activate System 2 at critical points in decision making. De-biasing occurs in several stages: first, an individual is educated about the presence of a cognitive bias; they then employ strategies to eliminate that bias; finally, they maintain those strategies in the long term.17

Metacognition, or "thinking about thinking," involves reflecting on one's own diagnostic reasoning. Internal reflection, along with awareness of potential biases, should allow the clinician to identify faulty reasoning.
However, the evidence underlying reflective practice is mixed.16 Several studies have tried to encourage reflective practice and System 2 processes by instructing participants to proceed slowly through their reasoning18 or by giving participants the opportunity to review their diagnoses.19 These studies found minimal or no reduction in the rate of diagnostic error. On the other hand, some studies have shown improved diagnostic accuracy when physicians are asked to explicitly state their differential diagnoses along with the features that are consistent or inconsistent with each diagnosis.20 These results suggest that if reflective practice is to be effective, it must involve a thorough review of the differential diagnosis rather than simply taking additional time.

Reducing Cognitive Burden

Tools that reduce the cognitive burden placed on physicians may reduce the frequency of diagnostic errors. One suggestion has been to incorporate checklists into the diagnostic process. These checklists would be matched to common presenting symptoms and would include a list of possible diagnoses. One randomized controlled trial failed to show a statistically significant reduction in the diagnostic error rate with the use of checklists, except in a small subgroup of patients treated in the ED.21 These findings challenge the results of two other studies that found checklists effective in improving scrutiny22 and diagnostic accuracy23 when interpreting electrocardiograms. More advanced forms of clinical decision support systems have also been studied.24 Software programs such as DXplain generate a list of potential diagnoses based on a patient's chief complaint. In one study, when the software provided physicians with a list of possible diagnoses before they evaluated patients, diagnoses were 1.31 times more likely to be correct.25 The use of diagnostic support tools may grow in the future as they are integrated into electronic medical record systems. A minimal sketch of the checklist idea follows.
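As a rough illustration of the data structure such a checklist implies (the symptom-to-differential mapping below is abbreviated and hypothetical, not drawn from DXplain or any validated tool):

```python
# Minimal sketch of a symptom-matched diagnostic checklist.
# The mapping is hypothetical and abbreviated; a real tool would be
# curated, validated and far more extensive.

CHECKLIST = {
    "retrosternal chest pain": [
        "acute coronary syndrome",
        "pulmonary embolism",
        "aortic dissection",
        "perforated peptic ulcer",   # the miss in Case 1
        "esophageal rupture",
    ],
    "confusion": [
        "hypoglycemia",              # the miss in Case 2
        "alcohol withdrawal",
        "sepsis",
        "intracranial hemorrhage",
    ],
}

def differential(symptom: str) -> list:
    """Return the checklist differential for a presenting symptom."""
    return CHECKLIST.get(symptom.lower(), ["symptom not on checklist"])

for dx in differential("confusion"):
    print("-", dx)
```

The value of such a list is not that it diagnoses anything, but that it prompts the clinician to explicitly rule each item in or out before closing, which targets premature diagnostic closure directly.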
Improving Knowledge and Experience

A combination of experience, knowledge and feedback is integral to developing the clinical intuition that produces the best hypotheses. Experience without feedback can lead to overconfidence, which is itself a cognitive bias. The evidence supporting feedback is strong. Fridriksson et al showed a significant reduction in diagnostic error when referring doctors were provided feedback on the identification of subarachnoid hemorrhage.26 A systematic review of 118 randomized trials concluded that feedback was effective in improving professional practice.27 The specific characteristics of the most effective feedback remain unclear; in general, however, feedback was thought to be most effective when it was explicit and delivered close to the time of decision making.

Conclusions

In our review, we explore clinical decision making through the lens of dual process theory. However, multiple dual-processing models are still being explored, and fundamental questions remain under debate. For example, some experts believe that instead of focusing on de-biasing strategies, the key to improving intuitive (System 1) processes is simply to acquire more formal and experiential knowledge.19 Other unanswered questions include the impact and magnitude of cognitive bias in actual clinical practice, which biases are most prevalent in each medical specialty, and which strategies are most effective in mitigating bias. Further study is also needed to assess the impact of novel educational methods, such as case-based and simulation-based learning, which are promising settings in which trainees may identify and correct cognitive biases under direct observation.
