CE/CME No: CR-1603
PROGRAM OVERVIEW
Earn credit by reading this article and successfully completing the posttest and evaluation. Successful completion is defined as a cumulative score of at least 70% correct.
EDUCATIONAL OBJECTIVES
• List the characteristics of System 1 and System 2 thinking.
• Explain how System 1 and System 2 thinking affects clinical decisions.
• Define the characteristics of no-fault, system, and cognitive errors and how they affect health care delivery.
• Describe how biases and cognitive dispositions to respond cause health care providers to make clinical decision errors.
• List some effective debiasing techniques to improve clinical decisions and patient safety.
FACULTY
David J. Klocko is an Associate Professor and Academic Coordinator in the Department of Physician Assistant Studies at the University of Texas Southwestern Medical Center, School of Health Professions, Dallas.
The author has no significant financial relationships to disclose.
ACCREDITATION STATEMENT
This program has been reviewed and is approved for a maximum of 1.0 hour of American Academy of Physician Assistants (AAPA) Category 1 CME credit by the Physician Assistant Review Panel. [NPs: Both ANCC and the AANP Certification Program recognize AAPA as an approved provider of Category 1 credit.] Approval is valid for one year from the issue date of March 2016.
Diagnostic errors occur for many reasons, some of which are based in cognitive biases. Also called cognitive dispositions to respond (CDR), these can result from failures in perception, faulty mental shortcuts, or unconscious biases, and clinicians are usually unaware they exist. This article discusses the influence CDRs have on clinical decisions and walks you through methods for purposeful debiasing.
Diagnosis is the foundation of medicine ... [and] diagnostic reasoning is a critical aspect of clinical performance.1
— Pat Croskerry, MD, PhD
Diagnostic errors compromise patient safety and the quality of health care and account for the majority of paid malpractice claims. They are especially common in family medicine, internal medicine, emergency medicine, and urgent care, where the error rate can be as high as 15%.2 However, all health care providers are subject to errors in clinical judgment, regardless of the setting or specialty in which they practice.3
Clinical disciplines such as internal medicine and emergency medicine have higher error rates than the perceptual disciplines, radiology and pathology. Higher diagnostic error rates in the clinical disciplines are due to the elevated case complexity and the need for rapid interpretation of diagnostic studies. In the perceptual disciplines such as pathology and radiology, fewer time pressures and the ability to obtain a second opinion before making a diagnosis decrease error rates.3 In a National Practitioner Data Bank analysis, more diagnostic error claims occurred in the outpatient setting than in the inpatient setting.4
Quality assurance and performance improvement have become paramount for all health care providers. The modern patient safety movement began in 1999 with the Institute of Medicine (IOM) report To Err Is Human, which highlighted how a faulty health care system causes people to make mistakes and negatively impacts patient safety.5 Some examples of errors arising from imperfections in the health system include medication errors, patient falls, wrong-site surgeries, and improper patient identification. Despite an increased emphasis on patient safety and quality improvement, diagnostic error had not been a focus of attention for policy makers and institutions. Only since the IOM report was released have the medical profession and health policy makers begun to pay attention to diagnostic errors as a serious patient safety issue.5
Cognitive biases, or cognitive dispositions to respond (CDR), can influence clinical decision-making and lead to diagnostic errors. By understanding the thinking processes involved in diagnostic reasoning and the interaction between these processes and cognitive biases, clinicians can take steps to counteract the influence of cognitive biases on their clinical decisions. Here, a brief introduction to dual processing theory is provided, along with information to help clinicians identify potential cognitive biases. Workplace and educational debiasing techniques to counter biases that lead to cognitive decision errors are presented as well.
DIAGNOSTIC ERRORS
All advanced practice providers are at risk for making a clinical decision error. The diagnostic errors that are made in clinical practice can be classified into three broad etiologic categories6:
No-fault errors occur when a rare disease is misdiagnosed as something more common or a disease is silent or presents in an atypical manner. An example of an error that falls into this category is a delayed diagnosis of ischemic bowel in a diabetic patient with no abdominal pain. Another example is a patient with a language barrier who is not able to describe his or her symptoms clearly, leading the clinician to misinterpret the history. Patient nonadherence to recommended care can also be viewed as no-fault, as in the case of a patient diagnosed with colon cancer who did not obtain a recommended screening colonoscopy.6 In one study, no-fault errors accounted for 7% of diagnostic errors.7
System errors occur as a result of “latent” faults in the process of delivering care and can be technical or organizational in nature.6 Examples of diagnostic errors related to technical issues are misdiagnosis or delayed diagnosis resulting from lack of appropriate testing or equipment or from incorrect laboratory results caused by technical problems with equipment. Organizational shortcomings that contribute to diagnostic errors include imperfections in department policies, error tolerance culture, poor patient care coordination, communication problems, inadequate staff training, poor working conditions, unavailability of acute specialty care, and failing to follow up with patients having abnormal diagnostic study results.6 Excessive workload and heavy administrative responsibilities also can contribute to clinician decision errors.
An example of a specific clinical organizational system error would be a missed or delayed diagnosis of a cancer on a chest x-ray due to lack of an “over-read” by a radiologist. Due to cost, many private practices do not send all radiographs for a radiologist’s interpretation. Another example is a patient with a severe eye injury who develops complications after being transferred to another hospital because there is not an on-call ophthalmologist at the presenting hospital.6 Delays in reviewing patient laboratory results are a significant system-based source of medical errors. In one study, 83% of the physician respondents reported at least one delay in reviewing test results in the past two months, with 18% reporting five or more delays in reviewing test results over the same time period.8
Cognitive errors are caused by gaps in knowledge or experience, inadequate interpretation of diagnostic studies, or succumbing to faulty heuristics and biases.6 With cognitive errors, incorrect perception or interpretation of a clinical situation results in faulty differential diagnosis development. Confirmation bias is one type of cognitive error—once supporting information is found for a diagnosis, the search for information to rule out the diagnosis stops.6
An example of this would be a patient with an ankle fracture who is discharged with a missed proximal fibula fracture after the clinician performs a physical exam only on the ankle and orders an ankle x-ray. A cognitive error like this would occur due to inadvertent omission of an important physical exam component or the clinician not knowing the importance of examining the knee when evaluating an ankle fracture.
It is important to note that clinical decision errors are usually multifactorial. In a study involving 100 cases of diagnostic error in internal medicine, Graber and colleagues determined that in 46% of the cases errors were caused by a combination of system-related and cognitive factors.7
Decision Making: Dual Process Theory
Over the past two decades, dual process theory (DPT) has been recognized as a reliable model of the decision-making process in the psychology literature.9 DPT proposes two distinct processes of thinking during decision making, referred to as System 1 and System 2, or Type 1 and Type 2, processes. A brief introduction to DPT is given here for practicing clinicians, but a detailed discussion of the literature pertaining to this concept is beyond the scope of this review.
System 1 processes are “intuitive,” utilize pattern recognition and heuristics, and rely heavily on the context or conditions in which the decision is made. The intuitive System 1 mode of thinking uses a pattern recognition or “gut reaction” approach.10 It is fast and reflexive but can be subject to deficits in predictive power and reliability.10 Experienced clinicians use pattern recognition in conditions presenting with classic signs and symptoms.10 For example, the clinician who evaluates a 12-year-old child with an annular, erythematous patch with central clearing on the forearm and immediately diagnoses ringworm is thinking in the intuitive mode. Generally, human beings are most comfortable in this decision mode because it involves intuition and requires less mental effort and concentration. For clinicians, System 1 thinking is the default defense mechanism against “decision fatigue” and “cognitive overload” during a busy shift, and it is the thinking mode used when clinicians are stressed, hurried, tired, and working with limited resources.9,10 Croskerry maintains, however, that such clinical situations, and the reliance on System 1 thinking that they entail, can make clinicians more vulnerable to certain biases.9
System 2 thinking is analytic, deductive, slow, and deliberate. This mode of thinking has high predictive power with high reliability, and it is less influenced by the context or conditions in which the decision is being made.10 Clinicians use this mode of thinking when patients present with vague signs and symptoms and a diagnosis is not instantly recognized.10 System 2 decision making would be required, for example, when evaluating a 55-year-old woman with chest pain. The clinical condition requires the clinician to acquire more data and make a conscious effort to analyze results, and arriving at a clinical decision in this situation takes more time. Shortcuts due to time pressures can have devastating outcomes in this setting. It should be mentioned, however, that psychology research has shown that the System 2 analytic approach is mentally taxing and may also result in poor decisions (“thinking too much”).11
Intuitive and analytic thinking are not independent of each other. During a clinical encounter, there is unconscious switching back and forth between the two modes as the clinician evaluates the information at hand in order to produce a decision.12 A patient presenting with a chief complaint may trigger a System 1 decision, but due to uncertainty there may be a “System 2 override,” where the clinician consciously forces herself to reassess and perform further analysis.10 System 1 intuitive decision processes become more dominant with experience. Many encounters requiring System 2 thinking early in a clinician’s career may become System 1 decisions as the clinician gains expertise.10 This results as the clinician develops a “mental library” of previous encounters with commonly seen medical conditions.13 It is important to note that clinical decision errors often result from a combination of knowledge gaps and processing malfunctions and not from one process alone.14
Similarly, diagnostic errors are not purely a result of cognitive biases or reliance on System 1 or System 2 thinking, but rather are a result of multiple factors. In a study that looked at provider time to diagnosis and accuracy of diagnosis, results indicated that System 1 reasoning was not more error prone than System 2 thinking.15 Experienced clinicians emphasize that errors can occur at any time or in any context in both System 1 and 2 modes of thinking.16
The vast majority of human decisions—95%—are made in System 1 mode, while only 5% of our “thinking” is conscious analytic thought.17 Croskerry suggests that clinical reasoning defaults to the faster, more mentally economic System 1 thinking, which can make clinicians prone to error by allowing intuition, heuristics, and processes that are most vulnerable to mistakes—stereotyping, prejudices, and biases—to influence a decision.9,18 Both novice and expert clinicians should be encouraged to develop insight into their intuitive and analytic decision-making processes and become aware of which thinking mode they are using in a specific clinical situation.
Cognitive Dispositions to Respond
Diagnostic errors are often associated with cognitive errors such as failures in perception, failed heuristics, and biases; as a group, these cognitive errors have been labeled cognitive dispositions to respond.1 In the medical and psychology literature, more than 100 CDRs have been identified.19 Common CDR/bias definitions are provided in the graphic.
In everyday practice, clinicians encounter clinical scenarios or situations where CDRs can affect decision making. The following brief clinical examples further illustrate the defining characteristics of the CDRs. Cognitive errors related to these CDRs can occur if a clinician does not remain completely objective.
Availability bias is the tendency to judge a diagnosis as more likely because it comes readily to mind, reflecting the saying “common things are common.” An example of this bias in practice would be a provider who has seen three patients with abdominal pain and diagnosed gastritis in each. A fourth patient presents with abdominal pain and is diagnosed with gastritis but actually has appendicitis.
Search satisficing, or premature closure, occurs when one has found enough information to make a diagnosis and then stops looking for further causes or additional problems. For example, a PA rounds on a patient who is post-op day 1 from coronary bypass surgery and has developed decreasing oxygen saturation. A chest x-ray reveals a right lower lobe opacity consistent with either pneumonia or pleural effusion; antibiotics are started and the oxygen concentration is increased on the ventilator. The radiologist later informs the PA that the patient also has a left-sided pneumothorax. The PA did not treat the pneumothorax because he stopped looking for other causes of the oxygen desaturation once the right lower lobe pneumonia was found.
Confirmation bias occurs when clinicians seek to confirm a diagnosis rather than rule it out. For example, a patient presents with first-time, new-onset “classic” migraine symptoms, characterized as “the worst headache of her life.” The provider asks patient history questions to confirm the initial impression of a migraine headache and does not order a CT scan.
Posterior probability is a bias whereby the clinician gives excessive weight to a patient’s previous medical history. It occurs, for example, when a patient with chronic back pain is diagnosed with musculoskeletal back pain without considering other causes, such as urinary tract infection or pyelonephritis.
Diagnosis momentum bias occurs when a clinician relies on information handed down from numerous parties involved with the patient. An example is a patient who has a syncopal episode in church and several tonic-clonic movements while briefly unconscious. Nearby witnesses describe the event as a “seizure,” and paramedics relaying information to the emergency department indicate that the patient had a “seizure.” Ultimately, the triage information records “seizure” as the diagnosis. A cognitive error can occur if the treating clinician does not take a thorough history to consider an alternative diagnosis.
Fundamental attribution error bias occurs when a provider is judgmental and blames the patient for their disease. A provider who quips, “No wonder that patient has diabetes and hypertension; she weighs 325 lb,” is exhibiting fundamental attribution error bias.
Ascertainment bias allows preconceived notions, including stereotypes, to influence a clinician’s thinking. A provider who determines that all female patients with multiple somatic complaints have anxiety and depression is subject to this bias.
Triage cueing occurs when some aspect of the triage process influences the clinician’s thinking, such as when the clinician assumes that patients who are placed in the fast track are low acuity and therefore gives no consideration to higher acuity diagnoses.
Playing the odds assumes that a patient with a vague presentation has a benign condition rather than a serious one because the odds favor that. An example of this bias occurs when a 65-year-old woman with vomiting during flu season is quickly diagnosed with gastroenteritis. Fortunately, the patient is on a telemetry monitor while getting IV fluids and antinausea medication. The monitor results indicate that her vomiting episodes are occurring during long periods of sinus arrest.
Psych-out bias applies when signs or symptoms in a patient with a psychiatric diagnosis are ascribed to the underlying psychiatric condition and other serious possibilities are quickly dismissed. For example, a provider who assumes that an unstable psychiatric patient is nonadherent with her prescribed medication or is abusing substances rather than considering an underlying medical illness is demonstrating psych-out bias.
Illusory correlation bias occurs, for example, when the provider makes the assumption that the emergency department will be busy because there is a full moon.
AM I AT RISK FOR BEING WRONG?
Autonomous advanced practice clinicians in high-risk practice settings have an immense responsibility to ensure that their patients are getting the best possible care. It is documented that as expertise develops, knowledge and decision processes change. Ordinarily, highly experienced clinicians use the more time-efficient System 1 process when faced with common disorders; for more complex disorders, they change to System 2 thinking to facilitate a more comprehensive evaluation.13 In many instances, however, a provider may inadvertently take shortcuts to conclude the clinical encounter, including relying on intuitive thinking—which can be prone to bias—when analytic thinking is necessary.
Clinicians are usually unaware of the influence that biases may have on their decision making and should reflect on their behavior to determine if any biases exist. To improve patient safety and facilitate better care, all providers should perform a personal inventory to identify CDRs they may have developed. Questions that will help to reveal CDRs include:
- Am I rushing to get off my shift on time?
- Was the patient “turned over” to me at the shift change?
- Have I allowed a previously negative experience with this patient to influence my objectivity and clinical decision-making?
- Am I tired?
- Has the diagnosis been suggested by the nurse, paramedic, or the patient’s family?9
If one or more biases are found, a purposeful effort should be made to mentally “uncouple” from the bias. This process is referred to as metacognition, or thinking about one’s own thought processes.9 Paramount among the thinking processes that may be at play is an awareness of how System 1 and System 2 thinking interact and affect clinical decision making, as this enables clinicians to recognize which mode of thinking they are using to arrive at a decision and when they need to shift from intuitive to analytic thinking.
Another factor to consider is overconfidence: Berner and Graber note that a provider’s overconfidence3 in his or her own knowledge and experience and lack of awareness of when an “override” is needed can be a cause of diagnostic errors.18 The tendency to shore up existing beliefs rather than force a new cognitive strategy is a sign of a rigid thinking process that may ultimately result in a poor clinical decision.9 Finally, providers should be aware of their surroundings and practice environments. As noted earlier, emergency medicine, family medicine, internal medicine, and urgent care have high diagnostic error rates due, in part, to high patient volumes.1
Once a tendency for a certain cognitive bias is recognized, the next step is to develop a sustainable method to counteract it and prevent cognitive errors, a process referred to as debiasing. The table lists some workplace and educational debiasing techniques that have been described in the literature.20,21 Critics of cognitive debiasing argue that CDRs are preconscious, that awareness of CDRs is not enough to counteract their effects, and that clinicians cannot develop “generic” conscious strategies to counter them.14 Their concern is that a clinician may be able to counter a bias in one clinical context but not in another.14 It is clear that clinical reasoning is complex and involves many interrelated elements, such as clinical knowledge and critical thinking, with System 1 and 2 thinking working in tandem and metacognition overarching the whole process.21 Errors in diagnosis can have multiple causes, and no single cognitive approach can address all of them. Knowing about cognitive bias helps clinicians address one possible element underlying diagnostic errors. Efforts to eliminate bias in clinical reasoning should begin early in clinical education; this can be done by incorporating instruction on clinical reasoning, including the relationship between intuitive and analytic decisions, metacognition, and awareness of the strengths and weaknesses of heuristics.22
In summary, in clinical situations where bias or uncertainty might exist, a clinician can make an effort to avoid a bad decision by
- Stepping back and reflecting to consider if a bias exists.
- Developing rules and mental procedures to reject a reflexive automatic response and force a “System 2 override.”9
- Developing “mental-ware” (mental techniques) to uncouple from a recognized or recurring cognitive bias.9
Conclusion
This article reminds health care providers that cognitive biases can influence clinical decision-making. Clinicians should be aware of how System 1 and System 2 thinking couple with unconscious cognitive biases to affect clinical decisions and patient safety. Once a provider identifies a bias, he or she should attempt to employ one or more debiasing techniques. Medical decision errors usually occur due to multiple factors, and neither thinking mode (analytic or intuitive) is more error prone than the other. Cognitive errors are also caused by knowledge gaps and faulty processing of patient data. Future research is needed to assess outcomes of quality improvement projects that include these components.
1. Croskerry P. Diagnostic failure: a cognitive and affective approach. In: Henriksen K, Battles JB, Marks ES, Lewin DI, eds. Advances in Patient Safety: From Research to Implementation (Volume 2: Concepts and Methodology). Rockville, MD: Agency for Healthcare Research and Quality; 2005:241-254.
2. Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med. 2003;78:775-780.
3. Berner ES, Graber ML. Overconfidence as a cause of diagnostic error in medicine. Am J Med. 2008;121(5A):S2-S23.
4. Saber Tehrani AS, Lee H, Mathews SC, et al. 25-year summary of US malpractice claims for diagnostic errors 1986-2010: an analysis from the National Practitioner Data Bank. BMJ Qual Saf. 2013;22(8):672-680.
5. Kohn LT, Corrigan JM, Donaldson MS, eds. To Err Is Human: Building a Safer Health System. Washington, DC: National Academy of Sciences; 1999.
6. Graber M, Gordon R, Franklin N. Reducing diagnostic errors in medicine: what’s the goal? Acad Med. 2002;77:981-992.
7. Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Arch Intern Med. 2005;165(13):1493-1499.
8. Poon EG, Gandhi TK, Sequist TD, et al. “I wish I had seen this test result earlier!”: dissatisfaction with test result management systems in primary care. Arch Intern Med. 2004;164(20):2223-2228.
9. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 1: origins of bias and theory of debiasing. BMJ Qual Saf. 2013;22(suppl 2):ii58-ii64.
10. Croskerry P, Nimmo GR. Better clinical decision making and reducing diagnostic error. J R Coll Physicians Edinb. 2011;41(2):155-162.
11. Wilson TD, Schooler JW. Thinking too much: introspection can reduce the quality of preferences and decisions. J Pers Soc Psychol. 1991;60(2):181-192.
12. Croskerry P. A universal model of diagnostic reasoning. Acad Med. 2009; 84(8):1022-1028.
13. Norman G, Young M, Brooks L. Non-analytical models of clinical reasoning: the role of experience. Med Educ. 2007;41(12):1140-1145.
14. Norman GR, Eva KW. Diagnostic error and clinical reasoning. Med Educ. 2010;44(1):94-100.
15. Sherbino J, Dore KL, Wood TJ, et al. The relationship between response time and diagnostic accuracy. Acad Med. 2012;87(6):785-791.
16. Petrie D, Campbell S. Clinical decision making, fast and slow. Acad Med. 2013;88(5):557.
17. Lakoff G, Johnson M. Philosophy in the Flesh: The Embodied Mind and Its Challenge to Western Thought. New York, NY: Basic Books; 1999.
18. Sinclair D, Croskerry P. Patient safety and diagnostic error: tips for your next shift. Can Fam Physician. 2010;56(1):28-30.
19. Croskerry P. From mindless to mindful practice—cognitive bias and clinical decision making. N Engl J Med. 2013;368(26):2445-2448.
20. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 2: impediments to and strategies for change. BMJ Qual Saf. 2013;22(suppl 2):ii65-ii72.
21. Groves M. Understanding clinical reasoning: the next step in working out how it really works. Med Educ. 2012;46(5):444-446.
22. Trowbridge RL, Dhaliwal G, Cosby KS. Educational agenda for diagnostic error reduction. BMJ Qual Saf. 2013;22(suppl 2):ii28-ii32.
CE/CME No: CR-1603
PROGRAM OVERVIEW
Earn credit by reading this article and successfully completing the posttest and evaluation. Successful completion is defined as a cumulative score of at least 70% correct.
EDUCATIONAL OBJECTIVES
• List the characteristics of System 1 and System 2 thinking.
• Explain how System 1 and System 2 thinking affects clinical decisions.
• Define the characteristics of no-fault, system, and cognitive errors and how they affect health care delivery.
• Describe how biases and cognitive dispositions to respond cause health care providers to make clinical decision errors.
• List some effective debiasing techniques to improve clinical decisions and patient safety.
FACULTY
David J. Klocko is an Associate Professor and Academic Coordinator in the Department of Physician Assistant Studies at the University of Texas Southwestern Medical Center, School of Health Professions, Dallas.
The author has no significant financial relationships to disclose.
ACCREDITATION STATEMENT
This program has been reviewed and is approved for a maximum of 1.0 hour of American Academy of Physician Assistants (AAPA) Category 1 CME credit by the Physician Assistant Review Panel. [NPs: Both ANCC and the AANP Certification Program recognize AAPA as an approved provider of Category 1 credit.] Approval is valid for one year from the issue date of March 2016.
Article begins on next page >>
Diagnostic errors occur for many reasons, some of which are based in cognitive biases. Also called cognitive dispositions to respond (CDR), these can result from failures in perception, faulty mental shortcuts, or unconscious biases, and clinicians are usually unaware they exist. This article discusses the influence CDRs have on clinical decisions and walks you through methods for purposeful debiasing.
Diagnosis is the foundation of medicine ... [and] diagnostic reasoning is a critical aspect of clinical performance.1
— Pat Croskerry, MD, PhD
Diagnostic errors compromise patient safety and the quality of health care and account for the majority of paid malpractice claims. They are especially common in family medicine, internal medicine, emergency medicine, and urgent care, wherethe error rate can be as high as 15%.2 However, all health care providers are subject to errors in clinical judgment, regardless of the setting or specialty in which they practice.3
Clinical disciplines such as internal medicine and emergency medicine have higher error rates than the perceptual disciplines, radiology and pathology. Higher diagnostic error rates in the clinical disciplines are due to the elevated case complexity and the need for rapid interpretation of diagnostic studies. In the perceptual disciplines such as pathology and radiology, fewer time pressures and the ability to obtain a second opinion before making a diagnosis decrease error rates.3 In a National Practitioner Data Bank analysis, more diagnostic error claims occurred in the outpatient setting than in the inpatient setting.4
Quality assurance and performance improvement have become paramount for all health care providers. The modern patient safety movement began in 1999 with the Institute of Medicine (IOM) report To Err Is Human, which highlighted how a faulty health care system causes people to make mistakes and negatively impacts patient safety.5 Some examples of errors arising from imperfections in the health system include medication errors, patient falls, wrong-site surgeries, and improper patient identification. Despite an increased emphasis on patient safety and quality improvement, diagnostic error had not been a focus of attention for policy makers and institutions. Only since the IOM report was released have the medical profession and health policy makers begun to pay attention to diagnostic errors as a serious patient safety issue.5
Cognitive biases, or cognitive dispositions to respond (CDR), can influence clinical decision-making and lead to diagnostic errors. By understanding the thinking processes involved in diagnostic reasoning and the interaction between these processes and cognitive biases, clinicians can take steps to counteract the influence of cognitive biases on their clinical decisions. Here, a brief introduction to dual processing theory is provided, along with information to help clinicians identify potential cognitive biases. Workplace and educational debiasing techniques to counter biases that lead to cognitive decision errors are presented as well.
DIAGNOSTIC ERRORS
All advanced practice providers are at risk for making a clinical decision error. The diagnostic errors that are made in clinical practice can be classified into three broad etiologic categories6:
No-fault errors occur when a rare disease is misdiagnosed as something more common or a disease is silent or presents in an atypical manner. An example of an error that falls into this category is a delayed diagnosis of ischemic bowel in a diabetic patient with no abdominal pain. Another example is a patient with a language barrier who is not able to describe his or her symptoms clearly, leading the clinician to misinterpret the history. Patient nonadherence to recommended care can also be viewed as no-fault, as in the case of a patient diagnosed with colon cancer who did not obtain a recommended screening colonoscopy.6 In one study, no-fault errors accounted for 7% of diagnostic errors.7
System errors occur as a result of “latent” faults in the process of delivering care and can be technical or organizational in nature.6 Examples of diagnostic errors related to technical issues are misdiagnosis or delayed diagnosis resulting from lack of appropriate testing or equipment or from incorrect laboratory results caused by technical problems with equipment. Organizational shortcomings that contribute to diagnostic errors include imperfections in department policies, error tolerance culture, poor patient care coordination, communication problems, inadequate staff training, poor working conditions, unavailability of acute specialty care, and failing to follow up with patients having abnormal diagnostic study results.6 Excessive workload and heavy administrative responsibilities also can contribute to clinician decision errors.
An example of a specific clinical organizational system error would be a missed or delayed diagnosis of a cancer on a chest x-ray due to lack of an “over-read” by a radiologist. Due to cost, many private practices do not send all radiographs for a radiologist’s interpretation. Another example is a patient with a severe eye injury who develops complications after being transferred to another hospital because there is not an on-call ophthalmologist at the presenting hospital.6 Delays in reviewing patient laboratory results are a significant system-based source of medical errors. In one study, 83% of the physician respondents reported at least one delay in reviewing test results in the past two months, with 18% reporting five or more delays in reviewing test results over the same time period.8
Cognitive errors are caused by gaps in knowledge or experience, inadequate interpretation of diagnostic studies, or succumbing to faulty heuristics and biases.6 With cognitive errors, incorrect perception or interpretation of a clinical situation results in faulty differential diagnosis development. Confirmation bias is one type of cognitive error—once supporting information is found for a diagnosis, the search for information to rule out the diagnosis stops.6
An example of this would be a patient with an ankle fracture who is discharged with a missed proximal fibula fracture after the clinician performs a physical exam only on the ankle and orders an ankle x-ray. A cognitive error like this would occur due to inadvertent omission of an important physical exam component or the clinician not knowing the importance of examining the knee when evaluating an ankle fracture.
It is important to note that clinical decision errors are usually multifactorial. In a study involving 100 cases of diagnostic error in internal medicine, Graber and colleagues determined that in 46% of the cases errors were caused by a combination of system-related and cognitive factors.7
Continue for decision making >>
Decision Making: Dual Process Theory
Over the past two decades, dual process theory (DPT) has been recognized as a reliable model of the decision-making process in the psychology literature.9 DPT proposes two unique processes of thinking during decision making, referred to as System 1 and System 2, or Type 1 and Type 2, processes. A brief introduction to DPT is given here for practicing clinicians, but a detailed discussion of the literature pertaining to this concept is beyond the scope of this review.
System 1 processes are “intuitive,” utilize pattern recognition and heuristics, and rely heavily on the context or conditions in which the decision is made. The intuitive System 1 mode of thinking uses a pattern recognition or “gut reaction” approach.10 It is fast and reflexive but can be subject to deficits in predictive power and reliability.10 Experienced clinicians use pattern recognition in conditions presenting with classic signs and symptoms.10 For example, the clinician who evaluates a 12-year-old child with an annular, erythemic patch with central clearing on the forearm and immediately diagnoses ringworm is thinking in the intuitive mode. Generally, human beings are most comfortable in this decision mode because it involves intuition and requires less mental effort and concentration. For clinicians, System 1 thinking is the default defense mechanism against “decision fatigue” and “cognitive overload” during a busy shift, and it is the thinking mode used when clinicians are stressed, hurried, tired, and working with a lack of resources.9,10 Croskerry maintains, however, that such clinical situations, and the reliance on System 1 thinking that such situations entail, can make clinicians more vulnerable to certain biases.9
System 2 thinking is analytic, deductive, slow, and deliberate. This mode of thinking has high predictive power with high reliability, and it is less influenced by the context or conditions in which the decision is being made.10 Clinicians use this mode of thinking when patients present with vague signs and symptoms and a diagnosis is not instantly recognized.10 System 2 decision making would be required, for example, when evaluating a 55-year-old woman with chest pain. The clinical condition requires the clinician to acquire more data and make a conscious effort to analyze results, and arriving at a clinical decision in this situation takes more time. Shortcuts due to time pressures can have devastating outcomes in this setting. It should be mentioned, however, that psychology research has shown that the System 2 analytic approach is mentally taxing and may also result in poor decisions (“thinking too much”).11
Intuitive and analytic thinking are not independent of each other. During a clinical encounter, there is unconscious switching back and forth between the two modes as the clinician evaluates the information at hand in order to produce a decision.12 A patient presenting with a chief complaint may trigger a System 1 decision, but due to uncertainty there may be a “System 2 override”where the clinician consciously forces herself to reassess and perform further analysis.10 System 1 intuitive decision processes become more dominant with experience. Many encounters requiring System 2 thinking early in a clinician’s career may become System 1 decisions as the clinician gains expertise.10 This results as the clinician develops a “mental library” of previous encounters with commonly seen medical conditions.13 It is important to note that clinical decision errors often result from a combination of knowledge gaps and processing malfunctions and not from one process alone.14
Similarly, diagnostic errors are not purely a result of cognitive biases or reliance on System 1 or System 2 thinking, but rather are a result of multiple factors.In a study that looked at provider time to diagnosis and accuracy of diagnosis, results indicated that System 1 reasoning was not more error prone than System 2 thinking.15 Experienced clinicians emphasize that errors can occur at any time or in any context in both System 1 and 2 modes of thinking.16
The vast majority of human decisions—95%—are made in System 1 mode, while only 5% of our “thinking” is conscious analytic thought.17 Croskerry suggests that clinical reasoning defaults to the faster, more mentally economic System 1 thinking, which can make clinicians prone to error by allowing intuition, heuristics, and processes that are most vulnerable to mistakes—stereotyping, prejudices, and biases—to influence a decision.9,18 Both novice and expert clinicians should be encouraged to develop insight into their intuitive and analytic decision-making processes and become aware of which thinking mode they are using in a specific clinical situation.
Continue for cognitive dispositions to respond >>
Cognitive Dispositions to Respond
Diagnostic errors are often associated with cognitive errors such as failures in perception, failed heuristics, and biases; as a group, these cognitive errors have been labeled cognitive dispositions to respond.1 In the medical and psychology literature, more than 100 CDRs have been identified.19 Common CDR/bias definitions are provided in the graphic.
In everyday practice, clinicians encounter clinical scenarios or situations where CDRs can affect decision making. The following brief clinical examples further illustrate the defining characteristics of the CDRs. Cognitive errors related to these CDRs can occur if a clinician does not remain completely objective.
Availability is a bias that applies the saying “more common diseases are common.” An example of this bias in practice would be a provider who has seen three patients with abdominal pain and diagnosed gastritis for each. A fourth patient presents with abdominal pain, is diagnosed with gastritis, but actually has appendicitis.
Search satisficing, or premature closure, occurs when one has found enough information to make a diagnosis and then stops looking for further causes or additional problems. For example, a PA rounds on a patient who is post-op day 1 from coronary bypass surgery and develops decreasing oxygen saturation. A chest x-ray reveals right lower lobe opacity consistent with either pneumonia or pleural effusion; antibiotics are started and oxygen concentration is increased on the ventilator. The radiologist later informs the PA that the patient also has a left-sided pneumothorax. The PA did not treat that because he stopped looking for other causes of the oxygen desaturation once the right lower lobe pneumonia was found.
Continue for confirmation >>
Confirmation bias occurs when clinicians seek to confirm a diagnosis rather than rule it out. For example, a patient presents with first-time, new-onset “classic” migraine symptoms, characterized as “the worst headache of her life.” The provider asks patient history questions to confirm the initial impression of a migraine headache and does not order a CT scan.
Posterior probability is a bias whereby the clinician gives excessive weight to a patient’s previous medical history. It occurs, for example, when a patient with chronic back pain is diagnosed with musculoskeletal back pain without considering other causes, such as urinary tract infection or pyelonephritis.
Diagnosis momentum bias occurs when a clinician relies on information handed down from numerous parties involved with the patient. An example is a patient who has a syncopal episode in church and several tonic-clonic movements while briefly unconscious. Nearby witnesses describe the event as a “seizure,” and paramedics relaying information to the emergency department indicate that the patient had a “seizure.” Ultimately, the triage information records “seizure” as the diagnosis. A cognitive error can occur if the treating clinician does not take a thorough history to consider an alternative diagnosis.
Fundamental attribution error bias occurs when a provider is judgmental and blames the patient for their disease. A provider who quips, “No wonder that patient has diabetes and hypertension; she weighs 325 lb,” is exhibiting fundamental attribution error bias.
Ascertainment bias allows preconceived notions, including stereotypes, to influence a clinician’s thinking. A provider who determines that all female patients with multiple somatic complaints have anxiety and depression is subject to this bias.
Triage cueing occurs when some aspect of the triage process influences the clinician’s thinking, such as when the clinician assumes that patients who are placed in the fast track are low acuity and therefore gives no consideration to higher acuity diagnoses.
Playing the odds assumes that a patient with a vague presentation has a benign condition rather than a serious one because the odds favor that. An example of this bias occurs when a 65-year-old woman with vomiting during flu season is quickly diagnosed with gastroenteritis. Fortunately, the patient is on a telemetry monitor while getting IV fluids and antinausea medication. The monitor results indicate that her vomiting episodes are occurring during long periods of sinus arrest.
Psych-out bias applies when signs or symptoms in a patient with a psychiatric diagnosis are ascribed to the underlying psychiatric condition and other serious possibilities are quickly dismissed. For example, a provider who assumes that an unstable psychiatric patient is nonadherent with her prescribed medication or is abusing substances rather than considering an underlying medical illness is demonstrating psych-out bias.
Illusory correlation bias occurs, for example, when the provider makes the assumption that the emergency department will be busy because there is a full moon.
Continue to find out if you are at risk for being wrong >>
AM I AT RISK FOR BEING WRONG?
Autonomous advanced practice clinicians in high-risk practice settings have an immense responsibility to ensure that their patients are getting the best possible care. It is documented that as expertise develops, knowledge and decision processes change. Ordinarily, highly experienced clinicians use the more time-efficient System 1 process when faced with common disorders; for more complex disorders, they change to System 2 thinking to facilitate a more comprehensive evaluation.13 In many instances, however, a provider may inadvertently take shortcuts to conclude the clinical encounter, including relying on intuitive thinking—which can be prone to bias—when analytic thinking is necessary.
Clinicians are usually unaware of the influence that biases may have on their decision making and should reflect on their behavior to determine if any biases exist. To improve patient safety and facilitate better care, all providers should perform a personal inventory to identify CDRs they may have developed. Questions that will help to reveal CDRs include
- Am I rushing to get off my shift on time?
- Was the patient “turned over” to me at the shift change?
- Have I allowed a previously negative experience with this patient to influence my objectivity and clinical decision-making?
- Am I tired?
- Has the diagnosis been suggested by the nurse, paramedic, or the patient’s family?9
- Has the diagnosis been suggested by the nurse,
If one or more biases are found, a purposeful effort to mentally “uncouple” from a bias should be done. This process is referred to as metacognition, or thinking about one’s own thought processes.9 Paramount among the thinking processes that may be at play is an awareness of how System 1 and System 2 thinking interact and affect clinical decision making, as this enables the clinician to recognize which mode of thinking they use to arrive at a decision and when they need to shift from intuitive to analytic thinking.
Another factor to consider is overconfidence: Berner and Graber note that a provider’s overconfidence3 in his or her own knowledge and experience and lack of awareness of when an “override” is needed can be a cause of diagnostic errors.18 The tendency to shore up existing beliefs rather than force a new cognitive strategy is a sign of a rigid thinking process that may ultimately result in a poor clinical decision.9 Finally, providers should be aware of their surroundings and practice environments. As noted earlier, emergency medicine, family medicine, internal medicine, and urgent care have high diagnostic error rates due, in part, to high patient volumes.1
Once a tendency for a certain cognitive bias is recognized, the next step is to develop a sustainable method to counteract it, a process referred to as debiasing, to prevent cognitive errors. The table lists some workplace and educational debiasing techniques that have been described in the literature.20,21 Critics of cognitive debiasing argue that CDRs are preconscious, that awareness of CDRs is not enough to counteract their effects, and that there is no ability for one to develop “generic” conscious efforts to counter them.14 Their concern here is that a clinician may be able to counter a bias in one clinical context but not in another.14 It is clear that clinical reasoning is complex and involves many interrelated elements, such as clinical knowledge and critical thinking, with System 1 and 2 thinking working in tandem and metacognition overarching the whole process.21 Errors in diagnosis can have multiple causes and no single cognitive approach can be effective in addressing all of these causes. Knowing about cognitive bias helps clinicians address one possible element underlying diagnostic errors. Efforts to eliminate bias in clinical reasoning should begin early in clinical education; this can be done by incorporating instruction on clinical reasoning, including the relationship between intuitive and analytic decisions, metacognition, and awareness of the strengths and weaknesses of heuristics.22
In summary, in clinical situations where bias or uncertainty might exist, a clinician can make an effort to avoid a bad decision by
- Stepping back and reflecting to consider if a bias exists.
- Developing rules and mental procedures to reject a reflexive automatic response and force a “System 2 override.”9
- Developing “mental-ware” (mental techniques) to uncouple from a recognized or recurring cognitive bias.9
Continue to conclusion >>
Conclusion
This article reminds health care providers that cognitive biases can influence clinical decision-making. Clinicians should be aware of how System 1 and System 2 thinking couple with unconscious cognitive biases to affect clinical decisions and patient safety. Once a provider identifies a bias, he or she should attempt to employ one or more debiasing techniques. Medical decision errors usually occur due to multiple factors, and one thinking mode is not more error prone than the other (analytic versus intuitive). Cognitive errors are also caused by knowledge gaps and faulty patient data processing. Future research is needed to assess outcomes of quality improvement projects that include these components.
CE/CME No: CR-1603
PROGRAM OVERVIEW
Earn credit by reading this article and successfully completing the posttest and evaluation. Successful completion is defined as a cumulative score of at least 70% correct.
EDUCATIONAL OBJECTIVES
• List the characteristics of System 1 and System 2 thinking.
• Explain how System 1 and System 2 thinking affects clinical decisions.
• Define the characteristics of no-fault, system, and cognitive errors and how they affect health care delivery.
• Describe how biases and cognitive dispositions to respond cause health care providers to make clinical decision errors.
• List some effective debiasing techniques to improve clinical decisions and patient safety.
FACULTY
David J. Klocko is an Associate Professor and Academic Coordinator in the Department of Physician Assistant Studies at the University of Texas Southwestern Medical Center, School of Health Professions, Dallas.
The author has no significant financial relationships to disclose.
ACCREDITATION STATEMENT
This program has been reviewed and is approved for a maximum of 1.0 hour of American Academy of Physician Assistants (AAPA) Category 1 CME credit by the Physician Assistant Review Panel. [NPs: Both ANCC and the AANP Certification Program recognize AAPA as an approved provider of Category 1 credit.] Approval is valid for one year from the issue date of March 2016.
Article begins on next page >>
Diagnostic errors occur for many reasons, some of which are based in cognitive biases. Also called cognitive dispositions to respond (CDR), these can result from failures in perception, faulty mental shortcuts, or unconscious biases, and clinicians are usually unaware they exist. This article discusses the influence CDRs have on clinical decisions and walks you through methods for purposeful debiasing.
Diagnosis is the foundation of medicine ... [and] diagnostic reasoning is a critical aspect of clinical performance.1
— Pat Croskerry, MD, PhD
Diagnostic errors compromise patient safety and the quality of health care and account for the majority of paid malpractice claims. They are especially common in family medicine, internal medicine, emergency medicine, and urgent care, wherethe error rate can be as high as 15%.2 However, all health care providers are subject to errors in clinical judgment, regardless of the setting or specialty in which they practice.3
Clinical disciplines such as internal medicine and emergency medicine have higher error rates than the perceptual disciplines, radiology and pathology. Higher diagnostic error rates in the clinical disciplines are due to the elevated case complexity and the need for rapid interpretation of diagnostic studies. In the perceptual disciplines such as pathology and radiology, fewer time pressures and the ability to obtain a second opinion before making a diagnosis decrease error rates.3 In a National Practitioner Data Bank analysis, more diagnostic error claims occurred in the outpatient setting than in the inpatient setting.4
Quality assurance and performance improvement have become paramount for all health care providers. The modern patient safety movement began in 1999 with the Institute of Medicine (IOM) report To Err Is Human, which highlighted how a faulty health care system causes people to make mistakes and negatively impacts patient safety.5 Some examples of errors arising from imperfections in the health system include medication errors, patient falls, wrong-site surgeries, and improper patient identification. Despite an increased emphasis on patient safety and quality improvement, diagnostic error had not been a focus of attention for policy makers and institutions. Only since the IOM report was released have the medical profession and health policy makers begun to pay attention to diagnostic errors as a serious patient safety issue.5
Cognitive biases, or cognitive dispositions to respond (CDR), can influence clinical decision-making and lead to diagnostic errors. By understanding the thinking processes involved in diagnostic reasoning and the interaction between these processes and cognitive biases, clinicians can take steps to counteract the influence of cognitive biases on their clinical decisions. Here, a brief introduction to dual processing theory is provided, along with information to help clinicians identify potential cognitive biases. Workplace and educational debiasing techniques to counter biases that lead to cognitive decision errors are presented as well.
DIAGNOSTIC ERRORS
All advanced practice providers are at risk for making a clinical decision error. The diagnostic errors that are made in clinical practice can be classified into three broad etiologic categories6:
No-fault errors occur when a rare disease is misdiagnosed as something more common or a disease is silent or presents in an atypical manner. An example of an error that falls into this category is a delayed diagnosis of ischemic bowel in a diabetic patient with no abdominal pain. Another example is a patient with a language barrier who is not able to describe his or her symptoms clearly, leading the clinician to misinterpret the history. Patient nonadherence to recommended care can also be viewed as no-fault, as in the case of a patient diagnosed with colon cancer who did not obtain a recommended screening colonoscopy.6 In one study, no-fault errors accounted for 7% of diagnostic errors.7
System errors occur as a result of “latent” faults in the process of delivering care and can be technical or organizational in nature.6 Examples of diagnostic errors related to technical issues are misdiagnosis or delayed diagnosis resulting from lack of appropriate testing or equipment or from incorrect laboratory results caused by technical problems with equipment. Organizational shortcomings that contribute to diagnostic errors include imperfections in department policies, error tolerance culture, poor patient care coordination, communication problems, inadequate staff training, poor working conditions, unavailability of acute specialty care, and failing to follow up with patients having abnormal diagnostic study results.6 Excessive workload and heavy administrative responsibilities also can contribute to clinician decision errors.
An example of a specific clinical organizational system error would be a missed or delayed diagnosis of a cancer on a chest x-ray due to lack of an “over-read” by a radiologist. Due to cost, many private practices do not send all radiographs for a radiologist’s interpretation. Another example is a patient with a severe eye injury who develops complications after being transferred to another hospital because there is not an on-call ophthalmologist at the presenting hospital.6 Delays in reviewing patient laboratory results are a significant system-based source of medical errors. In one study, 83% of the physician respondents reported at least one delay in reviewing test results in the past two months, with 18% reporting five or more delays in reviewing test results over the same time period.8
Cognitive errors are caused by gaps in knowledge or experience, inadequate interpretation of diagnostic studies, or succumbing to faulty heuristics and biases.6 With cognitive errors, incorrect perception or interpretation of a clinical situation results in faulty differential diagnosis development. Confirmation bias is one type of cognitive error—once supporting information is found for a diagnosis, the search for information to rule out the diagnosis stops.6
An example of this would be a patient with an ankle fracture who is discharged with a missed proximal fibula fracture after the clinician examines only the ankle and orders only an ankle x-ray. A cognitive error like this can result from inadvertent omission of an important physical exam component or from the clinician not knowing the importance of examining the knee when evaluating an ankle injury.
It is important to note that clinical decision errors are usually multifactorial. In a study involving 100 cases of diagnostic error in internal medicine, Graber and colleagues determined that in 46% of the cases errors were caused by a combination of system-related and cognitive factors.7
DECISION MAKING: DUAL PROCESS THEORY
Over the past two decades, dual process theory (DPT) has been recognized in the psychology literature as a reliable model of the decision-making process.9 DPT proposes two distinct modes of thinking during decision making, referred to as System 1 and System 2 (or Type 1 and Type 2) processes. A brief introduction to DPT is given here for practicing clinicians; a detailed discussion of the literature pertaining to this concept is beyond the scope of this review.
System 1 processes are “intuitive,” utilize pattern recognition and heuristics, and rely heavily on the context or conditions in which the decision is made. The intuitive System 1 mode of thinking uses a pattern recognition or “gut reaction” approach.10 It is fast and reflexive but can be subject to deficits in predictive power and reliability.10 Experienced clinicians use pattern recognition in conditions presenting with classic signs and symptoms.10 For example, the clinician who evaluates a 12-year-old child with an annular, erythematous patch with central clearing on the forearm and immediately diagnoses ringworm is thinking in the intuitive mode. Generally, human beings are most comfortable in this decision mode because it involves intuition and requires less mental effort and concentration. For clinicians, System 1 thinking is the default defense against “decision fatigue” and “cognitive overload” during a busy shift, and it is the thinking mode used when clinicians are stressed, hurried, tired, and working with a lack of resources.9,10 Croskerry maintains, however, that such clinical situations, and the reliance on System 1 thinking they entail, can make clinicians more vulnerable to certain biases.9
System 2 thinking is analytic, deductive, slow, and deliberate. This mode of thinking has high predictive power with high reliability, and it is less influenced by the context or conditions in which the decision is being made.10 Clinicians use this mode of thinking when patients present with vague signs and symptoms and a diagnosis is not instantly recognized.10 System 2 decision making would be required, for example, when evaluating a 55-year-old woman with chest pain. The clinical condition requires the clinician to acquire more data and make a conscious effort to analyze results, and arriving at a clinical decision in this situation takes more time. Shortcuts due to time pressures can have devastating outcomes in this setting. It should be mentioned, however, that psychology research has shown that the System 2 analytic approach is mentally taxing and may also result in poor decisions (“thinking too much”).11
Intuitive and analytic thinking are not independent of each other. During a clinical encounter, there is unconscious switching back and forth between the two modes as the clinician evaluates the information at hand in order to reach a decision.12 A patient presenting with a chief complaint may trigger a System 1 decision, but uncertainty may prompt a “System 2 override,” in which the clinician consciously forces himself or herself to reassess and perform further analysis.10 System 1 intuitive decision processes become more dominant with experience; many encounters that require System 2 thinking early in a clinician’s career may become System 1 decisions as the clinician gains expertise.10 This occurs as the clinician develops a “mental library” of previous encounters with commonly seen medical conditions.13 It is important to note that clinical decision errors often result from a combination of knowledge gaps and processing malfunctions, not from one process alone.14
Similarly, diagnostic errors are not purely a result of cognitive biases or of reliance on System 1 or System 2 thinking; rather, they result from multiple factors. In a study that examined the relationship between time to diagnosis and diagnostic accuracy, System 1 reasoning was not found to be more error prone than System 2 thinking.15 Experienced clinicians emphasize that errors can occur at any time and in any context in both the System 1 and System 2 modes of thinking.16
The vast majority of human decisions—95%—are made in System 1 mode, while only 5% of our “thinking” is conscious analytic thought.17 Croskerry suggests that clinical reasoning defaults to the faster, more mentally economic System 1 thinking, which can make clinicians prone to error by allowing intuition, heuristics, and processes that are most vulnerable to mistakes—stereotyping, prejudices, and biases—to influence a decision.9,18 Both novice and expert clinicians should be encouraged to develop insight into their intuitive and analytic decision-making processes and become aware of which thinking mode they are using in a specific clinical situation.
COGNITIVE DISPOSITIONS TO RESPOND
Diagnostic errors are often associated with cognitive errors such as failures in perception, failed heuristics, and biases; as a group, these cognitive errors have been labeled cognitive dispositions to respond.1 In the medical and psychology literature, more than 100 CDRs have been identified.19 Common CDR/bias definitions are provided in the graphic.
In everyday practice, clinicians encounter clinical scenarios or situations where CDRs can affect decision making. The following brief clinical examples further illustrate the defining characteristics of the CDRs. Cognitive errors related to these CDRs can occur if a clinician does not remain completely objective.
Availability is the bias reflected in the saying “common diseases are common”: diagnoses that come readily to mind, often because they have been encountered recently, are judged to be more likely. An example of this bias in practice would be a provider who has seen three patients with abdominal pain and diagnosed gastritis in each. A fourth patient presents with abdominal pain and is diagnosed with gastritis but actually has appendicitis.
Search satisficing, or premature closure, occurs when the clinician finds enough information to make a diagnosis and then stops looking for other causes or additional problems. For example, a PA rounds on a patient who is on postoperative day 1 after coronary bypass surgery and has developed decreasing oxygen saturation. A chest x-ray reveals a right lower lobe opacity consistent with either pneumonia or pleural effusion; antibiotics are started and the oxygen concentration on the ventilator is increased. The radiologist later informs the PA that the patient also has a left-sided pneumothorax, which went untreated because the PA stopped looking for other causes of the oxygen desaturation once the right lower lobe pneumonia was identified.
Confirmation bias occurs when clinicians seek to confirm a diagnosis rather than to rule it out. For example, a patient presents with new-onset “classic” migraine symptoms, characterized as “the worst headache of her life.” The provider asks patient history questions to confirm the initial impression of a migraine headache and does not order a CT scan to rule out a more serious cause.
Posterior probability is a bias whereby the clinician gives excessive weight to a patient’s previous medical history. It occurs, for example, when a patient with chronic back pain is diagnosed with musculoskeletal back pain without considering other causes, such as urinary tract infection or pyelonephritis.
Diagnosis momentum bias occurs when a clinician relies on information handed down from numerous parties involved with the patient. An example is a patient who has a syncopal episode in church and several tonic-clonic movements while briefly unconscious. Nearby witnesses describe the event as a “seizure,” and paramedics relaying information to the emergency department indicate that the patient had a “seizure.” Ultimately, the triage information records “seizure” as the diagnosis. A cognitive error can occur if the treating clinician does not take a thorough history to consider an alternative diagnosis.
Fundamental attribution error occurs when a provider is judgmental and blames the patient for his or her disease. A provider who quips, “No wonder that patient has diabetes and hypertension; she weighs 325 lb,” is exhibiting fundamental attribution error.
Ascertainment bias allows preconceived notions, including stereotypes, to influence a clinician’s thinking. A provider who determines that all female patients with multiple somatic complaints have anxiety and depression is subject to this bias.
Triage cueing occurs when some aspect of the triage process influences the clinician’s thinking, such as when the clinician assumes that patients who are placed in the fast track are low acuity and therefore gives no consideration to higher acuity diagnoses.
Playing the odds is the assumption that a patient with a vague presentation has a benign condition rather than a serious one simply because benign conditions are more common. An example of this bias occurs when a 65-year-old woman presenting with vomiting during flu season is quickly diagnosed with gastroenteritis. Fortunately, the patient is on a telemetry monitor while receiving IV fluids and antinausea medication; the monitor reveals that her vomiting episodes are occurring during long periods of sinus arrest.
Psych-out bias applies when signs or symptoms in a patient with a psychiatric diagnosis are ascribed to the underlying psychiatric condition and other serious possibilities are quickly dismissed. For example, a provider who assumes that an unstable psychiatric patient is nonadherent with her prescribed medication or is abusing substances rather than considering an underlying medical illness is demonstrating psych-out bias.
Illusory correlation is the perception of a relationship between two events when none exists. It occurs, for example, when a provider assumes that the emergency department will be busy because there is a full moon.
AM I AT RISK FOR BEING WRONG?
Autonomous advanced practice clinicians in high-risk practice settings have an immense responsibility to ensure that their patients are getting the best possible care. It is documented that as expertise develops, knowledge and decision processes change. Ordinarily, highly experienced clinicians use the more time-efficient System 1 process when faced with common disorders; for more complex disorders, they change to System 2 thinking to facilitate a more comprehensive evaluation.13 In many instances, however, a provider may inadvertently take shortcuts to conclude the clinical encounter, including relying on intuitive thinking—which can be prone to bias—when analytic thinking is necessary.
Clinicians are usually unaware of the influence that biases may have on their decision making and should reflect on their behavior to determine if any biases exist. To improve patient safety and facilitate better care, all providers should perform a personal inventory to identify CDRs they may have developed. Questions that will help to reveal CDRs include
- Am I rushing to get off my shift on time?
- Was the patient “turned over” to me at the shift change?
- Have I allowed a previously negative experience with this patient to influence my objectivity and clinical decision-making?
- Am I tired?
- Has the diagnosis been suggested by the nurse, paramedic, or the patient’s family?9
If one or more biases are found, a purposeful effort should be made to mentally “uncouple” from the bias. This process is referred to as metacognition, or thinking about one’s own thought processes.9 Paramount among the thinking processes that may be at play is an awareness of how System 1 and System 2 thinking interact and affect clinical decision making; this awareness enables clinicians to recognize which mode of thinking they are using to arrive at a decision and when they need to shift from intuitive to analytic thinking.
Another factor to consider is overconfidence: Berner and Graber note that a provider’s overconfidence3 in his or her own knowledge and experience and lack of awareness of when an “override” is needed can be a cause of diagnostic errors.18 The tendency to shore up existing beliefs rather than force a new cognitive strategy is a sign of a rigid thinking process that may ultimately result in a poor clinical decision.9 Finally, providers should be aware of their surroundings and practice environments. As noted earlier, emergency medicine, family medicine, internal medicine, and urgent care have high diagnostic error rates due, in part, to high patient volumes.1
Once a tendency for a certain cognitive bias is recognized, the next step is to develop a sustainable method to counteract it and prevent cognitive errors, a process referred to as debiasing. The table lists some workplace and educational debiasing techniques that have been described in the literature.20,21 Critics of cognitive debiasing argue that CDRs are preconscious, that awareness of CDRs is not enough to counteract their effects, and that clinicians cannot develop “generic” conscious strategies to counter them.14 Their concern is that a clinician may be able to counter a bias in one clinical context but not in another.14 Clinical reasoning is clearly complex and involves many interrelated elements, such as clinical knowledge and critical thinking, with System 1 and System 2 thinking working in tandem and metacognition overarching the whole process.21 Errors in diagnosis can have multiple causes, and no single cognitive approach can address them all; knowing about cognitive bias helps clinicians address one possible element underlying diagnostic errors. Efforts to eliminate bias in clinical reasoning should begin early in clinical education by incorporating instruction on clinical reasoning, including the relationship between intuitive and analytic decisions, metacognition, and awareness of the strengths and weaknesses of heuristics.22
In summary, in clinical situations where bias or uncertainty might exist, a clinician can make an effort to avoid a bad decision by
- Stepping back and reflecting to consider if a bias exists.
- Developing rules and mental procedures to reject a reflexive automatic response and force a “System 2 override.”9
- Developing “mental-ware” (mental techniques) to uncouple from a recognized or recurring cognitive bias.9
CONCLUSION
This article reminds health care providers that cognitive biases can influence clinical decision-making. Clinicians should be aware of how System 1 and System 2 thinking couple with unconscious cognitive biases to affect clinical decisions and patient safety. Once a provider identifies a bias, he or she should attempt to employ one or more debiasing techniques. Medical decision errors usually occur due to multiple factors; neither thinking mode (intuitive or analytic) is inherently more error prone than the other, and cognitive errors are also caused by knowledge gaps and faulty processing of patient data. Future research is needed to assess outcomes of quality improvement projects that include these components.
REFERENCES
1. Croskerry P. Diagnostic failure: a cognitive and affective approach. In: Henriksen K, Battles JB, Marks ES, Lewin DI, eds. Advances in Patient Safety: From Research to Implementation (Volume 2: Concepts and Methodology). Rockville, MD: Agency for Healthcare Research and Quality; 2005:241-254.
2. Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med. 2003;78:775-780.
3. Berner ES, Graber ML. Overconfidence as a cause of diagnostic error in medicine. Am J Med. 2008;121(5A):S2-S23.
4. Saber Tehrani AS, Lee H, Mathews SC, et al. 25-year summary of US malpractice claims for diagnostic errors 1986-2010: an analysis from the National Practitioner Data Bank. BMJ Qual Saf. 2013;22(8):672-680.
5. Kohn LT, Corrigan JM, Donaldson MS, eds. To Err Is Human: Building a Safer Health System. Washington, DC: National Academy of Sciences; 1999.
6. Graber M, Gordon R, Franklin N. Reducing diagnostic errors in medicine: what’s the goal? Acad Med. 2002;77:981-992.
7. Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Arch Intern Med. 2005;165(13):1493-1499.
8. Poon EG, Gandhi TK, Sequist TD, et al. “I wish I had seen this test result earlier!”: dissatisfaction with test result management systems in primary care. Arch Intern Med. 2004;164(20):2223-2228.
9. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 1: origins of bias and theory of debiasing. BMJ Qual Saf. 2013;22(suppl 2):ii58-ii64.
10. Croskerry P, Nimmo GR. Better clinical decision making and reducing diagnostic error. J R Coll Physicians Edinb. 2011;41(2):155-162.
11. Wilson TD, Schooler JW. Thinking too much: introspection can reduce the quality of preferences and decisions. J Pers Soc Psychol. 1991;60(2):181-192.
12. Croskerry P. A universal model of diagnostic reasoning. Acad Med. 2009;84(8):1022-1028.
13. Norman G, Young M, Brooks L. Non-analytical models of clinical reasoning: the role of experience. Med Educ. 2007;41(12):1140-1145.
14. Norman GR, Eva KW. Diagnostic error and clinical reasoning. Med Educ. 2010;44(1):94-100.
15. Sherbino J, Dore KL, Wood TJ, et al. The relationship between response time and diagnostic accuracy. Acad Med. 2012;87(6):785-791.
16. Petrie D, Campbell S. Clinical decision making, fast and slow. Acad Med. 2013;88(5):557.
17. Lakoff G, Johnson M. Philosophy in the Flesh: The Embodied Mind and Its Challenge to Western Thought. New York, NY: Basic Books; 1999.
18. Sinclair D, Croskerry P. Patient safety and diagnostic error: tips for your next shift. Can Fam Physician. 2010;56(1):28-30.
19. Croskerry P. From mindless to mindful practice—cognitive bias and clinical decision making. N Engl J Med. 2013;368(26):2445-2448.
20. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 2: impediments to and strategies for change. BMJ Qual Saf. 2013;22(suppl 2):ii65-ii72.
21. Groves M. Understanding clinical reasoning: the next step in working out how it really works. Med Educ. 2012;46(5):444-446.
22. Trowbridge RL, Dhaliwal G, Cosby KS. Educational agenda for diagnostic error reduction. BMJ Qual Saf. 2013;22(suppl 2):ii28-ii32.