
Diagnostic error—defined as a failure to establish an accurate and timely explanation of the patient’s health problem—is an important source of patient harm.1 Data suggest that all patients will experience at least 1 diagnostic error in their lifetime.2-4 Not surprisingly, diagnostic errors are among the leading categories of paid malpractice claims in the United States.5

Although diagnostic errors cause substantial morbidity and sometimes death in the hospital,6,7 little is known about how residents and learners approach diagnostic decision making. Errors in diagnosis are believed to stem from cognitive or system failures,8 with cognitive errors thought to arise when rapid, reflexive thinking operates in the absence of a more analytical, deliberate process. System-based problems (eg, lack of expert availability, technology barriers, and access to data) have also been cited as contributors.9 However, whether and how these factors apply to trainees is not known.

Therefore, we conducted a focused ethnography of inpatient medicine teams (ie, attendings, residents, interns, and medical students) in 2 affiliated teaching hospitals, aiming to (a) observe the process of diagnosis by trainees and (b) identify methods to improve the diagnostic process and prevent errors.

METHODS

We designed a multimethod, focused ethnographic study to examine diagnostic decision making in hospital settings.10,11 In contrast to anthropologic ethnographies that study entire fields using open-ended questions, our study was designed to examine the process of diagnosis from the perspective of clinicians engaged in this activity.11 This approach allowed us to capture diagnostic decisions and cognitive and system-based factors in a manner currently lacking in the literature.12

Setting and Participants

Between January 2016 and May 2016, we observed the members of 4 inpatient internal medicine teaching teams at 2 affiliated teaching hospitals. We purposefully selected teaching teams for observation because they are the primary model of care in academic settings and we have expertise in carrying out similar studies.13,14 Teaching teams typically consisted of 1 medical attending (a senior-level physician), 1 senior resident (a second- or third-year postgraduate trainee), 2 interns (trainees in their first postgraduate year), and 2 to 4 medical students. Teams were selected at random using existing schedules and followed Monday to Friday so as to permit observation of work on call and noncall days. Owing to staffing limitations, weekend and night shifts were not observed; however, overnight events were captured during morning rounds.

Most teams began rounds at 8:30 AM. Rounds typically lasted 90-120 min and concluded with a recap (ie, “running the list”), a review of explicit plans for each patient after evaluation by the attending. This discussion often occurred in the team rooms, with the attending leading the discussion with the trainees.

Data Collection

A multidisciplinary team, including clinicians (eg, physicians, nurses), nonclinicians (eg, qualitative researchers, social scientists), and healthcare engineers, conducted the observations. We observed preround activities of interns and residents before arrival of the attending (7:00 AM - 8:30 AM), followed by morning rounds with the entire team, and afternoon work that included senior residents, interns, and students.

To capture multiple aspects of the diagnostic process, we collected data using field notes modeled on components of the National Academies of Sciences, Engineering, and Medicine model for diagnosis (Appendix).1,15 This model encompasses phases of the diagnostic process (eg, data gathering, integration, formulation of a working diagnosis, treatment delivery, and outcomes) and the work system (team members, organization, technology and tools, physical environment, and tasks).

Focus Groups and Interviews

At the end of weekly observations, we conducted focus groups with the residents and one-on-one interviews with the attendings. Focus groups with the residents were conducted to encourage group discussion about the diagnostic process. Separate interviews with the attendings were performed to ensure that power differentials did not influence discussions. During focus groups, we specifically asked about challenges and possible solutions to improve diagnosis. Experienced qualitative methodologists (J.F., M.H., M.Q.) used semistructured interview guides for discussions (Appendix).

Data Analysis

After aggregating and reading the data, 3 reviewers (V.C., S.K., S.S.) began inductive analysis by handwriting notes and initial reflective thoughts to create preliminary codes. Multiple team members then reread the original field notes and the focus group/interview data to refine the preliminary codes and to develop additional codes. Next, relationships between codes were identified and used to develop key themes. Data from observations were triangulated with data from the interview and focus group sessions to compare what we inferred from observation with what team members verbalized. The developed themes were discussed as a group to ensure consistency of major findings.

Ethical and Regulatory Oversight

This study was reviewed and approved by the Institutional Review Boards at the University of Michigan Health System (HUM-00106657) and the VA Ann Arbor Healthcare System (1-2016-010040).

RESULTS

Four teaching teams (4 attendings, 4 senior residents, 9 interns, and 14 medical students) were observed over 33 distinct shifts and 168 hours. Observations included morning rounds (96 h), postround call days (52 h), and postround noncall days (20 h). Morning rounds lasted an average of 127 min (range: 48-232 min) and included an average of 9 patients (range: 4-16 patients).

Themes Regarding the Diagnostic Process

We identified the following 4 primary themes related to the diagnostic process in teaching hospitals: (1) diagnosis is a social phenomenon; (2) data necessary to make diagnoses are fragmented; (3) distractions undermine the diagnostic process; and (4) time pressures interfere with diagnostic decision making (Appendix Table 1).

(1) Diagnosis is a Social Phenomenon

Team members viewed the process of diagnosis as a social exchange of facts, findings, and strategies within a defined structure. The opportunity to discuss impressions with others was valued as a means to share, test, and process assumptions.

“Rounds are the most important part of the process. That is where we make most decisions in a collective, collaborative way with the attending present. We bounce ideas off each other.” (Intern)

Typical of social processes, variations based on time of day and schedule were observed. For instance, during call days, learners gathered data and formed working diagnoses and treatment plans with minimal attending interaction. This separation of roles and responsibilities introduced a hierarchy within diagnosis as follows:

“The interns would not call me first; they would talk to the senior resident and then if the senior thought he should chat with me, then they would call. But for the most part, they gather information and come up with the plan.” (Attending)

The work system was well suited to facilitating social interactions. For instance, designated team rooms (with team members informally assigned to a computer) placed the resident in close physical proximity to interns and medical students. In this space, numerous informal discussions between team members (eg, “What do you think about this test?” “I’m not sure what to do about this finding.” “Should I call a [consult] on this patient?”) were observed. Although proximity was viewed as beneficial, dangers to the social nature of diagnosis in the form of anchoring (ie, a cognitive bias in which emphasis is placed on the first piece of data)16 were also mentioned. Similarly, the paradox associated with social proof (ie, the pressure to conform within a group) was also observed: disagreement between team members and attendings rarely occurred during observations.

“I mean, they’re the attending, right? It’s hard to argue with them when they want a test or something done. When I do push back, it’s rare that others will support me–so it’s usually me and the attending.” (Resident)

“I would push back if I think it’s really bad for the patient or could cause harm–but the truth is, it doesn’t happen much.” (Intern)

(2) Data Necessary to Make Diagnoses are Fragmented

Team members universally cited fragmentation in data delivery, retrieval, and processing as a barrier to diagnosis. Team members indicated that test results might not be looked at or acted upon in a timely manner, and participants pointed to the electronic medical record as a source of this challenge.

“Before I knew about [the app for Epic], I would literally sit on the computer to get all the information we would need on rounds. It’s key to making decisions. We often say we will do something, only to find the test result doesn’t support it–and then we’re back to square 1.” (Intern)

Information used by teams came from myriad sources (eg, patients, family members, electronic records) and from various settings (eg, emergency department, patient rooms, discussions with consultants). Additionally, test results often appeared without warning. Thus, availability of information was poorly aligned with clinical duties.

“They (the lab) will call us when a blood culture is positive or something is off. That is very helpful but it often comes later in the day, when we’re done with rounds.” (Resident)

The work system was highlighted as a key contributor to data fragmentation. Peculiarities of our electronic medical record (EMR) and how data were collected, stored, or presented were described as “frustrating” and “unsafe” by team members. Correspondingly, we frequently observed interns asking for assistance with tasks such as ordering tests or finding information despite having been “trained” to use the EMR.

“People have to learn how to filter, how to recognize the most important points and link data streams together in terms of causality. But we assume they know where to find that information. It’s actually a very hard thing to do, for both the house staff and me.” (Attending)

(3) Distractions Undermine the Diagnostic Process

Distractions often created cognitive difficulties. For example, ambient noise and interruptions from neighbors working on other teams were cited as barriers to diagnosis. In addition, we observed several team members using headphones to drown out ambient noise while working on the computer.

“I know I shouldn’t do it (wear headphones), but I have no other way of turning down the noise so I can concentrate.” (Intern)

Similarly, the unpredictable nature and the volume of pages often interrupted thinking about diagnosis.

“Sometimes the pager just goes off all the time and (after making sure it’s not an urgent issue), I will just ignore it for a bit, especially if I am in the middle of something. It would be great if I could finish my thought process knowing I would not be interrupted.” (Resident)

To mitigate this problem, 1 attending described how he would proactively seek out nurses caring for his patients to “head off” questions (eg, “I will renew the restraints and medications this morning,” and “Is there anything you need in terms of orders for this patient that I can take care of now?”) that might lead to pages. Another resident described his approach as follows:

“I make it a point to tell the nurses where I will be hanging out and where they can find me if they have any questions. I tell them to come talk to me rather than page me since that will be less distracting.” (Resident)

Most of the interns described documentation work such as writing admission and progress notes in negative terms (“an academic exercise,” “part of the billing activity”). However, in the context of interruptions, some described this as helpful.

“The most valuable part of the thinking process was writing the assessment and plan because that’s actually my schema for all problems. It literally is the only time where I can sit and collect my thoughts to formulate a diagnosis and plan.” (Intern)

(4) Time Pressures Interfere With Diagnostic Decision Making

All team members spoke about the challenge of finding time for diagnosis during the workday. Often, they had to skip learning sessions for this purpose.

“They tell us we should go to morning report or noon conference but when I’m running around trying to get things done. I hate having to choose between my education and doing what’s best for the patient–but that’s often what it comes down to.” (Intern)

When asked whether setting aside dedicated time to specifically review and formulate diagnoses would be valuable, respondents were uniformly enthusiastic. Team members described attentional conflicts as being worst when “cross covering” other teams on call days, as their patient load effectively doubled during this time. Of note, cross-covering occurred when teams were also on call—and thus took them away from important diagnostic activities such as data gathering and synthesis for patients they were admitting.

“If you were to ever design a system where errors were likely–this is how you would design it: take a team with little supervision, double their patient load, keep them busy with new challenging cases and then ask questions about patients they know little about.” (Resident)

DISCUSSION

Although diagnostic errors have been called “the next frontier for patient safety,”17 little is known about the process, barriers, and facilitators of diagnosis in teaching hospitals. In this focused ethnography conducted at 2 academic medical centers, we identified multiple cognitive and system-level challenges and, from trainees engaged in this activity, potential strategies to improve diagnosis. Key themes identified by those we observed included the social nature of diagnosis, fragmented information delivery, constant distractions and interruptions, and time pressures. In turn, these insights allow us to generate strategies that can be applied to improve the diagnostic process in teaching hospitals.

Our study underscores the importance of social interactions in diagnosis. In contrast, most interventions to prevent diagnostic errors target individual providers through practices such as metacognition and “thinking about thinking.”18-20 These interventions are based on Daniel Kahneman’s work on dual thought processes. Type 1 thought processes are fast, subconscious, reflexive, largely intuitive, and more vulnerable to error. In contrast, Type 2 processes are slower, deliberate, analytic, and less prone to error.21 Although an individual’s Type 2 thought capacity is limited, a major goal of cognitive interventions is to encourage Type 2 over Type 1 thinking, an approach termed “de-biasing.”22-24 Unfortunately, cognitive interventions testing such approaches have had mixed results–perhaps because they lack a focus on collective wisdom or group thinking, which our findings suggest may be key to diagnosis.9,25 In this sense, morning rounds were a social gathering used to strategize and develop care plans, but with limited time to think about diagnosis.26 Introducing defined periods for individuals to engage in diagnostic activities such as de-biasing (ie, asking “what else could this be?”)27 before or after rounds may provide an opportunity for reflection and improve diagnosis. In addition, embedding tools such as diagnosis expanders and checklists within these defined time slots28,29 may prove useful for reflecting on diagnoses and preventing diagnostic errors.

An unexpected yet important finding from this study was the challenge posed by distractions and the physical environment. Potentially maladaptive workarounds to these interruptions included the use of headphones; more productive strategies included updating nurses with plans to avert pages and creating a list of activities to ensure that key tasks were not forgotten.30,31 Applying lessons from aviation, a focused effort to limit distractions during key portions of the day might be worth considering for diagnostic safety.32 Similarly, improving the environment in which diagnosis occurs—including creating spaces that are quiet, orderly, and optimized for thinking—may be valuable.33

Our study has limitations. First, our findings are limited to direct observations; we are thus unable to comment on how unobserved aspects of care (eg, cognitive processes) might have influenced our findings. Our observations of clinical care might also have introduced a Hawthorne effect. However, because we were closely integrated with teams and conducted focus groups to corroborate our assessments, we believe that this was not the case. Second, we did not identify diagnostic errors or link the processes we observed to errors. Third, our approach was limited to 2 teaching centers, thereby limiting the generalizability of findings. Relatedly, we were only able to conduct observations during weekdays; differences in weekend and night resources might affect our insights.

The cognitive and system-based barriers faced by clinicians in teaching hospitals suggest that new methods to improve diagnosis are needed. Future interventions, such as defined “time-outs” for diagnosis, strategies to limit distractions, and methods to improve communication between team members, are novel and have parallels in other industries. Because challenges in quantifying diagnostic errors abound,34 improving cognitive and system-based factors through better communication, concentration, and organization is necessary to improve medical decision making in academic medical centers.

Disclosures

None declared for all coauthors.

Funding

This project was supported by grant number P30HS024385 from the Agency for Healthcare Research and Quality. The funding source played no role in study design, data acquisition, analysis, or the decision to report these data. Dr. Chopra is supported by a career development award from the Agency for Healthcare Research and Quality (1-K08-HS022835-01). Dr. Krein is supported by a VA Health Services Research and Development Research Career Scientist Award (RCS 11-222). Dr. Singh is partially supported by the Houston VA HSR&D Center for Innovations in Quality, Effectiveness and Safety (CIN 13-413). The content is solely the responsibility of the authors and does not necessarily represent the official views of the Agency for Healthcare Research and Quality or the Department of Veterans Affairs.

References

1. National Academies of Sciences, Engineering, and Medicine. Improving Diagnosis in Health Care. Washington, DC: The National Academies Press; 2015. https://doi.org/10.17226/21794. Accessed November 1, 2016.
2. Schiff GD, Hasan O, Kim S, et al. Diagnostic error in medicine: analysis of 583 physician-reported errors. Arch Intern Med. 2009;169(20):1881-1887. http://dx.doi.org/10.1001/archinternmed.2009.333. PubMed
3. Sonderegger-Iseli K, Burger S, Muntwyler J, Salomon F. Diagnostic errors in three medical eras: a necropsy study. Lancet. 2000;355(9220):2027-2031. http://dx.doi.org/10.1016/S0140-6736(00)02349-7. PubMed
4. Winters B, Custer J, Galvagno SM Jr, et al. Diagnostic errors in the intensive care unit: a systematic review of autopsy studies. BMJ Qual Saf. 2012;21(11):894-902. http://dx.doi.org/10.1136/bmjqs-2012-000803. PubMed
5. Saber Tehrani AS, Lee H, Mathews SC, et al. 25-Year summary of US malpractice claims for diagnostic errors 1986-2010: an analysis from the National Practitioner Data Bank. BMJ Qual Saf. 2013;22(8):672-680. http://dx.doi.org/10.1136/bmjqs-2012-001550. PubMed
6. Graber M, Gordon R, Franklin N. Reducing diagnostic errors in medicine: what’s the goal? Acad Med. 2002;77(10):981-992. http://dx.doi.org/10.1097/00001888-200210000-00009. PubMed
7. Gupta A, Snyder A, Kachalia A, Flanders S, Saint S, Chopra V. Malpractice claims related to diagnostic errors in the hospital. BMJ Qual Saf. 2018;27(1):53-60. http://dx.doi.org/10.1136/bmjqs-2017-006774. PubMed
8. van Noord I, Eikens MP, Hamersma AM, de Bruijne MC. Application of root cause analysis on malpractice claim files related to diagnostic failures. Qual Saf Health Care. 2010;19(6):e21. http://dx.doi.org/10.1136/qshc.2008.029801. PubMed
9. Croskerry P, Petrie DA, Reilly JB, Tait G. Deciding about fast and slow decisions. Acad Med. 2014;89(2):197-200. http://dx.doi.org/10.1097/ACM.0000000000000121. PubMed
10. Higginbottom GM, Pillay JJ, Boadu NY. Guidance on performing focused ethnographies with an emphasis on healthcare research. Qual Rep. 2013;18(9):1-6. https://doi.org/10.7939/R35M6287P.
11. Savage J. Participative observation: standing in the shoes of others? Qual Health Res. 2000;10(3):324-339. http://dx.doi.org/10.1177/104973200129118471. PubMed
12. Patton MQ. Qualitative Research and Evaluation Methods. 3rd ed. Thousand Oaks, CA: SAGE Publications; 2002.
13. Harrod M, Weston LE, Robinson C, Tremblay A, Greenstone CL, Forman J. “It goes beyond good camaraderie”: a qualitative study of the process of becoming an interprofessional healthcare “teamlet.” J Interprof Care. 2016;30(3):295-300. http://dx.doi.org/10.3109/13561820.2015.1130028. PubMed
14. Houchens N, Harrod M, Moody S, Fowler KE, Saint S. Techniques and behaviors associated with exemplary inpatient general medicine teaching: an exploratory qualitative study. J Hosp Med. 2017;12(7):503-509. http://dx.doi.org/10.12788/jhm.2763. PubMed
15. Mulhall A. In the field: notes on observation in qualitative research. J Adv Nurs. 2003;41(3):306-313. http://dx.doi.org/10.1046/j.1365-2648.2003.02514.x. PubMed
16. Zwaan L, Monteiro S, Sherbino J, Ilgen J, Howey B, Norman G. Is bias in the eye of the beholder? A vignette study to assess recognition of cognitive biases in clinical case workups. BMJ Qual Saf. 2017;26(2):104-110. http://dx.doi.org/10.1136/bmjqs-2015-005014. PubMed
17. Singh H, Graber ML. Improving diagnosis in health care--the next imperative for patient safety. N Engl J Med. 2015;373(26):2493-2495. http://dx.doi.org/10.1056/NEJMp1512241. PubMed
18. Croskerry P. From mindless to mindful practice--cognitive bias and clinical decision making. N Engl J Med. 2013;368(26):2445-2448. http://dx.doi.org/10.1056/NEJMp1303712. PubMed
19. van den Berge K, Mamede S. Cognitive diagnostic error in internal medicine. Eur J Intern Med. 2013;24(6):525-529. http://dx.doi.org/10.1016/j.ejim.2013.03.006. PubMed
20. Norman G, Sherbino J, Dore K, et al. The etiology of diagnostic errors: a controlled trial of system 1 versus system 2 reasoning. Acad Med. 2014;89(2):277-284. http://dx.doi.org/10.1097/ACM.0000000000000105. PubMed
21. Dhaliwal G. Premature closure? Not so fast. BMJ Qual Saf. 2017;26(2):87-89. http://dx.doi.org/10.1136/bmjqs-2016-005267. PubMed
22. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 1: origins of bias and theory of debiasing. BMJ Qual Saf. 2013;22(suppl 2):ii58-ii64. http://dx.doi.org/10.1136/bmjqs-2012-001712. PubMed
23. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 2: impediments to and strategies for change. BMJ Qual Saf. 2013;22(suppl 2):ii65-ii72. http://dx.doi.org/10.1136/bmjqs-2012-001713. PubMed
24. Reilly JB, Ogdie AR, Von Feldt JM, Myers JS. Teaching about how doctors think: a longitudinal curriculum in cognitive bias and diagnostic error for residents. BMJ Qual Saf. 2013;22(12):1044-1050. http://dx.doi.org/10.1136/bmjqs-2013-001987. PubMed
25. Schmidt HG, Mamede S, van den Berge K, van Gog T, van Saase JL, Rikers RM. Exposure to media information about a disease can cause doctors to misdiagnose similar-looking clinical cases. Acad Med. 2014;89(2):285-291. http://dx.doi.org/10.1097/ACM.0000000000000107. PubMed
26. Hess BJ, Lipner RS, Thompson V, Holmboe ES, Graber ML. Blink or think: can further reflection improve initial diagnostic impressions? Acad Med. 2015;90(1):112-118. http://dx.doi.org/10.1097/ACM.0000000000000550. PubMed
27. Lambe KA, O’Reilly G, Kelly BD, Curristan S. Dual-process cognitive interventions to enhance diagnostic reasoning: a systematic review. BMJ Qual Saf. 2016;25(10):808-820. http://dx.doi.org/10.1136/bmjqs-2015-004417. PubMed
28. Graber ML, Kissam S, Payne VL, et al. Cognitive interventions to reduce diagnostic error: a narrative review. BMJ Qual Saf. 2012;21(7):535-557. http://dx.doi.org/10.1136/bmjqs-2011-000149. PubMed
29. McDonald KM, Matesic B, Contopoulos-Ioannidis DG, et al. Patient safety strategies targeted at diagnostic errors: a systematic review. Ann Intern Med. 2013;158(5 Pt 2):381-389. http://dx.doi.org/10.7326/0003-4819-158-5-201303051-00004. PubMed
30. Wray CM, Chaudhry S, Pincavage A, et al. Resident shift handoff strategies in US internal medicine residency programs. JAMA. 2016;316(21):2273-2275. http://dx.doi.org/10.1001/jama.2016.17786. PubMed
31. Choo KJ, Arora VM, Barach P, Johnson JK, Farnan JM. How do supervising physicians decide to entrust residents with unsupervised tasks? A qualitative analysis. J Hosp Med. 2014;9(3):169-175. http://dx.doi.org/10.1002/jhm.2150. PubMed
32. Carayon P, Wood KE. Patient safety - the role of human factors and systems engineering. Stud Health Technol Inform. 2010;153:23-46.
33. Carayon P, Xie A, Kianfar S. Human factors and ergonomics as a patient safety practice. BMJ Qual Saf. 2014;23(3):196-205. http://dx.doi.org/10.1136/bmjqs-2013-001812. PubMed
34. McGlynn EA, McDonald KM, Cassel CK. Measurement is essential for improving diagnosis and reducing diagnostic error: A report from the Institute of Medicine. JAMA. 2015;314(23):2501-2502. http://dx.doi.org/10.1001/jama.2015.13453. PubMed
 

Journal of Hospital Medicine. 2018;13(10):668-672. Published online first April 25, 2018.


“I would push back if I think it’s really bad for the patient or could cause harm–but the truth is, it doesn’t happen much.” (Intern)

(2) Data Necessary to Make Diagnoses are Fragmented

Team members universally cited fragmentation in data delivery, retrieval, and processing as a barrier to diagnosis. Team members indicated that test results might not be looked at or acted upon in a timely manner, and participants pointed to the electronic medical record as a source of this challenge.

“Before I knew about [the app for Epic], I would literally sit on the computer to get all the information we would need on rounds. Its key to making decisions. We often say we will do something, only to find the test result doesn’t support it–and then we’re back to square 1.” (Intern)

Information used by teams came from myriad sources (eg, patients, family members, electronic records) and from various settings (eg, emergency department, patient rooms, discussions with consultants). Additionally, test results often appeared without warning. Thus, availability of information was poorly aligned with clinical duties.

 

 

“They (the lab) will call us when a blood culture is positive or something is off. That is very helpful but it often comes later in the day, when we’re done with rounds.” (Resident)

The work system was highlighted as a key contributor to data fragmentation. Peculiarities of our electronic medical record (EMR) and how data were collected, stored, or presented were described as “frustrating,” and “unsafe,” by team members. Correspondingly, we frequently observed interns asking for assistance for tasks such as ordering tests or finding information despite being “trained” to use the EMR.

“People have to learn how to filter, how to recognize the most important points and link data streams together in terms of causality. But we assume they know where to find that information. It’s actually a very hard thing to do, for both the house staff and me.” (Attending)

(3) Distractions Undermine the Diagnostic Process

Distractions often created cognitive difficulties. For example, ambient noise and interruptions from neighbors working on other teams were cited as barriers to diagnosis. In addition, we observed several team members using headphones to drown out ambient noise while working on the computer.

“I know I shouldn’t do it (wear headphones), but I have no other way of turning down the noise so I can concentrate.” (Intern)

Similarly, the unpredictable nature and the volume of pages often interrupted thinking about diagnosis.

“Sometimes the pager just goes off all the time and (after making sure its not an urgent issue), I will just ignore it for a bit, especially if I am in the middle of something. It would be great if I could finish my thought process knowing I would not be interrupted.” (Resident)

To mitigate this problem, 1 attending described how he would proactively seek out nurses caring for his patients to “head off” questions (eg, “I will renew the restraints and medications this morning,” and “Is there anything you need in terms of orders for this patient that I can take care of now?”) that might lead to pages. Another resident described his approach as follows:

“I make it a point to tell the nurses where I will be hanging out and where they can find me if they have any questions. I tell them to come talk to me rather than page me since that will be less distracting.” (Resident).

Most of the interns described documentation work such as writing admission and progress notes in negative terms (“an academic exercise,” “part of the billing activity”). However, in the context of interruptions, some described this as helpful.

“The most valuable part of the thinking process was writing the assessment and plan because that’s actually my schema for all problems. It literally is the only time where I can sit and collect my thoughts to formulate a diagnosis and plan.” (Intern)

(4) Time Pressures Interfere With Diagnostic Decision Making

All team members spoke about the challenge of finding time for diagnosis during the workday. Often, they had to skip learning sessions for this purpose.

“They tell us we should go to morning report or noon conference but when I’m running around trying to get things done. I hate having to choose between my education and doing what’s best for the patient–but that’s often what it comes down to.” (Intern)

When specifically asked whether setting aside dedicated time to specifically review and formulate diagnoses would be valuable, respondents were uniformly enthusiastic. Team members described attentional conflicts as being the worst when “cross covering” other teams on call days, as their patient load effectively doubled during this time. Of note, cross-covering occurred when teams were also on call—and thus took them away from important diagnostic activities such as data gathering or synthesis for patients they were admitting.

“If you were to ever design a system where errors were likely–this is how you would design it: take a team with little supervision, double their patient load, keep them busy with new challenging cases and then ask questions about patients they know little about.” (Resident)

DISCUSSION

Although diagnostic errors have been called “the next frontier for patient safety,”17 little is known about the process, barriers, and facilitators to diagnosis in teaching hospitals. In this focused ethnography conducted at 2 academic medical centers, we identified multiple cognitive and system-level challenges and potential strategies to improve diagnosis from trainees engaged in this activity. Key themes identified by those we observed included the social nature of diagnosis, fragmented information delivery, constant distractions and interruptions, and time pressures. In turn, these insights allow us to generate strategies that can be applied to improve the diagnostic process in teaching hospitals.

 

 

Our study underscores the importance of social interactions in diagnosis. In contrast, most of the interventions to prevent diagnostic errors target individual providers through practices such as metacognition and “thinking about thinking.”18-20 These interventions are based on Daniel Kahnemann’s work on dual thought process. Type 1 thought processes are fast, subconscious, reflexive, largely intuitive, and more vulnerable to error. In contrast, Type 2 processes are slower, deliberate, analytic, and less prone to error.21 Although an individual’s Type 2 thought capacity is limited, a major goal of cognitive interventions is to encourage Type 2 over Type 1 thinking, an approach termed “de-biasing.”22-24 Unfortunately, cognitive interventions testing such approaches have suffered mixed results–perhaps because of lack of focus on collective wisdom or group thinking, which may be key to diagnosis from our findings.9,25 In this sense, morning rounds were a social gathering used to strategize and develop care plans, but with limited time to think about diagnosis.26 Introduction of defined periods for individuals to engage in diagnostic activities such as de-biasing (ie, asking “what else could this be)27 before or after rounds may provide an opportunity for reflection and improving diagnosis. In addition, embedding tools such as diagnosis expanders and checklists within these defined time slots28,29 may prove to be useful in reflecting on diagnosis and preventing diagnostic errors.

An unexpected yet important finding from this study were the challenges posed by distractions and the physical environment. Potentially maladaptive workarounds to these interruptions included use of headphones; more productive strategies included updating nurses with plans to avert pages and creating a list of activities to ensure that key tasks were not forgotten.30,31 Applying lessons from aviation, a focused effort to limit distractions during key portions of the day, might be worth considering for diagnostic safety.32 Similarly, improving the environment in which diagnosis occurs—including creating spaces that are quiet, orderly, and optimized for thinking—may be valuable.33Our study has limitations. First, our findings are limited to direct observations; we are thus unable to comment on how unobserved aspects of care (eg, cognitive processes) might have influenced our findings. Our observations of clinical care might also have introduced a Hawthorne effect. However, because we were closely integrated with teams and conducted focus groups to corroborate our assessments, we believe that this was not the case. Second, we did not identify diagnostic errors or link processes we observed to errors. Third, our approach is limited to 2 teaching centers, thereby limiting the generalizability of findings. Relatedly, we were only able to conduct observations during weekdays; differences in weekend and night resources might affect our insights.

The cognitive and system-based barriers faced by clinicians in teaching hospitals suggest that new methods to improve diagnosis are needed. Future interventions such as defined “time-outs” for diagnosis, strategies focused on limiting distractions, and methods to improve communication between team members are novel and have parallels in other industries. As challenges to quantify diagnostic errors abound,34 improving cognitive- and system-based factors via reflection through communication, concentration, and organization is necessary to improve medical decision making in academic medical centers.

Disclosures

None declared for all coauthors.

Funding

This project was supported by grant number P30HS024385 from the Agency for Healthcare Research and Quality. The funding source played no role in study design, data acquisition, analysis or decision to report these data. Dr. Chopra is supported by a career development award from the Agency of Healthcare Research and Quality (1-K08-HS022835-01). Dr. Krein is supported by a VA Health Services Research and Development Research Career Scientist Award (RCS 11-222). Dr. Singh is partially supported by Houston VA HSR&D Center for Innovations in Quality, Effectiveness and Safety (CIN 13-413). The content is solely the responsibility of the authors and does not necessarily represent the official views of the Agency for Healthcare Research and Quality or the Department of Veterans Affairs.

Diagnostic error—defined as a failure to establish an accurate and timely explanation of the patient’s health problem—is an important source of patient harm.1 Data suggest that all patients will experience at least 1 diagnostic error in their lifetime.2-4 Not surprisingly, diagnostic errors are among the leading categories of paid malpractice claims in the United States.5

Despite diagnostic errors being morbid and sometimes deadly in the hospital,6,7 little is known about how residents and learners approach diagnostic decision making. Errors in diagnosis are believed to stem from cognitive or system failures,8 with errors in cognition believed to occur due to rapid, reflexive thinking operating in the absence of a more analytical, deliberate process. System-based problems (eg, lack of expert availability, technology barriers, and access to data) have also been cited as contributors.9 However, whether and how these apply to trainees is not known.

Therefore, we conducted a focused ethnography of inpatient medicine teams (ie, attendings, residents, interns, and medical students) in 2 affiliated teaching hospitals, aiming to (a) observe the process of diagnosis by trainees and (b) identify methods to improve the diagnostic process and prevent errors.

METHODS

We designed a multimethod, focused ethnographic study to examine diagnostic decision making in hospital settings.10,11 In contrast to anthropologic ethnographies that study entire fields using open-ended questions, our study was designed to examine the process of diagnosis from the perspective of clinicians engaged in this activity.11 This approach allowed us to capture diagnostic decisions and cognitive and system-based factors in a manner currently lacking in the literature.12

Setting and Participants

Between January 2016 and May 2016, we observed the members of four inpatient internal medicine teaching teams at 2 affiliated teaching hospitals. We purposefully selected teaching teams for observation because they are the primary model of care in academic settings and we have expertise in carrying out similar studies.13,14 Teaching teams typically consisted of a medical attending (senior-level physician), 1 senior resident (a second- or third-year postgraduate trainee), two interns (a trainee in their first postgraduate year), and two to four  medical students. Teams were selected at random using existing schedules and followed Monday to Friday so as to permit observation of work on call and noncall days. Owing to manpower limitations, weekend and night shifts were not observed. However, overnight events were captured during morning rounds.

Most of the teams began rounds at 8:30 AM. Rounds typically lasted 90–120 min and concluded with a recap (ie, “running the list”), during which explicit plans for each patient were reviewed after the attending had evaluated them. This discussion often occurred in the team rooms, with the attending leading the discussion with the trainees.

Data Collection

A multidisciplinary team, including clinicians (eg, physicians, nurses), nonclinicians (eg, qualitative researchers, social scientists), and healthcare engineers, conducted the observations. We observed preround activities of interns and residents before arrival of the attending (7:00 AM - 8:30 AM), followed by morning rounds with the entire team, and afternoon work that included senior residents, interns, and students.

To capture multiple aspects of the diagnostic process, we collected data using field notes modeled on components of the National Academy of Science model for diagnosis (Appendix).1,15 This model encompasses phases of the diagnostic process (eg, data gathering, integration, formulation of a working diagnosis, treatment delivery, and outcomes) and the work system (team members, organization, technology and tools, physical environment, tasks).

Focus Groups and Interviews

At the end of each week of observation, we conducted focus groups with the residents and one-on-one interviews with the attendings. Focus groups encouraged group discussion among residents about the diagnostic process; attendings were interviewed separately to ensure that power differentials did not influence the discussion. During focus groups, we specifically asked about challenges and possible solutions to improve diagnosis. Experienced qualitative methodologists (J.F., M.H., M.Q.) used semistructured interview guides for discussions (Appendix).

Data Analysis

After aggregating and reading the data, three reviewers (V.C., S.K., S.S.) began inductive analysis by handwriting notes and initial reflective thoughts to create preliminary codes. Multiple team members then reread the original field notes and the focus group/interview data to refine the preliminary codes and develop additional codes. Next, relationships between codes were identified and used to develop key themes. Data from observations were triangulated with data from the interview/focus group sessions to compare what we inferred through observation with what team members verbalized. The developed themes were discussed as a group to ensure consistency of major findings.

Ethical and Regulatory Oversight

This study was reviewed and approved by the Institutional Review Boards at the University of Michigan Health System (HUM-00106657) and the VA Ann Arbor Healthcare System (1-2016-010040).

RESULTS

Four teaching teams (4 attendings, 4 senior residents, 9 interns, and 14 medical students) were observed over 33 distinct shifts and 168 hours. Observations included morning rounds (96 h), postround call days (52 h), and postround noncall days (20 h). Morning rounds lasted an average of 127 min (range: 48-232 min) and included an average of 9 patients (range: 4-16 patients).

Themes Regarding the Diagnostic Process

We identified the following 4 primary themes related to the diagnostic process in teaching hospitals: (1) diagnosis is a social phenomenon; (2) data necessary to make diagnoses are fragmented; (3) distractions undermine the diagnostic process; and (4) time pressures interfere with diagnostic decision making (Appendix Table 1).

(1) Diagnosis Is a Social Phenomenon

Team members viewed the process of diagnosis as a social exchange of facts, findings, and strategies within a defined structure. The opportunity to discuss impressions with others was valued as a means to share, test, and process assumptions.

“Rounds are the most important part of the process. That is where we make most decisions in a collective, collaborative way with the attending present. We bounce ideas off each other.” (Intern)

Typical of social processes, variations based on time of day and schedule were observed. For instance, during call days, learners gathered data and formed working diagnoses and treatment plans with minimal attending interaction. This separation of roles and responsibilities introduced a hierarchy within diagnosis:

“The interns would not call me first; they would talk to the senior resident and then if the senior thought he should chat with me, then they would call. But for the most part, they gather information and come up with the plan.” (Attending)

The work system was well suited to facilitating social interactions. For instance, designated rooms (with team members informally assigned to a computer) placed the resident in physical proximity to interns and medical students. In this space, numerous informal discussions between team members (eg, “What do you think about this test?” “I’m not sure what to do about this finding.” “Should I call a [consult] on this patient?”) were observed. Although proximity was viewed as beneficial, dangers to the social nature of diagnosis were also mentioned, such as anchoring (ie, a cognitive bias in which emphasis is placed on the first piece of data).16 Similarly, the paradox associated with social proof (ie, the pressure to conform within a group) was also apparent: disagreement between team members and attendings rarely occurred during observations.

“I mean, they’re the attending, right? It’s hard to argue with them when they want a test or something done. When I do push back, it’s rare that others will support me–so it’s usually me and the attending.” (Resident)

“I would push back if I think it’s really bad for the patient or could cause harm–but the truth is, it doesn’t happen much.” (Intern)

(2) Data Necessary to Make Diagnoses Are Fragmented

Team members universally cited fragmentation in data delivery, retrieval, and processing as a barrier to diagnosis. They indicated that test results might not be reviewed or acted upon in a timely manner and pointed to the electronic medical record as a source of this challenge.

“Before I knew about [the app for Epic], I would literally sit on the computer to get all the information we would need on rounds. It’s key to making decisions. We often say we will do something, only to find the test result doesn’t support it–and then we’re back to square 1.” (Intern)

Information used by teams came from myriad sources (eg, patients, family members, electronic records) and from various settings (eg, emergency department, patient rooms, discussions with consultants). Additionally, test results often appeared without warning. Thus, availability of information was poorly aligned with clinical duties.

“They (the lab) will call us when a blood culture is positive or something is off. That is very helpful but it often comes later in the day, when we’re done with rounds.” (Resident)

The work system was highlighted as a key contributor to data fragmentation. Peculiarities of our electronic medical record (EMR) and how data were collected, stored, or presented were described as “frustrating” and “unsafe” by team members. Correspondingly, we frequently observed interns asking for assistance with tasks such as ordering tests or finding information despite being “trained” to use the EMR.

“People have to learn how to filter, how to recognize the most important points and link data streams together in terms of causality. But we assume they know where to find that information. It’s actually a very hard thing to do, for both the house staff and me.” (Attending)

(3) Distractions Undermine the Diagnostic Process

Distractions often created cognitive difficulties. For example, ambient noise and interruptions from neighbors working on other teams were cited as barriers to diagnosis. In addition, we observed several team members using headphones to drown out ambient noise while working on the computer.

“I know I shouldn’t do it (wear headphones), but I have no other way of turning down the noise so I can concentrate.” (Intern)

Similarly, the unpredictable nature and the volume of pages often interrupted thinking about diagnosis.

“Sometimes the pager just goes off all the time and (after making sure it’s not an urgent issue), I will just ignore it for a bit, especially if I am in the middle of something. It would be great if I could finish my thought process knowing I would not be interrupted.” (Resident)

To mitigate this problem, 1 attending described how he would proactively seek out nurses caring for his patients to “head off” questions (eg, “I will renew the restraints and medications this morning,” and “Is there anything you need in terms of orders for this patient that I can take care of now?”) that might lead to pages. Another resident described his approach as follows:

“I make it a point to tell the nurses where I will be hanging out and where they can find me if they have any questions. I tell them to come talk to me rather than page me since that will be less distracting.” (Resident)

Most of the interns described documentation work such as writing admission and progress notes in negative terms (“an academic exercise,” “part of the billing activity”). However, in the context of interruptions, some described this as helpful.

“The most valuable part of the thinking process was writing the assessment and plan because that’s actually my schema for all problems. It literally is the only time where I can sit and collect my thoughts to formulate a diagnosis and plan.” (Intern)

(4) Time Pressures Interfere With Diagnostic Decision Making

All team members spoke about the challenge of finding time for diagnosis during the workday. Often, they had to skip learning sessions for this purpose.

“They tell us we should go to morning report or noon conference but when I’m running around trying to get things done. I hate having to choose between my education and doing what’s best for the patient–but that’s often what it comes down to.” (Intern)

When asked whether setting aside dedicated time to review and formulate diagnoses would be valuable, respondents were uniformly enthusiastic. Team members described attentional conflicts as worst when “cross-covering” other teams on call days, as their patient load effectively doubled during this time. Of note, cross-covering occurred when teams were also on call—and thus took them away from important diagnostic activities such as data gathering or synthesis for patients they were admitting.

“If you were to ever design a system where errors were likely–this is how you would design it: take a team with little supervision, double their patient load, keep them busy with new challenging cases and then ask questions about patients they know little about.” (Resident)

DISCUSSION

Although diagnostic errors have been called “the next frontier for patient safety,”17 little is known about the process of diagnosis in teaching hospitals or its barriers and facilitators. In this focused ethnography conducted at 2 academic medical centers, we identified, from trainees engaged in this activity, multiple cognitive and system-level challenges and potential strategies to improve diagnosis. Key themes identified by those we observed included the social nature of diagnosis, fragmented information delivery, constant distractions and interruptions, and time pressures. In turn, these insights allow us to generate strategies that can be applied to improve the diagnostic process in teaching hospitals.

Our study underscores the importance of social interactions in diagnosis. In contrast, most interventions to prevent diagnostic errors target individual providers through practices such as metacognition and “thinking about thinking.”18-20 These interventions are based on Daniel Kahneman’s dual-process theory of thought. Type 1 thought processes are fast, subconscious, reflexive, largely intuitive, and more vulnerable to error. In contrast, Type 2 processes are slower, deliberate, analytic, and less prone to error.21 Although an individual’s Type 2 thought capacity is limited, a major goal of cognitive interventions is to encourage Type 2 over Type 1 thinking, an approach termed “de-biasing.”22-24 Unfortunately, cognitive interventions testing such approaches have yielded mixed results–perhaps because they lack a focus on collective wisdom or group thinking, which our findings suggest may be key to diagnosis.9,25 In this sense, morning rounds were a social gathering used to strategize and develop care plans, but with limited time to think about diagnosis.26 Introducing defined periods for individuals to engage in diagnostic activities such as de-biasing (ie, asking “what else could this be?”)27 before or after rounds may provide an opportunity for reflection and improved diagnosis. In addition, embedding tools such as diagnosis expanders and checklists within these defined time slots28,29 may prove useful in reflecting on diagnosis and preventing diagnostic errors.

An unexpected yet important finding from this study was the challenge posed by distractions and the physical environment. Potentially maladaptive workarounds to these interruptions included the use of headphones; more productive strategies included updating nurses with plans to avert pages and creating a list of activities to ensure that key tasks were not forgotten.30,31 Applying lessons from aviation, a focused effort to limit distractions during key portions of the day might be worth considering for diagnostic safety.32 Similarly, improving the environment in which diagnosis occurs—including creating spaces that are quiet, orderly, and optimized for thinking—may be valuable.33

Our study has limitations. First, our findings are limited to direct observations; we are thus unable to comment on how unobserved aspects of care (eg, cognitive processes) might have influenced our findings. Our observations of clinical care might also have introduced a Hawthorne effect. However, because we were closely integrated with teams and conducted focus groups to corroborate our assessments, we believe that this was not the case. Second, we did not identify diagnostic errors or link the processes we observed to errors. Third, our approach was limited to 2 teaching centers, which limits the generalizability of our findings. Relatedly, we were only able to conduct observations during weekdays; differences in weekend and night resources might affect our insights.

The cognitive and system-based barriers faced by clinicians in teaching hospitals suggest that new methods to improve diagnosis are needed. Future interventions such as defined “time-outs” for diagnosis, strategies to limit distractions, and methods to improve communication between team members are novel and have parallels in other industries. As challenges in quantifying diagnostic errors abound,34 improving cognitive- and system-based factors via reflection, communication, concentration, and organization is necessary to improve medical decision making in academic medical centers.

Disclosures

None declared for all coauthors.

Funding

This project was supported by grant number P30HS024385 from the Agency for Healthcare Research and Quality. The funding source played no role in study design, data acquisition, analysis, or the decision to report these data. Dr. Chopra is supported by a career development award from the Agency for Healthcare Research and Quality (1-K08-HS022835-01). Dr. Krein is supported by a VA Health Services Research and Development Research Career Scientist Award (RCS 11-222). Dr. Singh is partially supported by the Houston VA HSR&D Center for Innovations in Quality, Effectiveness and Safety (CIN 13-413). The content is solely the responsibility of the authors and does not necessarily represent the official views of the Agency for Healthcare Research and Quality or the Department of Veterans Affairs.

References

1. National Academies of Sciences, Engineering, and Medicine. Improving Diagnosis in Health Care. Washington, DC: The National Academies Press; 2015. https://doi.org/10.17226/21794. Accessed November 1, 2016.
2. Schiff GD, Hasan O, Kim S, et al. Diagnostic error in medicine: analysis of 583 physician-reported errors. Arch Intern Med. 2009;169(20):1881-1887. http://dx.doi.org/10.1001/archinternmed.2009.333. PubMed
3. Sonderegger-Iseli K, Burger S, Muntwyler J, Salomon F. Diagnostic errors in three medical eras: A necropsy study. Lancet. 2000;355(9220):2027-2031. http://dx.doi.org/10.1016/S0140-6736(00)02349-7. PubMed
4. Winters B, Custer J, Galvagno SM Jr, et al. Diagnostic errors in the intensive care unit: a systematic review of autopsy studies. BMJ Qual Saf. 2012;21(11):894-902. http://dx.doi.org/10.1136/bmjqs-2012-000803. PubMed
5. Saber Tehrani AS, Lee H, Mathews SC, et al. 25-Year summary of US malpractice claims for diagnostic errors 1986-2010: an analysis from the National Practitioner Data Bank. BMJ Qual Saf. 2013;22(8):672-680. http://dx.doi.org/10.1136/bmjqs-2012-001550. PubMed
6. Graber M, Gordon R, Franklin N. Reducing diagnostic errors in medicine: what’s the goal? Acad Med. 2002;77(10):981-992. http://dx.doi.org/10.1097/00001888-200210000-00009. PubMed
7. Gupta A, Snyder A, Kachalia A, Flanders S, Saint S, Chopra V. Malpractice claims related to diagnostic errors in the hospital. BMJ Qual Saf. 2018;27(1):53-60. http://dx.doi.org/10.1136/bmjqs-2017-006774. PubMed
8. van Noord I, Eikens MP, Hamersma AM, de Bruijne MC. Application of root cause analysis on malpractice claim files related to diagnostic failures. Qual Saf Health Care. 2010;19(6):e21. http://dx.doi.org/10.1136/qshc.2008.029801. PubMed
9. Croskerry P, Petrie DA, Reilly JB, Tait G. Deciding about fast and slow decisions. Acad Med. 2014;89(2):197-200. http://dx.doi.org/10.1097/ACM.0000000000000121. PubMed
10. Higginbottom GM, Pillay JJ, Boadu NY. Guidance on performing focused ethnographies with an emphasis on healthcare research. Qual Rep. 2013;18(9):1-6. https://doi.org/10.7939/R35M6287P.
11. Savage J. Participative observation: standing in the shoes of others? Qual Health Res. 2000;10(3):324-339. http://dx.doi.org/10.1177/104973200129118471. PubMed
12. Patton MQ. Qualitative Research and Evaluation Methods. 3rd ed. Thousand Oaks, CA: SAGE Publications; 2002.
13. Harrod M, Weston LE, Robinson C, Tremblay A, Greenstone CL, Forman J. “It goes beyond good camaraderie”: A qualitative study of the process of becoming an interprofessional healthcare “teamlet.” J Interprof Care. 2016;30(3):295-300. http://dx.doi.org/10.3109/13561820.2015.1130028. PubMed
14. Houchens N, Harrod M, Moody S, Fowler KE, Saint S. Techniques and behaviors associated with exemplary inpatient general medicine teaching: an exploratory qualitative study. J Hosp Med. 2017;12(7):503-509. http://dx.doi.org/10.12788/jhm.2763. PubMed
15. Mulhall A. In the field: notes on observation in qualitative research. J Adv Nurs. 2003;41(3):306-313. http://dx.doi.org/10.1046/j.1365-2648.2003.02514.x. PubMed
16. Zwaan L, Monteiro S, Sherbino J, Ilgen J, Howey B, Norman G. Is bias in the eye of the beholder? A vignette study to assess recognition of cognitive biases in clinical case workups. BMJ Qual Saf. 2017;26(2):104-110. http://dx.doi.org/10.1136/bmjqs-2015-005014. PubMed
17. Singh H, Graber ML. Improving diagnosis in health care--the next imperative for patient safety. N Engl J Med. 2015;373(26):2493-2495. http://dx.doi.org/10.1056/NEJMp1512241. PubMed
18. Croskerry P. From mindless to mindful practice--cognitive bias and clinical decision making. N Engl J Med. 2013;368(26):2445-2448. http://dx.doi.org/10.1056/NEJMp1303712. PubMed
19. van den Berge K, Mamede S. Cognitive diagnostic error in internal medicine. Eur J Intern Med. 2013;24(6):525-529. http://dx.doi.org/10.1016/j.ejim.2013.03.006. PubMed
20. Norman G, Sherbino J, Dore K, et al. The etiology of diagnostic errors: A controlled trial of system 1 versus system 2 reasoning. Acad Med. 2014;89(2):277-284. http://dx.doi.org/10.1097/ACM.0000000000000105. PubMed
21. Dhaliwal G. Premature closure? Not so fast. BMJ Qual Saf. 2017;26(2):87-89. http://dx.doi.org/10.1136/bmjqs-2016-005267. PubMed
22. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 1: Origins of bias and theory of debiasing. BMJ Qual Saf. 2013;22(suppl 2):ii58-ii64. http://dx.doi.org/10.1136/bmjqs-2012-001712. PubMed
23. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 2: Impediments to and strategies for change. BMJ Qual Saf. 2013;22(suppl 2):ii65-ii72. http://dx.doi.org/10.1136/bmjqs-2012-001713. PubMed
24. Reilly JB, Ogdie AR, Von Feldt JM, Myers JS. Teaching about how doctors think: a longitudinal curriculum in cognitive bias and diagnostic error for residents. BMJ Qual Saf. 2013;22(12):1044-1050. http://dx.doi.org/10.1136/bmjqs-2013-001987. PubMed
25. Schmidt HG, Mamede S, van den Berge K, van Gog T, van Saase JL, Rikers RM. Exposure to media information about a disease can cause doctors to misdiagnose similar-looking clinical cases. Acad Med. 2014;89(2):285-291. http://dx.doi.org/10.1097/ACM.0000000000000107. PubMed
26. Hess BJ, Lipner RS, Thompson V, Holmboe ES, Graber ML. Blink or think: can further reflection improve initial diagnostic impressions? Acad Med. 2015;90(1):112-118. http://dx.doi.org/10.1097/ACM.0000000000000550. PubMed
27. Lambe KA, O’Reilly G, Kelly BD, Curristan S. Dual-process cognitive interventions to enhance diagnostic reasoning: A systematic review. BMJ Qual Saf. 2016;25(10):808-820. http://dx.doi.org/10.1136/bmjqs-2015-004417. PubMed
28. Graber ML, Kissam S, Payne VL, et al. Cognitive interventions to reduce diagnostic error: a narrative review. BMJ Qual Saf. 2012;21(7):535-557. http://dx.doi.org/10.1136/bmjqs-2011-000149. PubMed
29. McDonald KM, Matesic B, Contopoulos-Ioannidis DG, et al. Patient safety strategies targeted at diagnostic errors: a systematic review. Ann Intern Med. 2013;158(5 Pt 2):381-389. http://dx.doi.org/10.7326/0003-4819-158-5-201303051-00004. PubMed
30. Wray CM, Chaudhry S, Pincavage A, et al. Resident shift handoff strategies in US internal medicine residency programs. JAMA. 2016;316(21):2273-2275. http://dx.doi.org/10.1001/jama.2016.17786. PubMed
31. Choo KJ, Arora VM, Barach P, Johnson JK, Farnan JM. How do supervising physicians decide to entrust residents with unsupervised tasks? A qualitative analysis. J Hosp Med. 2014;9(3):169-175. http://dx.doi.org/10.1002/jhm.2150. PubMed
32. Carayon P, Wood KE. Patient safety - the role of human factors and systems engineering. Stud Health Technol Inform. 2010;153:23-46. PubMed
33. Carayon P, Xie A, Kianfar S. Human factors and ergonomics as a patient safety practice. BMJ Qual Saf. 2014;23(3):196-205. http://dx.doi.org/10.1136/bmjqs-2013-001812. PubMed
34. McGlynn EA, McDonald KM, Cassel CK. Measurement is essential for improving diagnosis and reducing diagnostic error: A report from the Institute of Medicine. JAMA. 2015;314(23):2501-2502. http://dx.doi.org/10.1001/jama.2015.13453. PubMed

 

Issue
Journal of Hospital Medicine 13(10)
Page Number
668-672. Published online first April 25, 2018

© 2018 Society of Hospital Medicine

Correspondence Location
Vineet Chopra MD, MSc, 2800 Plymouth Rd, Building 16 #432W North Campus Research Complex, Ann Arbor, MI 48109; Telephone: 734-936-4000; Fax: 734-852-4600; E-mail: [email protected]