Graduate medical education (GME) is heavily reliant on experiential learning. Most of a resident’s time is spent in progressively independent delivery of patient care, which is associated with decreasing supervision. Attainment and demonstration of competence in patient care is the goal and responsibility of GME training programs. What happens, then, if the medicine resident never has the experience necessary to enable experiential learning? What if she never “sees one,” let alone “does one”?
In this month’s Journal of Hospital Medicine, Sclafani et al1 examine how exposure to urgent clinical situations impacts residents’ confidence in managing these ward emergencies. They astutely reveal the idiosyncratic nature of residency training and consequent gaps created when an educational delivery model predicated on experience lacks certain experiences. How can a resident without certain key experiences be ready for independent practice?
The ACGME’s Next Accreditation System is intended to ensure that residents are prepared for independent practice. The educational outcomes that learners must attain comprise six core competencies, with milestones intended to operationalize the measurement and reporting of learner progression toward competence.2,3 Because it is challenging to apply general competencies to the assessment of day-to-day clinical activities, 16 Entrustable Professional Activities (EPAs) were developed. These allow direct observation of concrete clinical activities, from which attainment (or not) of multiple competencies can be inferred. Ideally, EPAs are paired with and mapped to curricular milestones, which describe a learner’s trajectory within the framework of competencies and determine whether a resident is prepared for independent practice.4,5
In Sclafani et al,1 the authors characterize resident exposure to, and confidence in, 50 urgent clinical situations. Both level of training and exposure were associated with increased confidence. However, the most important finding of this paper is the wide variation in resident exposure and confidence with respect to specific urgent clinical events. At least 15% of graduating residents had never seen 16% of the 50 emergency events, and a majority of
Several factors account for the idiosyncratic nature of medical training, including the rarity of certain clinical events, seasonal variation in conditions, and other variables (eg, learner elective choices). In addition, the scheduling of most residency programs is based on patient care needs rather than individual trainees’ educational needs. Other areas of medicine have attempted to standardize experience and ensure specific exposure and/or competence using strategies such as surgical case logs and case-based certifying examinations, and important recent projects in undergraduate medical education have used longitudinal assessment of EPAs in multiple contexts to make entrustment decisions.6 However, Internal Medicine residencies do not routinely employ these strategies.
It must be noted that Sclafani et al surveyed residents from only one site and examined only self-reported exposure and confidence, not competence. The relationship between confidence and competence is notoriously problematic,7 and there is a risk of familiarity creating an illusion of knowledge and/or competence. Ultimately, a competency-based medical education system is intended to be dynamic, adaptive, and contextual. Despite the extensive competency-based framework in place to track the development of physicians, data about the contexts in which competency is demonstrated are lacking. There is no reason to think that the key gaps identified by Sclafani et al are unique to their institution.
Given the ultimate goal of developing curricula that prepare residents for independent practice coupled with robust systems of assessment that ensure they are ready to do so, educators must implement strategies to identify and alleviate the idiosyncrasy of the resident experience. The survey tool in the present work could be used as a needs assessment and would require minimal resources, but is limited by recall bias, illusion of knowledge, and lack of data regarding actual competence. Other potential strategies include case logs or e-folios, although these tools are also limited by the understanding that familiarity and exposure do not necessarily engender competence.
One potential strategy, suggested by Warm et al, is the use of “Observable Practice Activities” (OPAs), “a collection of learning objectives/activities that must be observed in daily practice in order to form entrustment decisions.”8 The intention is to define more granularly what residents actually do and then map these activities to the established competency-based framework. Using these observable activities as a unit of assessment may allow identification of individual experience gaps, thereby making GME training more dynamic and adaptive. Certainly, there are very real concerns about further complicating an already complex and abstract system and about using a reductionist approach to define the activities of a profession. However, the findings of Sclafani et al regarding the wide range of resident experience elucidate the need for continued study and innovation in how the medical education community determines that trainees are prepared for independent practice.
Disclosures
The authors have nothing to disclose.
1. Sclafani A, Currier P, Chang Y, Eromo E, Raemer D, Miloslavsky E. Internal medicine residents’ exposure to and confidence in managing ward emergencies. J Hosp Med. 2019;14(4):218-223.
2. Holmboe ES, Call S, Ficalora RD. Milestones and competency-based medical education in internal medicine. JAMA Intern Med. 2016;176(11):1601.
3. Hauer KE, Vandergrift J, Lipner RS, Holmboe ES, Hood S, McDonald FS. National internal medicine milestone ratings. Acad Med. 2018;93(8):1189-1204.
4. Ten Cate O, Scheele F. Viewpoint: competency-based postgraduate training: can we bridge the gap between theory and clinical practice? Acad Med. 2007;82(6):542-547.
5. Caverzagie KJ, Cooney TG, Hemmer PA, Berkowitz L. The development of entrustable professional activities for internal medicine residency training: a report from the Education Redesign Committee of the Alliance for Academic Internal Medicine. Acad Med. 2015;90(4):479-484.
6. Murray KE, Lane JL, Carraccio C, et al. Crossing the gap. Acad Med. November 2018:1.
7. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence. JAMA. 2006;296(9):1094.
8. Warm EJ, Mathis BR, Held JD, et al. Entrustment and mapping of observable practice activities for resident assessment. J Gen Intern Med. 2014;29(8):1177-1182.
© 2019 Society of Hospital Medicine