Clinical exposures during internal medicine acting internship: Profiling student and team experiences

Todd I. Smith, MD
Medical Service, Louis Stokes Cleveland Department of Veterans Affairs Medical Center, and Department of Medicine, Case Western Reserve School of Medicine
Journal of Hospital Medicine, 9(7), 436-440

The clinical learning model in medical education, specifically in the third and fourth years of medical school and in residency and fellowship training, is driven by direct patient‐care experiences and complemented by mentorship and supervision provided by experienced physicians.[1] Despite the emphasis on experiential learning in medical school and graduate training, the ability of educators to quantify the clinical experiences of learners has been limited. Case logs, often self‐reported, are frequently required during educational rotations to attempt to measure clinical experience.[2] Logs have been utilized to document diagnoses, demographics, disease severity, procedures, and chief complaints.[3, 4, 5, 6] Unfortunately, self‐reported logs are vulnerable to delayed updates, misreported data, and unreliable data validation.[7, 8] Automated data collection has been shown to be more reliable than self‐reported logs.[8, 9]

The enhanced data mining methods now available allow educators to appraise learners' exposures during patient‐care interactions beyond just the diagnosis or chief complaint (eg, how many electrocardiograms do our learners evaluate during a cardiology rotation; how often do our learners gain experience prescribing a specific class of antibiotics; how many of the patients seen by our learners are diabetic). For example, a learner's interaction with a patient during an inpatient admission for community‐acquired pneumonia, at minimum, would include assessing the past medical history, reviewing outpatient medications and allergies, evaluating tests completed (chest x‐ray, complete blood count, blood cultures), prescribing antibiotics, and monitoring comorbidities. The lack of knowledge regarding the frequency and context of these exposures is a key gap in our understanding of the clinical experience of inpatient trainees. Additionally, there are no data on clinical exposures specific to team‐based inpatient learning. When a rotation is team‐based, the educational experience is not limited to the learner's assigned patients, and this arrangement allows for educational exposures from patients who are not the learner's primary assignments through experiences gained during team rounds, cross‐coverage assessments, and informal discussions of patient care.

In this study, we quantify the clinical exposures of learners on an acting internship (AI) rotation in internal medicine by utilizing the Veterans Affairs (VA) electronic medical records (EMR) as collected through the VA Veterans Integrated Service Network 10 Clinical Data Warehouse (CDW). The AI or subinternship is a medical school clinical rotation typically completed in the fourth year, where the learning experience is expected to mirror a 1‐month rotation of a first‐year resident.[10] The AI has historically been defined as an experiential curriculum, during which students assume many of the responsibilities and activities that they will manage as graduate medical trainees.[10, 11] The exposures of AI learners include primary diagnoses encountered, problem lists evaluated at the time of admission, medications prescribed, laboratory tests ordered, and radiologic imaging evaluated. We additionally explored the exposures of the AI learner's team to assess the experiences available through team‐based care.

METHODS

This study was completed at the Louis Stokes Veterans Affairs Medical Center (LSVAMC) in Cleveland, Ohio, which is an academic affiliate of the Case Western Reserve University School of Medicine. The study was approved by the LSVAMC institutional review board.

At the LSVAMC, the AI rotation in internal medicine is a 4‐week inpatient rotation for fourth‐year medical students, in which the student is assigned to an inpatient medical team consisting of an attending physician, a senior resident, and a combination of first‐year residents and acting interns. Compared to a first‐year resident, the acting intern is assigned approximately half of the number of admissions. The team rounds as a group at least once per day. Acting interns are permitted to place orders and write notes in the EMR; all orders require a cosignature by a resident or attending physician to be released.

From rotation records, we identified students who completed an AI in internal medicine rotation at the LSVAMC from July 2008 to November 2011. Using the CDW, we queried student names and their rotation dates and analyzed the results using a Structured Query Language Query Analyzer. Each student's patient encounters during the rotation were identified. A patient encounter was defined as a patient for whom the student wrote at least 1 note titled either "Medicine Admission Note" or "Medicine Inpatient Progress Note" on any of the dates during their AI rotation. We then counted the total number of notes written by each student during their rotation. A patient identifier is associated with each note. The number of distinct patient identifiers was also tallied to establish the total number of patients seen during the rotation by the individual student as the primary caregiver.
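
The extraction for this step can be expressed as a single query per student. The sketch below assumes a T-SQL environment; the table name (ProgressNotes) and columns (AuthorName, NoteDate, NoteTitle, PatientID) are illustrative placeholders, not the actual CDW schema.

```sql
-- Minimal sketch of the encounter query, using placeholder names (not the real CDW schema).
DECLARE @StudentName   VARCHAR(100) = 'DOE,JANE';   -- hypothetical student
DECLARE @RotationStart DATE = '2010-07-01';
DECLARE @RotationEnd   DATE = '2010-07-28';

SELECT
    COUNT(*)                  AS TotalNotes,       -- all qualifying notes written by the student
    COUNT(DISTINCT PatientID) AS DistinctPatients  -- patients seen as the primary caregiver
FROM ProgressNotes
WHERE AuthorName = @StudentName
  AND NoteDate BETWEEN @RotationStart AND @RotationEnd
  AND NoteTitle IN ('MEDICINE ADMISSION NOTE', 'MEDICINE INPATIENT PROGRESS NOTE');
```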

We associated each patient encounter with an inpatient admission profile that included patient admission and discharge dates, International Classification of Diseases, 9th Revision (ICD‐9) diagnosis codes, and admitting specialty. Primary diagnosis codes were queried for each admission and were counted for individual students and in aggregate. We tallied both the individual student and aggregate patient medications prescribed during the dates of admission and ordered to a patient location consistent with an acute medical ward (therefore excluding orders placed if a patient was transferred to an intensive care unit). Similar queries were completed for laboratory and radiological testing.
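
A query of the following shape could produce the per-admission medication tallies. StudentEncounters, Admissions, and MedicationOrders (and their columns) are placeholder names rather than the real CDW tables, and the ward filter stands in for whatever location codes identify an acute medical ward locally.

```sql
-- Sketch only: placeholder schema; @StudentName as declared in the earlier sketch.
-- First collapse the student's notes to the distinct admissions they covered, then count
-- medications ordered to an acute medical ward location during those admissions
-- (orders placed while the patient was in an ICU therefore drop out).
WITH CoveredAdmissions AS (
    SELECT DISTINCT a.AdmissionID, a.PatientID, a.AdmitDate, a.DischargeDate
    FROM StudentEncounters e
    JOIN Admissions a
      ON a.PatientID = e.PatientID
     AND e.NoteDate BETWEEN a.AdmitDate AND a.DischargeDate
    WHERE e.StudentName = @StudentName
)
SELECT m.DrugName, COUNT(*) AS OrderCount
FROM CoveredAdmissions c
JOIN MedicationOrders m
  ON m.PatientID = c.PatientID
 AND m.OrderDate BETWEEN c.AdmitDate AND c.DischargeDate
WHERE m.PatientLocation = 'ACUTE MEDICAL WARD'
GROUP BY m.DrugName
ORDER BY OrderCount DESC;
```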

The VA EMR keeps an active problem list on each patient, and items are associated with an ICD‐9 code. To assemble the active problems available for evaluation by the student on the day of a patient's admission, we queried all problem list items added prior to, but not discontinued before, the day of admission. We then tallied the results for every patient seen by each individual student and in aggregate.
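
The "active on admission" rule translates directly into a date filter. As before, the ProblemList table and its columns are placeholders used only to illustrate the logic.

```sql
-- Sketch only: placeholder schema; @StudentName as declared in the earlier sketch.
-- A problem counts as active on admission if it was entered before the admission date and
-- not inactivated before that date; each problem is counted once per patient the student covered.
SELECT p.ICD9Code, p.ProblemName, COUNT(DISTINCT a.PatientID) AS Exposures
FROM StudentEncounters e
JOIN Admissions a
  ON a.PatientID = e.PatientID
 AND e.NoteDate BETWEEN a.AdmitDate AND a.DischargeDate
JOIN ProblemList p
  ON p.PatientID = a.PatientID
WHERE e.StudentName = @StudentName
  AND p.DateEntered < a.AdmitDate
  AND (p.DateInactivated IS NULL OR p.DateInactivated >= a.AdmitDate)
GROUP BY p.ICD9Code, p.ProblemName;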

To assess the team exposures for each AI student, we queried all discharge summaries cosigned by the student's attending during the dates of the student's rotation. We assumed the student's team members wrote these discharge summaries. After excluding the student's patients, the resultant list represented the team patient exposures for each student. This list was also queried for the number of patients seen, primary diagnoses, medications, problems, labs, and radiology. The number of team admissions counted included all patients who spent at least 1 day on the team while the student was rotating. All other team exposure counts included only patients who were both admitted and discharged within the dates of the student's rotation.
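
The team patient list could be assembled along these lines; DischargeSummaries and @AttendingName are placeholders, and the subquery removes the student's own primary patients.

```sql
-- Sketch only: placeholder schema; @RotationStart/@RotationEnd/@StudentName as declared earlier.
-- Team patients = patients on discharge summaries cosigned by the student's attending during
-- the rotation, excluding the student's own primary patients.
SELECT DISTINCT d.PatientID
FROM DischargeSummaries d
WHERE d.CosignerName = @AttendingName
  AND d.CosignDate BETWEEN @RotationStart AND @RotationEnd
  AND d.PatientID NOT IN (
        SELECT e.PatientID
        FROM StudentEncounters e
        WHERE e.StudentName = @StudentName
      );
```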

RESULTS

An AI rotation is 4 weeks in duration. Students completed a total of 128 rotations from July 30, 2008 through November 21, 2011. We included all rotations during this time period in the analysis. Tables 1 through 5 report results in 4 categories. The "Student" category tallies the total number of specific exposures (diagnoses, problems, medications, lab values, or radiology tests) for all patients primarily assigned to a student. The "Team" category tallies the total number of exposures for all patients assigned to other members of the student's inpatient team. The "Primary %" category identifies the percentage of students who had at least 1 assigned patient with the evaluated clinical exposure. The "All Patients %" category identifies the percentage of students who had at least 1 student‐assigned patient or at least 1 team‐assigned patient with the evaluated clinical exposure.
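
One way to compute these four categories, assuming an intermediate summary (here a hypothetical ExposureByStudent staging table, one row per student and exposure, including zero counts) has already been populated from queries like those above:

```sql
-- Sketch only: ExposureByStudent is a hypothetical staging table with one row per
-- (StudentID, ExposureName), holding that student's primary-patient and team-patient
-- tallies (zeros recorded when a student had no exposure of that type).
SELECT
    ExposureName,
    SUM(PrimaryCount) AS Student,                                                      -- "Student" column
    SUM(TeamCount)    AS Team,                                                         -- "Team" column
    100.0 * AVG(CASE WHEN PrimaryCount > 0 THEN 1.0 ELSE 0.0 END)             AS PrimaryPct,     -- "Primary %"
    100.0 * AVG(CASE WHEN PrimaryCount + TeamCount > 0 THEN 1.0 ELSE 0.0 END) AS AllPatientsPct  -- "All Patients %"
FROM ExposureByStudent
GROUP BY ExposureName;
```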

Table 1. Most Common Primary Diagnoses

Diagnosis | Student | Team | Primary % | All Patients %
Obstructive chronic bronchitis, with acute exacerbation | 102 | 241 | 57% | 91%
Pneumonia, organism unspecified | 91 | 228 | 49% | 91%
Acute renal failure, unspecified | 73 | 170 | 46% | 83%
Urinary tract infection, site not specified | 69 | 149 | 43% | 87%
Congestive heart failure, unspecified | 65 | 114 | 41% | 68%
Alcohol withdrawal | 46 | 101 | 26% | 61%
Alcoholic cirrhosis of liver | 28 | 98 | 16% | 57%
Cellulitis and abscess of leg, except foot | 26 | 61 | 18% | 45%
Acute pancreatitis | 23 | 51 | 16% | 43%
Intestinal infection due to Clostridium difficile | 22 | 30 | 17% | 33%
Malignant neoplasm of bronchus and lung, unspecified | 22 | 38 | 16% | 35%
Acute on chronic diastolic heart failure | 22 | 45 | 16% | 39%
Encounter for antineoplastic chemotherapy | 21 | 96 | 15% | 48%
Dehydration | 19 | 78 | 13% | 46%
Anemia, unspecified | 19 | 36 | 13% | 30%
Pneumonitis due to inhalation of food or vomitus | 19 | 25 | 13% | 24%
Syncope and collapse | 16 | 38 | 13% | 39%
Other pulmonary embolism and infarction | 15 | 41 | 12% | 26%
Unspecified pleural effusion | 15 | 37 | 10% | 34%
Acute respiratory failure | 15 | 42 | 11% | 35%

Table 2. Most Common Problem List Items

Problem | Student | Team | Primary % | All Patients %
Hypertension | 1,665 | 3,280 | 100% | 100%
Tobacco use disorder | 1,350 | 2,759 | 100% | 100%
Unknown cause morbidity/mortality | 1,154 | 2,370 | 100% | 100%
Hyperlipidemia | 1,036 | 2,044 | 99% | 100%
Diabetes mellitus 2 without complication | 865 | 1,709 | 100% | 100%
Chronic airway obstruction | 600 | 1,132 | 100% | 100%
Esophageal reflux | 583 | 1,131 | 99% | 100%
Depressive disorder | 510 | 1,005 | 100% | 100%
Dermatophytosis of nail | 498 | 939 | 98% | 100%
Alcohol dependence | 441 | 966 | 97% | 100%
Chronic ischemic heart disease | 385 | 758 | 95% | 100%
Osteoarthritis | 383 | 791 | 96% | 100%
Lumbago | 357 | 692 | 97% | 100%
Current use, anticoagulation | 342 | 629 | 94% | 100%
Anemia | 337 | 674 | 97% | 100%
Inhibited sex excitement | 317 | 610 | 91% | 100%
Congestive heart failure | 294 | 551 | 91% | 100%
Peripheral vascular disease | 288 | 529 | 88% | 99%
Sensorineural hearing loss | 280 | 535 | 88% | 99%
Post-traumatic stress disorder | 274 | 528 | 91% | 100%
Pure hypercholesterolemia | 262 | 521 | 88% | 100%
Coronary atherosclerosis | 259 | 396 | 87% | 95%
Obesity | 246 | 509 | 89% | 99%
Atrial fibrillation | 236 | 469 | 85% | 100%
Gout | 216 | 389 | 85% | 100%

Table 3. Most Common Medications Prescribed

Medication | Student | Team | Primary % | All Patients %
Omeprazole | 1,372 | 2,981 | 99% | 100%
Heparin | 1,067 | 2,271 | 95% | 96%
Sodium chloride 0.9% | 925 | 2,036 | 99% | 100%
Aspirin | 844 | 1,782 | 98% | 100%
Potassium chloride | 707 | 1,387 | 99% | 100%
Metoprolol tartrate | 693 | 1,318 | 98% | 100%
Insulin regular | 692 | 1,518 | 99% | 100%
Acetaminophen | 669 | 1,351 | 98% | 100%
Simvastatin | 648 | 1,408 | 99% | 100%
Lisinopril | 582 | 1,309 | 98% | 100%
Furosemide | 577 | 1,186 | 98% | 100%
Docusate sodium | 541 | 1,127 | 98% | 100%
Vancomycin | 531 | 977 | 98% | 100%
Multivitamin | 478 | 1,074 | 96% | 100%
Piperacillin/tazobactam | 470 | 781 | 98% | 100%
Selected examples:
Prednisone | 305 | 613 | 93% | 100%
Insulin glargine | 244 | 492 | 81% | 98%
Spironolactone | 167 | 380 | 73% | 98%
Digoxin | 68 | 125 | 40% | 77%
Meropenem | 16 | 21 | 11% | 24%

Table 4. Common Laboratory Tests (Proxy)

Lab Test | Student | Team | Primary % | All Patients %
Fingerstick glucose | 12,869 | 24,946 | 100% | 100%
Renal panel (serum sodium) | 7,728 | 14,504 | 100% | 100%
Complete blood count (blood hematocrit) | 7,372 | 14,188 | 100% | 100%
International normalized ratio | 3,725 | 6,259 | 100% | 100%
Liver function tests (serum SGOT) | 1,570 | 3,180 | 99% | 100%
Urinalysis (urine nitrite) | 789 | 1,537 | 100% | 100%
Arterial blood gas (arterial blood pH) | 767 | 704 | 78% | 99%
Hemoglobin A1C | 485 | 1,177 | 96% | 100%
Fractional excretion of sodium (urine creatinine) | 336 | 677 | 85% | 99%
Lactic acid | 195 | 314 | 65% | 96%
Ferritin | 193 | 413 | 74% | 99%
Thyroid-stimulating hormone | 184 | 391 | 55% | 64%
Lipase | 157 | 317 | 58% | 91%
Hepatitis C antibody | 139 | 327 | 70% | 98%
Haptoglobin | 101 | 208 | 46% | 83%
B-type natriuretic peptide | 98 | 212 | 48% | 87%
Cortisol | 70 | 119 | 34% | 60%
Rapid plasma reagin | 70 | 173 | 44% | 82%
Urine Legionella antigen | 70 | 126 | 38% | 64%
D-dimer | 59 | 111 | 34% | 72%
Digoxin | 45 | 69 | 18% | 39%
Paracentesis labs (peritoneal fluid total protein) | 34 | 47 | 16% | 34%
Thoracentesis labs (pleural fluid WBC count) | 33 | 42 | 20% | 38%
C-reactive protein | 30 | 65 | 17% | 34%
Lumbar puncture labs (cerebrospinal fluid WBC count) | 22 | 57 | 11% | 27%
Arthrocentesis (synovial fluid WBC count) | 14 | 23 | 9% | 23%
NOTE: Abbreviations: SGOT, serum glutamic oxaloacetic transaminase; WBC, white blood cell.

Table 5. Most Common Radiology Tests

Radiology Test | Student | Team | Primary % | All Patients %
Chest, 2 views, PA and lateral | 938 | 1,955 | 100% | 100%
Chest portable | 414 | 751 | 96% | 100%
CT head without contrast | 235 | 499 | 82% | 100%
CT abdomen with contrast | 218 | 365 | 59% | 71%
CT pelvis with contrast | 213 | 364 | 59% | 70%
CT chest with contrast | 163 | 351 | 75% | 99%
Ultrasound kidney, bilateral | 119 | 208 | 61% | 92%
Abdomen 1 view | 107 | 220 | 59% | 93%
Ultrasound liver | 100 | 183 | 48% | 82%
Modified barium swallow | 93 | 130 | 53% | 82%
PET scan | 93 | 181 | 49% | 79%
Selected examples:
Acute abdomen series | 85 | 177 | 48% | 81%
CT chest, PE protocol | 67 | 126 | 37% | 73%
MRI brain with and without contrast | 56 | 109 | 34% | 66%
Chest decubitus | 51 | 76 | 34% | 60%
Portable KUB for Dobhoff placement | 42 | 62 | 30% | 48%
Ventilation/perfusion lung scan | 15 | 25 | 12% | 27%
Ultrasound thyroid | 8 | 16 | 5% | 17%
NOTE: Abbreviations: CT, computed tomography; KUB, kidney, ureter, and bladder; MRI, magnetic resonance imaging; PA, posteroanterior; PE, pulmonary embolism; PET, positron-emission tomography.

Distinct Patients and Progress Notes

The mean number of progress notes written by a student was 67.2 (standard deviation [SD] 16.3). The mean number of distinct patients evaluated by a student during a rotation was 18.4 (SD 4.2). The mean number of team admissions per student rotation was 46.7 distinct patients (SD 9.6).

Primary Diagnoses

A total of 2213 primary diagnoses were documented on patients assigned to students on AI rotations. A total of 5323 primary diagnoses were documented on patients assigned to other members of the team during the students' rotations. Therefore, the mean number of primary diagnoses seen by a student during a rotation was 58.9 (17.3 primary diagnoses for student‐assigned patients and 41.6 primary diagnoses for team patients). The students and teams encountered similar diagnoses (Table 1).
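
As a check on how these figures relate, the per-student means are simply the aggregate counts divided across the 128 rotations:

```latex
\[
\frac{2213}{128} \approx 17.3, \qquad
\frac{5323}{128} \approx 41.6, \qquad
17.3 + 41.6 \approx 58.9 \ \text{primary diagnoses per rotation}
\]
```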

Problem List

Students and teams evaluated a total of 40,015 and 78,643 past medical problems, respectively. The mean number of problems seen by a student during a rotation was 927 (313 student, 614 team). Table 2 reports the most frequent problems assigned to primary student admissions. Students and teams evaluated similar problems. Hepatitis C (196 student, 410 team) was the only problem in the team top 25 that did not also appear in the student top 25.

Medications

A total of 38,149 medications were prescribed to the students' primary patients. A total of 77,738 medications were prescribed to patients assigned to the rest of the team. The mean number of medication exposures for a student during a rotation was 905 (298 student, 607 team). The most frequently prescribed medications were similar between students and teams (Table 3). Medications in the team top 25 but not in the student top 25 included hydralazine (300 student, 629 team), prednisone (305 student, 613 team), and oxycodone/acetaminophen (286 student, 608 team).

Labs

All laboratory tests with reported results were tallied. For common laboratory panels, single lab values (eg, serum hematocrit for a complete blood count) were selected as proxies to count the number of studies completed and evaluated. Table 4 shows a cross‐section of laboratory tests evaluated during AI rotations.
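
For example, under the same placeholder schema used in the Methods sketches, complete blood counts for a student's primary patients could be tallied by counting hematocrit results:

```sql
-- Sketch only: placeholder schema; @StudentName/@RotationStart/@RotationEnd as declared earlier.
-- One hematocrit result is treated as a proxy for one complete blood count evaluated on the
-- student's primary patients during the rotation.
SELECT COUNT(*) AS CompleteBloodCounts
FROM LabResults r
WHERE r.TestName = 'HEMATOCRIT'
  AND r.ResultDate BETWEEN @RotationStart AND @RotationEnd
  AND r.PatientID IN (SELECT e.PatientID
                      FROM StudentEncounters e
                      WHERE e.StudentName = @StudentName);
```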

Radiology

A total of 6197 radiology tests were completed on patients assigned to students, whereas 11,761 radiology tests were completed on patients assigned to other team members. The mean number of radiology exposures for a student was 140 (48 student, 92 team). The most frequently seen radiology tests were similar between students and teams (Table 5).

DISCUSSION

As medical educators, we assume that the clinical training years allow learners to develop essential skills through their varied clinical experiences. Through exposure to direct patient care, to medical decision‐making scenarios, and to senior physician management practices, trainees build the knowledge base for independent practice. To ensure there is sufficient clinical exposure, data on what trainees are encountering may prove beneficial.

In this novel study, we quantified what learners encounter during a 1‐month team‐based inpatient rotation at a large teaching hospital. We effectively measured a number of aspects of internal medicine inpatient training that have been difficult to quantify in the past. The ability to extract learner‐specific data is becoming increasingly available in academic teaching hospitals. For example, VA medical centers have access to a daily updated national data warehouse. The other requirements for using learner‐specific data are an understanding of the local inpatient process (how tests are ordered, what note titles are used by trainees) and someone able to build the queries necessary for data extraction. Once built, data extraction can continue as an automated process and be used in real time by medical educators.

Our method of data collection has limitations. The orders placed on a learner's primary patients may not have been placed by the learner. For example, orders may have been placed by an overnight resident cross‐covering the learner's patients. We assumed that learners evaluated the results of all tests (or medication changes) that occurred at any time during their rotation, including cross‐cover periods or days off. In addition, our method for evaluating team exposure underestimates the number of team patients calculated for each learner by limiting the query only to patients whose hospital stay was completed before the student left the inpatient service. It is also difficult to know how many of the exposures are realized by the learner. Differences in learner attention, contrasts in rounding styles, and varying presentation methods will affect the number of exposures truly attained by the learner. Finally, not all clinical exposures can be evaluated through review of an EMR. Clinical experiences, such as care coordination, patient education, and family counseling, cannot be easily extracted.

Data mining EMRs can enhance clinical medical education. Although our data collection was completed retrospectively, we could easily provide learner‐specific data in real time to ward attendings, chief residents, and program directors. This information could direct the development of teaching tools and individualization of curricula. Perhaps even more importantly, it would also allow educators to define curricular gaps. Whether these gaps are due to the particular patient demographics of a medical center, the practice patterns and strengths of a particular institution, or the career interests of a trainee, they may skew the patient‐care experiences encountered by individual trainees. We can use these data to identify differences in clinical experience and then develop opportunities for learners (clinical, didactic, or simulated) to address deficiencies and provide well‐rounded clinical experiences.

Further investigation to better understand the relationship between direct patient‐care experience and clinical skill acquisition is needed. This information could help guide the development of standards on the number of exposures we expect our learners to have with different diagnostic or treatment modalities prior to independent practice. Using learner data to better understand the clinical experiences of our medical trainees, we can hopefully develop more precise and focused curricula to ensure we produce competent graduates.

Acknowledgments

This material is the result of work supported with resources and the use of facilities at the Louis Stokes Cleveland VA Medical Center. The views expressed in this article are those of the authors and do not necessarily reflect the position or policy of the Department of Veterans Affairs.

References
  1. Accreditation Council for Graduate Medical Education. Program requirements for graduate medical education in internal medicine. Available at: http://www.acgme.org/acgmeweb/Portals/0/PFAssets/2013-PR-FAQ-PIF/140_internal_medicine_07012013.pdf. Accessed December 18, 2012.
  2. Kasten SJ, Prince ME, Lypson ML. Residents make their lists and program directors check them twice: reviewing case logs. J Grad Med Educ. 2012;34:257-260.
  3. Mattana J, Kerpen H, Lee C, et al. Quantifying internal medicine resident clinical experience using resident-selected primary diagnosis codes. J Hosp Med. 2011;6(7):395-400.
  4. Rattner SL, Louis DZ, Rabinowitz C, et al. Documenting and comparing medical students' clinical experiences. JAMA. 2001;286:1035-1040.
  5. Sequist TD, Singh S, Pereira AG, Rusinak D, Pearson SD. Use of an electronic medical record to profile the continuity clinic experiences of primary care residents. Acad Med. 2005;80:390-394.
  6. Iglar K, Polsky J, Glazier R. Using a Web-based system to monitor practice profiles in primary care residency training. Can Fam Physician. 2011;57:1030-1037.
  7. Nagler J, Harper MB, Bachur RG. An automated electronic case log: using electronic information systems to assess training in emergency medicine. Acad Emerg Med. 2006;13:733-739.
  8. Simpao A, Heitz JW, McNulty SE, Chekemian B, Bren BR, Epstein RH. The design and implementation of an automated system for logging clinical experiences using an anesthesia information management system. Anesth Analg. 2011;112(2):422-429.
  9. Nkoy FL, Petersen S, Matheny Antommaria AH, Maloney CG. Validation of an electronic system for recording medical student patient encounters. AMIA Annu Symp Proc. 2008;2008:510-514.
  10. Sidlow R. The structure and content of the medical subinternship: a national survey. J Gen Intern Med. 2001;16:550-553.
  11. Jolly BC, MacDonald MM. Education for practice: the role of practical experience in undergraduate and general clinical training. Med Educ. 1989;23:189-195.

The clinical learning model in medical education, specifically in the third and fourth years of medical school and in residency and fellowship training, is driven by direct patient‐care experiences and complemented by mentorship and supervision provided by experienced physicians.[1] Despite the emphasis on experiential learning in medical school and graduate training, the ability of educators to quantify the clinical experiences of learners has been limited. Case logs, often self‐reported, are frequently required during educational rotations to attempt to measure clinical experience.[2] Logs have been utilized to document diagnoses, demographics, disease severity, procedures, and chief complaints.[3, 4, 5, 6] Unfortunately, self‐reported logs are vulnerable to delayed updates, misreported data, and unreliable data validation.[7, 8] Automated data collection has been shown to be more reliable than self‐reported logs.[8, 9]

The enhanced data mining methods now available allow educators to appraise learners' exposures during patient‐care interactions beyond just the diagnosis or chief complaint (eg, how many electrocardiograms do our learners evaluate during a cardiology rotation, how often do our learners gain experience prescribing a specific class of antibiotics, how many of the patients seen by our learners are diabetic). For example, a learner's interaction with a patient during an inpatient admission for community‐acquired pneumonia, at minimum, would include assessing of past medical history, reviewing outpatient medications and allergies, evaluating tests completed (chest x‐ray, complete blood count, blood cultures), prescribing antibiotics, and monitoring comorbidities. The lack of knowledge regarding the frequency and context of these exposures is a key gap in our understanding of the clinical experience of inpatient trainees. Additionally, there are no data on clinical exposures specific to team‐based inpatient learning. When a rotation is team‐based, the educational experience is not limited to the learner's assigned patients, and this arrangement allows for educational exposures from patients who are not the learner's primary assignments through experiences gained during team rounds, cross‐coverage assessments, and informal discussions of patient care.

In this study, we quantify the clinical exposures of learners on an acting internship (AI) rotation in internal medicine by utilizing the Veterans Affairs (VA) electronic medical records (EMR) as collected through the VA Veterans Integrated Service Network 10 Clinical Data Warehouse (CDW). The AI or subinternship is a medical school clinical rotation typically completed in the fourth year, where the learning experience is expected to mirror a 1‐month rotation of a first‐year resident.[10] The AI has historically been defined as an experiential curriculum, during which students assume many of the responsibilities and activities that they will manage as graduate medical trainees.[10, 11] The exposures of AI learners include primary diagnoses encountered, problem lists evaluated at the time of admission, medications prescribed, laboratory tests ordered, and radiologic imaging evaluated. We additionally explored the exposures of the AI learner's team to assess the experiences available through team‐based care.

METHODS

This study was completed at the Louis Stokes Veterans Affairs Medical Center (LSVAMC) in Cleveland, Ohio, which is an academic affiliate of the Case Western Reserve University School of Medicine. The study was approved by the LSVAMC institutional review board.

At the LSVAMC, the AI rotation in internal medicine is a 4‐week inpatient rotation for fourth‐year medical students, in which the student is assigned to an inpatient medical team consisting of an attending physician, a senior resident, and a combination of first‐year residents and acting interns. Compared to a first‐year resident, the acting intern is assigned approximately half of the number of admissions. The teams rounds as a group at least once per day. Acting interns are permitted to place orders and write notes in the EMR; all orders require a cosignature by a resident or attending physician to be released.

We identified students who rotated through the LSVAMC for an AI in internal medicine rotation from July 2008 to November 2011 from rotation records. Using the CDW, we queried student names and their rotation dates and analyzed the results using a Structured Query Language Query Analyzer. Each student's patient encounters during the rotation were identified. A patient encounter was defined as a patient for whom the student wrote at least 1 note titled either Medicine Admission Note or Medicine Inpatient Progress Note, on any of the dates during their AI rotation. We then counted the total number of notes written by each student during their rotation. A patient identifier is associated with each note. The number of distinct patient identifiers was also tallied to establish the total number of patients seen during the rotation by the individual student as the primary caregiver.

We associated each patient encounter with an inpatient admission profile that included patient admission and discharge dates, International Classification of Diseases, 9th Revision (ICD‐9) diagnosis codes, and admitting specialty. Primary diagnosis codes were queried for each admission and were counted for individual students and in aggregate. We tallied both the individual student and aggregate patient medications prescribed during the dates of admission and ordered to a patient location consistent with an acute medical ward (therefore excluding orders placed if a patient was transferred to an intensive care unit). Similar queries were completed for laboratory and radiological testing.

The VA EMR keeps an active problem list on each patient, and items are associated with an ICD‐9 code. To assemble the active problems available for evaluation by the student on the day of a patient's admission, we queried all problem list items added prior to, but not discontinued before, the day of admission. We then tallied the results for every patient seen by each individual student and in aggregate.

To assess the team exposures for each AI student, we queried all discharge summaries cosigned by the student's attending during the dates of the student's rotation. We assumed the student's team members wrote these discharge summaries. After excluding the student's patients, the resultant list represented the team patient exposures for each student. This list was also queried for the number of patients seen, primary diagnoses, medications, problems, labs, and radiology. The number of team admissions counted included all patients who spent at least 1 day on the team while the student was rotating. All other team exposure counts completed included only patients who were both admitted and discharged within the dates of the student's rotation.

RESULTS

An AI rotation is 4 weeks in duration. Students competed a total of 128 rotations from July 30, 2008 through November 21, 2011. We included all rotations during this time period in the analysis. Tables 1, 2, 3, 4, 5 report results in 4 categories. The Student category tallies the total number of specific exposures (diagnoses, problems, medications, lab values, or radiology tests) for all patients primarily assigned to a student. The Team category tallies the total number of exposures for all patients assigned to other members of the student's inpatient team. The Primary % category identifies the percentage of students who had at least 1 assigned patient with the evaluated clinical exposure. The All Patients % category identifies the percentage of students who had at least 1 student‐assigned patient or at least 1 team‐assigned patient with the evaluated clinical exposure.

Most Common Primary Diagnoses
DiagnosisStudentTeamPrimary%All Patients %
Obstructive chronic bronchitis, with acute exacerbation10224157%91%
Pneumonia, organism unspecified9122849%91%
Acute renal failure, unspecified7317046%83%
Urinary tract infection, site not specified6914943%87%
Congestive heart failure, unspecified6511441%68%
Alcohol withdrawal4610126%61%
Alcoholic cirrhosis of liver289816%57%
Cellulitis and abscess of leg, except foot266118%45%
Acute pancreatitis235116%43%
Intestinal infection due to Clostridium difficile223017%33%
Malignant neoplasm of bronchus and lung, unspecified223816%35%
Acute on chronic diastolic heart failure224516%39%
Encounter for antineoplastic chemotherapy219615%48%
Dehydration197813%46%
Anemia, unspecified193613%30%
Pneumonitis due to inhalation of food or vomitus192513%24%
Syncope and collapse163813%39%
Other pulmonary embolism and infarction154112%26%
Unspecified pleural effusion153710%34%
Acute respiratory failure154211%35%
Most Common Problem List Items
ProblemStudentTeamPrimary%All Patients %
Hypertension1,6653,280100%100%
Tobacco use disorder1,3502,759100%100%
Unknown cause morbidity/mortality1,1542,370100%100%
Hyperlipidemia1,0362,04499%100%
Diabetes mellitus 2 without complication8651,709100%100%
Chronic airway obstruction6001,132100%100%
Esophageal reflux5831,13199%100%
Depressive disorder5101,005100%100%
Dermatophytosis of nail49893998%100%
Alcohol dependence44196697%100%
Chronic ischemic heart disease38575895%100%
Osteoarthritis38379196%100%
Lumbago35769297%100%
Current useanticoagulation34262994%100%
Anemia33767497%100%
Inhibited sex excitement31761091%100%
Congestive heart failure29455191%100%
Peripheral vascular disease28852988%99%
Sensorineural hearing loss28053588%99%
Post‐traumatic stress disorder27452891%100%
Pure hypercholesterolemia26252188%100%
Coronary atherosclerosis25939687%95%
Obesity24650989%99%
Atrial fibrillation23646985%100%
Gout21638985%100%
Most Common Medications Prescribed
MedicationStudentTeamPrimary%All Patients %
Omeprazole1,3722,98199%100%
Heparin1,0672,27195%96%
Sodium chloride 0.9%9252,03699%100%
Aspirin8441,78298%100%
Potassium chloride7071,38799%100%
Metoprolol tartrate6931,31898%100%
Insulin regular6921,51899%100%
Acetaminophen6691,35198%100%
Simvastatin6481,40899%100%
Lisinopril5821,30998%100%
Furosemide5771,18698%100%
Docusate sodium5411,12798%100%
Vancomycin53197798%100%
Multivitamin4781,07496%100%
Piperacillin/tazobactam47078198%100%
Selected examples    
Prednisone30561393%100%
Insulin glargine24449281%98%
Spironolactone16738073%98%
Digoxin6812540%77%
Meropenem162111%24%
Common Laboratory Tests (Proxy)
Lab TestStudentTeamPrimary%All Patients %
  • NOTE: Abbreviations:SGOT, serum glutamic oxaloacetic transaminase; WBC, white blood cell.

Fingerstick glucose12,86924,946100%100%
Renal panel (serum sodium)7,72814,504100%100%
Complete blood count (blood hematocrit)7,37214,188100%100%
International normalized ratio3,7256,259100%100%
Liver function tests (serum SGOT)1,5703,18099%100%
Urinalysis (urine nitrite)7891,537100%100%
Arterial blood gas (arterial blood pH)76770478%99%
Hemoglobin A1C4851,17796%100%
Fractional excretion of sodium (urine creatinine)33667785%99%
Lactic acid19531465%96%
Ferritin19341374%99%
Thyroid‐stimulating hormone18439155%64%
Lipase15731758%91%
Hepatitis C antibody13932770%98%
Haptoglobin10120846%83%
B‐type natriuretic peptide9821248%87%
Cortisol7011934%60%
Rapid plasma reagin7017344%82%
Urine legionella antigen7012638%64%
D‐dimer5911134%72%
Digoxin456918%39%
Paracentesis labs (peritoneal fluid total protein)344716%34%
Thoracentesis labs (pleural fluid WBC count)334220%38%
C‐reactive protein306517%34%
Lumbar puncture labs (cerebrospinal fluid WBC count)225711%27%
Arthrocentesis (synovial fluid WBC count)14239%23%
Most Common Radiology Tests
Radiology TestStudentTeamPrimary%All Patients %
  • NOTE: Abbreviations: CT, computed tomography; KUB, kidney, ureter, and bladder; MRI, magnetic resonance imaging; PA, posteroanterior; PE, pulmonary embolism;PET, positron‐emission tomography.

Chest,2 views,PA and lateral9381,955100%100%
Chest portable41475196%100%
CT head without contrast23549982%100%
CT abdomen with contrast21836559%71%
CT pelvis with contrast21336459%70%
CT chest with contrast16335175%99%
Ultrasound kidney, bilateral11920861%92%
Abdomen 1 view10722059%93%
Ultrasound liver10018348%82%
Modified barium swallow9313053%82%
PET scan9318149%79%
Selected examples    
Acute abdomen series8517748%81%
CT chest, PE protocol6712637%73%
MRI brain with andwithout contrast5610934%66%
Chest decubitus517634%60%
Portable KUBfor Dobhoff placement426230%48%
Ventilation/perfusion lung scan152512%27%
Ultrasound thyroid8165%17%

Distinct Patients and Progress Notes

The mean number of progress notes written by a student was 67.2 (standard deviation [SD] 16.3). The mean number of distinct patients evaluated by a student during a rotation was 18.4 (SD 4.2). The mean number of team admissions per student rotation was 46.7 (SD 9.6) distinct patients.

Primary Diagnoses

A total of 2213 primary diagnoses were documented on patients assigned to students on AI rotations. A total of 5323 primary diagnoses were documented on patients assigned to other members of the team during the students' rotations. Therefore, the mean number of primary diagnoses seen by a student during a rotation was 58.9 (17.3 primary diagnoses for student‐assigned patients and 41.6 primary diagnoses for team patients). The students and teams encountered similar diagnoses (Table 1).

Problem List

Students and teams evaluated a total of 40,015 and 78,643 past medical problems, respectively. The mean number of problems seen by a student during a rotation was 927 (313 student, 614 team). Table 2 reports the most frequent problems assigned to primary student admissions. Students and teams evaluated similar problems. Hepatitis C (196 student, 410 team) was the only team problem that was in the team top 25 but not in the student top 25.

Medications

A total of 38,149 medications were prescribed to the students' primary patients. A total of 77,738 medications were prescribed to patients assigned to the rest of the team. The mean number of medication exposures for a student during a rotation was 905 (298 student, 607 team). The most frequently prescribed medications were similar between student and the team (Table 3). Team medications that were in the top 25 but not in the student top 25 included: hydralazine (300 student, 629 team), prednisone (305 student, 613 team), and oxycodone/acetaminophen (286 student, 608 team).

Labs

All laboratory tests with reported results were tallied. For common laboratory panels, single lab values (eg, serum hematocrit for a complete blood count) were selected as proxies to count the number of studies completed and evaluated. Table 4 shows a cross‐section of laboratory tests evaluated during AI rotations.

Radiology

A total of 6197 radiology tests were completed on patients assigned to students, whereas 11,761 radiology tests were completed on patients assigned to other team members. The mean number of radiology exposures for a student was 140 (48 student, 92 team). The most frequently seen radiology tests were similar between student and the team (Table 5).

DISCUSSION

As medical educators, we assume that the clinical training years allow learners to develop essential skills through their varied clinical experiences. Through exposure to direct patient care, to medical decision‐making scenarios, and to senior physician management practices, trainees build the knowledge base for independent practice. To ensure there is sufficient clinical exposure, data on what trainees are encountering may prove beneficial.

In this novel study, we quantified what learners encounter during a 1‐month team‐based inpatient rotation at a large teaching hospital. We effectively measured a number of aspects of internal medicine inpatient training that have been difficult to quantify in the past. The ability to extract learner‐specific data is becoming increasingly available in academic teaching hospitals. For example, VA medical centers have available a daily updated national data warehouse. The other steps necessary for using learner‐specific data include an understanding of the local inpatient processhow tests are ordered, what note titles are used by traineesas well as someone able to build the queries necessary for data extraction. Once built, data extraction should be able to continue as an automated process and used in real time by medical educators.

Our method of data collection has limitations. The orders placed on a learner's primary patients may not have been placed by the learner. For example, orders may have been placed by an overnight resident cross‐covering the learner's patients. We assumed that learners evaluated the results of all tests (or medication changes) that occurred at any time during their rotation, including cross‐cover periods or days off. In addition, our method for evaluating team exposure underestimates the number of team patients calculated for each learner by limiting the query only to patients whose hospital stay was completed before the student left the inpatient service. It is also difficult to know the how many of the exposures are realized by the learner. Differences in learner attention, contrasts in rounding styles, and varying presentation methods will affect the number of exposures truly attained by the learner. Finally, not all clinical exposures can be evaluated through review of an EMR. Clinical experiences, such as care coordination, patient education, and family counseling, cannot be easily extracted.

Data mining EMRs can enhance clinical medical education. Although our data collection was completed retrospectively, we could easily provide learner‐specific data in real time to ward attendings, chief residents, and program directors. This information could direct the development of teaching tools and individualization of curricula. Perhaps, even more importantly, it would also allow educators to define curricular gaps. Whether these gaps are due to the particular patient demographics of a medical center, the practice patterns and strengths of a particular institution, or career interests of a trainee, these gaps may skew the patient‐care experiences encountered by individual trainees. We can use these data to identify differences in clinical experience and then develop opportunities for learnersclinical, didactic, or simulatedto address deficiencies and provide well‐rounded clinical experiences.

Further investigation to better understand the relationship between direct patient‐care experience and clinical skill acquisition is needed. This information could help guide the development of standards on the number of exposures we expect our learners to have with different diagnostic or treatment modalities prior to independent practice. Using learner data to better understand the clinical experiences of our medical trainees, we can hopefully develop more precise and focused curricula to ensure we produce competent graduates.

Acknowledgments

This material is the result of work supported with resources and the use of facilities at the Louis Stokes Cleveland VA Medical Center. The views expressed in this article are those of the authors and do not necessarily reflect the position or policy of the Department of Veterans Affairs.

The clinical learning model in medical education, specifically in the third and fourth years of medical school and in residency and fellowship training, is driven by direct patient‐care experiences and complemented by mentorship and supervision provided by experienced physicians.[1] Despite the emphasis on experiential learning in medical school and graduate training, the ability of educators to quantify the clinical experiences of learners has been limited. Case logs, often self‐reported, are frequently required during educational rotations to attempt to measure clinical experience.[2] Logs have been utilized to document diagnoses, demographics, disease severity, procedures, and chief complaints.[3, 4, 5, 6] Unfortunately, self‐reported logs are vulnerable to delayed updates, misreported data, and unreliable data validation.[7, 8] Automated data collection has been shown to be more reliable than self‐reported logs.[8, 9]

The enhanced data mining methods now available allow educators to appraise learners' exposures during patient‐care interactions beyond just the diagnosis or chief complaint (eg, how many electrocardiograms do our learners evaluate during a cardiology rotation, how often do our learners gain experience prescribing a specific class of antibiotics, how many of the patients seen by our learners are diabetic). For example, a learner's interaction with a patient during an inpatient admission for community‐acquired pneumonia, at minimum, would include assessing of past medical history, reviewing outpatient medications and allergies, evaluating tests completed (chest x‐ray, complete blood count, blood cultures), prescribing antibiotics, and monitoring comorbidities. The lack of knowledge regarding the frequency and context of these exposures is a key gap in our understanding of the clinical experience of inpatient trainees. Additionally, there are no data on clinical exposures specific to team‐based inpatient learning. When a rotation is team‐based, the educational experience is not limited to the learner's assigned patients, and this arrangement allows for educational exposures from patients who are not the learner's primary assignments through experiences gained during team rounds, cross‐coverage assessments, and informal discussions of patient care.

In this study, we quantify the clinical exposures of learners on an acting internship (AI) rotation in internal medicine by utilizing the Veterans Affairs (VA) electronic medical records (EMR) as collected through the VA Veterans Integrated Service Network 10 Clinical Data Warehouse (CDW). The AI or subinternship is a medical school clinical rotation typically completed in the fourth year, where the learning experience is expected to mirror a 1‐month rotation of a first‐year resident.[10] The AI has historically been defined as an experiential curriculum, during which students assume many of the responsibilities and activities that they will manage as graduate medical trainees.[10, 11] The exposures of AI learners include primary diagnoses encountered, problem lists evaluated at the time of admission, medications prescribed, laboratory tests ordered, and radiologic imaging evaluated. We additionally explored the exposures of the AI learner's team to assess the experiences available through team‐based care.

METHODS

This study was completed at the Louis Stokes Veterans Affairs Medical Center (LSVAMC) in Cleveland, Ohio, which is an academic affiliate of the Case Western Reserve University School of Medicine. The study was approved by the LSVAMC institutional review board.

At the LSVAMC, the AI rotation in internal medicine is a 4‐week inpatient rotation for fourth‐year medical students, in which the student is assigned to an inpatient medical team consisting of an attending physician, a senior resident, and a combination of first‐year residents and acting interns. Compared to a first‐year resident, the acting intern is assigned approximately half of the number of admissions. The teams rounds as a group at least once per day. Acting interns are permitted to place orders and write notes in the EMR; all orders require a cosignature by a resident or attending physician to be released.

We identified students who rotated through the LSVAMC for an AI in internal medicine rotation from July 2008 to November 2011 from rotation records. Using the CDW, we queried student names and their rotation dates and analyzed the results using a Structured Query Language Query Analyzer. Each student's patient encounters during the rotation were identified. A patient encounter was defined as a patient for whom the student wrote at least 1 note titled either Medicine Admission Note or Medicine Inpatient Progress Note, on any of the dates during their AI rotation. We then counted the total number of notes written by each student during their rotation. A patient identifier is associated with each note. The number of distinct patient identifiers was also tallied to establish the total number of patients seen during the rotation by the individual student as the primary caregiver.

We associated each patient encounter with an inpatient admission profile that included patient admission and discharge dates, International Classification of Diseases, 9th Revision (ICD‐9) diagnosis codes, and admitting specialty. Primary diagnosis codes were queried for each admission and were counted for individual students and in aggregate. We tallied both the individual student and aggregate patient medications prescribed during the dates of admission and ordered to a patient location consistent with an acute medical ward (therefore excluding orders placed if a patient was transferred to an intensive care unit). Similar queries were completed for laboratory and radiological testing.

The VA EMR keeps an active problem list on each patient, and items are associated with an ICD‐9 code. To assemble the active problems available for evaluation by the student on the day of a patient's admission, we queried all problem list items added prior to, but not discontinued before, the day of admission. We then tallied the results for every patient seen by each individual student and in aggregate.

To assess the team exposures for each AI student, we queried all discharge summaries cosigned by the student's attending during the dates of the student's rotation. We assumed the student's team members wrote these discharge summaries. After excluding the student's patients, the resultant list represented the team patient exposures for each student. This list was also queried for the number of patients seen, primary diagnoses, medications, problems, labs, and radiology. The number of team admissions counted included all patients who spent at least 1 day on the team while the student was rotating. All other team exposure counts completed included only patients who were both admitted and discharged within the dates of the student's rotation.

RESULTS

An AI rotation is 4 weeks in duration. Students competed a total of 128 rotations from July 30, 2008 through November 21, 2011. We included all rotations during this time period in the analysis. Tables 1, 2, 3, 4, 5 report results in 4 categories. The Student category tallies the total number of specific exposures (diagnoses, problems, medications, lab values, or radiology tests) for all patients primarily assigned to a student. The Team category tallies the total number of exposures for all patients assigned to other members of the student's inpatient team. The Primary % category identifies the percentage of students who had at least 1 assigned patient with the evaluated clinical exposure. The All Patients % category identifies the percentage of students who had at least 1 student‐assigned patient or at least 1 team‐assigned patient with the evaluated clinical exposure.

Most Common Primary Diagnoses
DiagnosisStudentTeamPrimary%All Patients %
Obstructive chronic bronchitis, with acute exacerbation10224157%91%
Pneumonia, organism unspecified9122849%91%
Acute renal failure, unspecified7317046%83%
Urinary tract infection, site not specified6914943%87%
Congestive heart failure, unspecified6511441%68%
Alcohol withdrawal4610126%61%
Alcoholic cirrhosis of liver289816%57%
Cellulitis and abscess of leg, except foot266118%45%
Acute pancreatitis235116%43%
Intestinal infection due to Clostridium difficile223017%33%
Malignant neoplasm of bronchus and lung, unspecified223816%35%
Acute on chronic diastolic heart failure224516%39%
Encounter for antineoplastic chemotherapy219615%48%
Dehydration197813%46%
Anemia, unspecified193613%30%
Pneumonitis due to inhalation of food or vomitus192513%24%
Syncope and collapse163813%39%
Other pulmonary embolism and infarction154112%26%
Unspecified pleural effusion153710%34%
Acute respiratory failure154211%35%
Most Common Problem List Items
ProblemStudentTeamPrimary%All Patients %
Hypertension1,6653,280100%100%
Tobacco use disorder1,3502,759100%100%
Unknown cause morbidity/mortality1,1542,370100%100%
Hyperlipidemia1,0362,04499%100%
Diabetes mellitus 2 without complication8651,709100%100%
Chronic airway obstruction6001,132100%100%
Esophageal reflux5831,13199%100%
Depressive disorder5101,005100%100%
Dermatophytosis of nail49893998%100%
Alcohol dependence44196697%100%
Chronic ischemic heart disease38575895%100%
Osteoarthritis38379196%100%
Lumbago35769297%100%
Current useanticoagulation34262994%100%
Anemia33767497%100%
Inhibited sex excitement31761091%100%
Congestive heart failure29455191%100%
Peripheral vascular disease28852988%99%
Sensorineural hearing loss28053588%99%
Post‐traumatic stress disorder27452891%100%
Pure hypercholesterolemia26252188%100%
Coronary atherosclerosis25939687%95%
Obesity24650989%99%
Atrial fibrillation23646985%100%
Gout21638985%100%
Most Common Medications Prescribed
MedicationStudentTeamPrimary%All Patients %
Omeprazole1,3722,98199%100%
Heparin1,0672,27195%96%
Sodium chloride 0.9%9252,03699%100%
Aspirin8441,78298%100%
Potassium chloride7071,38799%100%
Metoprolol tartrate6931,31898%100%
Insulin regular6921,51899%100%
Acetaminophen6691,35198%100%
Simvastatin6481,40899%100%
Lisinopril5821,30998%100%
Furosemide5771,18698%100%
Docusate sodium5411,12798%100%
Vancomycin53197798%100%
Multivitamin4781,07496%100%
Piperacillin/tazobactam47078198%100%
Selected examples    
Prednisone30561393%100%
Insulin glargine24449281%98%
Spironolactone16738073%98%
Digoxin6812540%77%
Meropenem162111%24%
Common Laboratory Tests (Proxy)
Lab TestStudentTeamPrimary%All Patients %
  • NOTE: Abbreviations:SGOT, serum glutamic oxaloacetic transaminase; WBC, white blood cell.

Fingerstick glucose12,86924,946100%100%
Renal panel (serum sodium)7,72814,504100%100%
Complete blood count (blood hematocrit)7,37214,188100%100%
International normalized ratio3,7256,259100%100%
Liver function tests (serum SGOT)1,5703,18099%100%
Urinalysis (urine nitrite)7891,537100%100%
Arterial blood gas (arterial blood pH)76770478%99%
Hemoglobin A1C4851,17796%100%
Fractional excretion of sodium (urine creatinine)33667785%99%
Lactic acid19531465%96%
Ferritin19341374%99%
Thyroid‐stimulating hormone18439155%64%
Lipase15731758%91%
Hepatitis C antibody13932770%98%
Haptoglobin10120846%83%
B‐type natriuretic peptide9821248%87%
Cortisol7011934%60%
Rapid plasma reagin7017344%82%
Urine legionella antigen7012638%64%
D‐dimer5911134%72%
Digoxin456918%39%
Paracentesis labs (peritoneal fluid total protein)344716%34%
Thoracentesis labs (pleural fluid WBC count)334220%38%
C‐reactive protein306517%34%
Lumbar puncture labs (cerebrospinal fluid WBC count)225711%27%
Arthrocentesis (synovial fluid WBC count)14239%23%
Most Common Radiology Tests
Radiology TestStudentTeamPrimary%All Patients %
  • NOTE: Abbreviations: CT, computed tomography; KUB, kidney, ureter, and bladder; MRI, magnetic resonance imaging; PA, posteroanterior; PE, pulmonary embolism;PET, positron‐emission tomography.

Chest,2 views,PA and lateral9381,955100%100%
Chest portable41475196%100%
CT head without contrast23549982%100%
CT abdomen with contrast21836559%71%
CT pelvis with contrast21336459%70%
CT chest with contrast16335175%99%
Ultrasound kidney, bilateral11920861%92%
Abdomen 1 view10722059%93%
Ultrasound liver10018348%82%
Modified barium swallow9313053%82%
PET scan9318149%79%
Selected examples    
Acute abdomen series8517748%81%
CT chest, PE protocol6712637%73%
MRI brain with andwithout contrast5610934%66%
Chest decubitus517634%60%
Portable KUBfor Dobhoff placement426230%48%
Ventilation/perfusion lung scan152512%27%
Ultrasound thyroid8165%17%

Distinct Patients and Progress Notes

The mean number of progress notes written by a student was 67.2 (standard deviation [SD] 16.3). The mean number of distinct patients evaluated by a student during a rotation was 18.4 (SD 4.2). The mean number of team admissions per student rotation was 46.7 (SD 9.6) distinct patients.

Primary Diagnoses

A total of 2213 primary diagnoses were documented on patients assigned to students on AI rotations. A total of 5323 primary diagnoses were documented on patients assigned to other members of the team during the students' rotations. Therefore, the mean number of primary diagnoses seen by a student during a rotation was 58.9 (17.3 primary diagnoses for student‐assigned patients and 41.6 primary diagnoses for team patients). The students and teams encountered similar diagnoses (Table 1).

Problem List

Students and teams evaluated a total of 40,015 and 78,643 past medical problems, respectively. The mean number of problems seen by a student during a rotation was 927 (313 student, 614 team). Table 2 reports the most frequent problems assigned to primary student admissions. Students and teams evaluated similar problems. Hepatitis C (196 student, 410 team) was the only team problem that was in the team top 25 but not in the student top 25.

Medications

A total of 38,149 medications were prescribed to the students' primary patients. A total of 77,738 medications were prescribed to patients assigned to the rest of the team. The mean number of medication exposures for a student during a rotation was 905 (298 student, 607 team). The most frequently prescribed medications were similar between student and the team (Table 3). Team medications that were in the top 25 but not in the student top 25 included: hydralazine (300 student, 629 team), prednisone (305 student, 613 team), and oxycodone/acetaminophen (286 student, 608 team).

Labs

All laboratory tests with reported results were tallied. For common laboratory panels, single lab values (eg, serum hematocrit for a complete blood count) were selected as proxies to count the number of studies completed and evaluated. Table 4 shows a cross‐section of laboratory tests evaluated during AI rotations.

Radiology

A total of 6197 radiology tests were completed on patients assigned to students, whereas 11,761 radiology tests were completed on patients assigned to other team members. The mean number of radiology exposures for a student was 140 (48 student, 92 team). The most frequently seen radiology tests were similar between student and the team (Table 5).

DISCUSSION

As medical educators, we assume that the clinical training years allow learners to develop essential skills through their varied clinical experiences. Through exposure to direct patient care, to medical decision‐making scenarios, and to senior physician management practices, trainees build the knowledge base for independent practice. To ensure there is sufficient clinical exposure, data on what trainees are encountering may prove beneficial.

In this novel study, we quantified what learners encounter during a 1‐month team‐based inpatient rotation at a large teaching hospital. We effectively measured a number of aspects of internal medicine inpatient training that have been difficult to quantify in the past. The ability to extract learner‐specific data is becoming increasingly available in academic teaching hospitals. For example, VA medical centers have available a daily updated national data warehouse. The other steps necessary for using learner‐specific data include an understanding of the local inpatient processhow tests are ordered, what note titles are used by traineesas well as someone able to build the queries necessary for data extraction. Once built, data extraction should be able to continue as an automated process and used in real time by medical educators.

Our method of data collection has limitations. The orders placed on a learner's primary patients may not have been placed by the learner; for example, they may have been entered by an overnight resident cross-covering the learner's patients. We assumed that learners evaluated the results of all tests (or medication changes) that occurred at any time during their rotation, including cross-cover periods and days off. In addition, our method underestimates team exposure because the query was limited to patients whose hospital stay was completed before the student left the inpatient service. It is also difficult to know how many of the exposures were realized by the learner; differences in learner attention, contrasts in rounding styles, and varying presentation methods will affect the number of exposures truly attained. Finally, not all clinical exposures can be evaluated through review of an EMR. Clinical experiences such as care coordination, patient education, and family counseling cannot be easily extracted.
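
One of these limitations, the restriction to completed hospital stays, can be made concrete with a short sketch of the kind of filter that produces it; the field names below are hypothetical.

```python
# Sketch of the team-exposure restriction described above: only team patients
# whose stay ended on or before the student's last day on service are counted,
# so patients still admitted at rotation end are missed. Field names are
# hypothetical.
from datetime import date

def team_exposures(team_stays, rotation_end):
    """team_stays: list of dicts with 'patient_id' and 'discharge_date'."""
    return [
        stay["patient_id"]
        for stay in team_stays
        if stay["discharge_date"] is not None
        and stay["discharge_date"] <= rotation_end   # completed stays only
    ]

stays = [
    {"patient_id": "A", "discharge_date": date(2013, 7, 28)},
    {"patient_id": "B", "discharge_date": None},  # still admitted: excluded
]
print(team_exposures(stays, rotation_end=date(2013, 7, 31)))  # ['A']
```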

Data mining EMRs can enhance clinical medical education. Although our data collection was completed retrospectively, we could easily provide learner-specific data in real time to ward attendings, chief residents, and program directors. This information could direct the development of teaching tools and the individualization of curricula. Perhaps even more importantly, it would allow educators to define curricular gaps. Whether driven by the patient demographics of a medical center, the practice patterns and strengths of a particular institution, or the career interests of a trainee, these gaps may skew the patient-care experiences encountered by individual trainees. We can use these data to identify differences in clinical experience and then develop opportunities for learners (clinical, didactic, or simulated) to address deficiencies and provide well-rounded clinical experiences.
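
As one hedged illustration of how such data might flag gaps, the sketch below compares a learner's exposure counts against locally chosen targets. Both the targets and the condition names are invented for the example; they are not standards proposed by this study.

```python
# Illustrative only: flag diagnoses a learner has seen fewer times than a
# locally chosen target. Targets and condition names are invented examples.
def curricular_gaps(exposure_counts, targets):
    """Return {diagnosis: shortfall} for targets the learner has not met."""
    return {
        dx: target - exposure_counts.get(dx, 0)
        for dx, target in targets.items()
        if exposure_counts.get(dx, 0) < target
    }

seen = {"Heart failure": 6, "COPD exacerbation": 1}
targets = {"Heart failure": 4, "COPD exacerbation": 3, "Diabetic ketoacidosis": 2}
print(curricular_gaps(seen, targets))
# {'COPD exacerbation': 2, 'Diabetic ketoacidosis': 2}
```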

Further investigation to better understand the relationship between direct patient‐care experience and clinical skill acquisition is needed. This information could help guide the development of standards on the number of exposures we expect our learners to have with different diagnostic or treatment modalities prior to independent practice. Using learner data to better understand the clinical experiences of our medical trainees, we can hopefully develop more precise and focused curricula to ensure we produce competent graduates.

Acknowledgments

This material is the result of work supported with resources and the use of facilities at the Louis Stokes Cleveland VA Medical Center. The views expressed in this article are those of the authors and do not necessarily reflect the position or policy of the Department of Veterans Affairs.

References
  1. Accreditation Council for Graduate Medical Education. Program requirements for graduate medical education in internal medicine. Available at: http://www.acgme.org/acgmeweb/Portals/0/PFAssets/2013-PR-FAQ-PIF/140_internal_medicine_07012013.pdf. Originally accessed December 18, 2012.
  2. Kasten SJ, Prince ME, Lypson ML. Residents make their lists and program directors check them twice: reviewing case logs. J Grad Med Educ. 2012;34:257-260.
  3. Mattana J, Kerpen H, Lee C, et al. Quantifying internal medicine resident clinical experience using resident-selected primary diagnosis codes. J Hosp Med. 2011;6(7):395-400.
  4. Rattner SL, Louis DZ, Rabinowitz C, et al. Documenting and comparing medical students' clinical experiences. JAMA. 2001;286:1035-1040.
  5. Sequist TD, Singh S, Pereira AG, Rusinak D, Pearson SD. Use of an electronic medical record to profile the continuity clinic experiences of primary care residents. Acad Med. 2005;80:390-394.
  6. Iglar K, Polsky J, Glazier R. Using a Web-based system to monitor practice profiles in primary care residency training. Can Fam Physician. 2011;57:1030-1037.
  7. Nagler J, Harper MB, Bachur RG. An automated electronic case log: using electronic information systems to assess training in emergency medicine. Acad Emerg Med. 2006;13:733-739.
  8. Simpao A, Heitz JW, McNulty SE, Chekemian B, Bren BR, Epstein RH. The design and implementation of an automated system for logging clinical experiences using an anesthesia information management system. Anesth Analg. 2011;112(2):422-429.
  9. Nkoy FL, Petersen S, Matheny Antommaria AH, Maloney CG. Validation of an electronic system for recording medical student patient encounters. AMIA Annu Symp Proc. 2008;2008:510-514.
  10. Sidlow R. The structure and content of the medical subinternship: a national survey. J Gen Intern Med. 2001;16:550-553.
  11. Jolly BC, MacDonald MM. Education for practice: the role of practical experience in undergraduate and general clinical training. Med Educ. 1989;23:189-195.
Issue
Journal of Hospital Medicine - 9(7)
Page Number
436-440
Article Source

© 2014 Society of Hospital Medicine
Correspondence Location
Address for correspondence and reprint requests: Todd I. Smith, MD, 10701 East Blvd 111(W), Cleveland, Ohio 44106; E‐mail: [email protected]