“If your time to you is worth savin’
Then you better start swimmin’ or you’ll sink like a stone
For the times they are a-changin’...”
–Bob Dylan
The Accreditation Council for Graduate Medical Education requires residency programs to limit and track trainee work hours to reduce the risk of fatigue, burnout, and medical errors. These hours are most often documented by self-report, which adds administrative burden for trainees and programs, is of dubious accuracy, and may even incentivize misrepresentation.1
Thus, the study by Soleimani and colleagues2 in this issue is a welcome addition to the literature on duty-hours tracking. Using timestamp data from the electronic health record (EHR), the authors developed, and collected validity evidence for, an automated algorithm to measure how much time trainees spend on clinical work. The study was conducted at a large academic internal medicine residency program and tracked 203 trainees across 14,610 days, and the authors compared their results with trainee self-report data. Though the approach centered on EHR access logs, it accommodated common scenarios of time away from the computer while at the hospital (eg, during patient rounds). Crucially, the algorithm also captured EHR access from home. The absolute discrepancy between the algorithm and self-report averaged 1.38 hours per day; notably, EHR work at home accounted for roughly an extra hour per day. When considering in-hospital work alone, the authors found that 3% to 13% of trainees exceeded the 80-hour workweek limit; when out-of-hospital work was added, this proportion rose to 10% to 21%.
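For readers curious how such an algorithm might be assembled from raw audit-log data, the sketch below illustrates one plausible approach: classify log events as active clinical work or passive review, stitch in-hospital events into continuous sessions that tolerate gaps away from the keyboard (eg, during rounds), and tally at-home active time separately. The event names, gap tolerance, and at-home credit rule here are illustrative assumptions for this sketch, not the authors' published specification.

# A minimal, hypothetical sketch of EHR access-log sessionization; all
# action names and thresholds are assumptions, not the study's actual rules.
from dataclasses import dataclass
from datetime import datetime, timedelta

ACTIVE_ACTIONS = {"note_edit", "order_entry", "result_acknowledge"}  # assumed "active work"
PASSIVE_ACTIONS = {"chart_review"}              # excluded as "educational study"
GAP_TOLERANCE = timedelta(minutes=90)           # assumed bridge for time at the bedside

@dataclass
class LogEvent:
    timestamp: datetime
    action: str          # eg "note_edit", "chart_review"
    in_hospital: bool    # derived from workstation location, another assumption

def daily_duty_hours(events: list[LogEvent]) -> dict[str, float]:
    """Estimate in-hospital and at-home duty hours for one trainee-day."""
    active = sorted((e for e in events if e.action in ACTIVE_ACTIONS),
                    key=lambda e: e.timestamp)
    in_hosp = timedelta()
    at_home = timedelta()
    session_start = session_end = None

    for e in active:
        if e.in_hospital:
            if session_start is None:
                session_start = session_end = e.timestamp
            elif e.timestamp - session_end <= GAP_TOLERANCE:
                session_end = e.timestamp        # bridge the off-keyboard gap
            else:
                in_hosp += session_end - session_start
                session_start = session_end = e.timestamp
        else:
            # At-home events get a fixed credit window per event,
            # a simplifying assumption for this sketch.
            at_home += timedelta(minutes=5)

    if session_start is not None:
        in_hosp += session_end - session_start   # close the last open session

    return {"in_hospital_hours": in_hosp.total_seconds() / 3600,
            "out_of_hospital_hours": at_home.total_seconds() / 3600}

if __name__ == "__main__":
    day = [
        LogEvent(datetime(2021, 3, 1, 7, 0), "note_edit", True),
        LogEvent(datetime(2021, 3, 1, 8, 15), "order_entry", True),   # 75-min gap: bridged
        LogEvent(datetime(2021, 3, 1, 12, 0), "chart_review", True),  # passive: excluded
        LogEvent(datetime(2021, 3, 1, 21, 0), "note_edit", False),    # at home
    ]
    print(daily_duty_hours(day))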
The authors used inventive methods to improve accuracy. They prespecified the EHR functions that constituted active clinical work, classifying passive chart review (reading without editing notes or placing orders) as “educational study,” which they excluded from duty hours. They ensured that time spent off-site was included and that logins from personal devices while in the hospital were not double-counted.

Caveats to the study include limited generalizability to institutions without the computational resources to replicate the model. The authors acknowledged the inherent flaw of using trainee self-report as the “gold standard”; a subset of the results could potentially have been corroborated with time-motion observation studies.3 The decision to exclude passive medical record review at home as work arguably discounts the integral value of the “chart biopsy” for direct patient care; it probably led to systematic underestimation of duty hours for junior and senior residents, who may be most likely to contribute in this way. Similarly, not counting time spent with patients at the end of the day, after sign-out, risks undercounting hours as well. Nonetheless, this study represents a rigorously designed and scalable approach to meeting regulatory requirements that can potentially lighten the administrative task load for trainees, improve reporting accuracy, and facilitate research comparing work hours with other variables of interest (eg, efficiency). The model can be generalized to other specialties and could document workload for staff physicians as well.
Merits of the study aside, the algorithm underscores troubling realities about the practice of medicine in the 21st century. Do we now equate clinical work with time on the computer? Is our contribution as physicians defined primarily by our presence at the keyboard rather than at the bedside?4 Future research facilitated by automated hours tracking is likely to further elucidate the connection between time spent in the EHR and both burnout4 and job dissatisfaction, and the premise of this study is emblematic of the erosion of clinical work-life boundaries that began even before the pandemic.5 While the “times they are a-changin’,” in this respect it may not be for the better.
1. Grabski DF, Goudreau BJ, Gillen JR, et al. Compliance with the Accreditation Council for Graduate Medical Education duty hours in a general surgery residency program: challenges and solutions in a teaching hospital. Surgery. 2020;167(2):302-307. https://doi.org/10.1016/j.surg.2019.05.029
2. Soleimani H, Adler-Milstein J, Cucina RJ, Murray SG. Automating measurement of trainee work hours. J Hosp Med. 2021;16(7):404-408. https://doi.org/10.12788/jhm.3607
3. Tipping MD, Forth VE, O’Leary KJ, et al. Where did the day go?—a time-motion study of hospitalists. J Hosp Med. 2010;5(6):323-328. https://doi.org/10.1002/jhm.790
4. Gardner RL, Cooper E, Haskell J, et al. Physician stress and burnout: the impact of health information technology. J Am Med Inform Assoc. 2019;26(2):106-114. https://doi.org/10.1093/jamia/ocy145
5. Saag HS, Shah K, Jones SA, Testa PA, Horwitz LI. Pajama time: working after work in the electronic health record. J Gen Intern Med. 2019;34(9):1695-1696. https://doi.org/10.1007/s11606-019-05055-x
© 2021 Society of Hospital Medicine