Improving the Transition of Intravenous to Enteral Antibiotics in Pediatric Patients with Pneumonia or Skin and Soft Tissue Infections
Intravenous (IV) antibiotics are commonly used in hospitalized pediatric patients to treat bacterial infections. Antimicrobial stewardship guidelines published by the Infectious Diseases Society of America (IDSA) recommend that institutions develop a systematic plan for converting from IV to enteral antibiotics, as early transition may reduce healthcare costs, decrease length of stay (LOS), and avoid complications of prolonged IV access1 such as extravasation, thrombosis, and catheter-associated infections.2-5
Pediatric patients with community-acquired pneumonia (CAP) or mild skin and soft tissue infections (SSTI) may not require IV antibiotics, even when hospitalized.6 Although national guidelines for pediatric CAP and SSTI recommend IV antibiotics for hospitalized patients, these guidelines state that mild infections may be treated with enteral antibiotics and emphasize discontinuing IV antibiotics when the patient meets discharge criteria.7,8 Furthermore, several enteral antibiotics used to treat CAP and SSTI either have excellent bioavailability (>90%), such as cephalexin and clindamycin,9 or achieve concentrations sufficient to attain the pharmacodynamic target, such as amoxicillin and trimethoprim–sulfamethoxazole.10,11 Nonetheless, the guidelines do not explicitly outline criteria for transitioning from IV to enteral antibiotics.7,8
At our institution, patients admitted to Hospital Medicine (HM) often remained on IV antibiotics until discharge. Data review revealed that antibiotic treatment of CAP and SSTI offered the greatest opportunity for early conversion to enteral therapy, given the high frequency of these admissions and the ability of commonly used enteral antibiotics to attain pharmacodynamic targets. We sought to change practice culture by decoupling the transition to enteral antibiotics from discharge and instead using administration of other enteral medications as an objective indicator of readiness for transition. Our aim was to increase the proportion of enterally administered antibiotic doses for HM patients aged >60 days admitted with uncomplicated CAP or SSTI from 44% to 75% within eight months.
METHODS
Context
Cincinnati Children’s Hospital Medical Center (CCHMC) is a large, urban, academic hospital. The HM division has 45 attendings and admits >8,000 general pediatric patients annually. The five HM teams at the main campus consist of attendings, fellows, residents, and medical students. One HM team serves as the resident quality improvement (QI) team, in which residents collaborate on a longitudinal study under the guidance of QI-trained coaches. The focus of this QI initiative was determined by resident consensus and aligned with a high-value care curriculum.12
To identify the target patient population, we examined the IV antimicrobials most frequently used in HM patients. Ampicillin and clindamycin are commonly used IV antibiotics, most frequently prescribed for CAP and SSTI, respectively, and account for half of all antibiotic use on the HM service. Amoxicillin, the enteral equivalent of ampicillin, can achieve concentrations sufficient to attain the pharmacodynamic target at infection sites, and clindamycin has high bioavailability, making both ideal options for early transition. Our institution’s robust antimicrobial stewardship program has published local guidelines recommending amoxicillin as the enteral antibiotic of choice for uncomplicated CAP, but these guidelines do not address the timing of transition for either CAP or SSTI; that decision is left to the clinical team.
HM attendings were surveyed to determine the criteria used to transition from IV to enteral antibiotics for patients with CAP or SSTI. The survey illustrated practice variability, with providers using differing clinical criteria to determine the timing of transition. Additionally, only 49% of respondents (n = 37) rated themselves as “very comfortable” with residents making autonomous decisions to transition to enteral antibiotics. Given the low-risk patient population and the ability of the enteral antibiotics commonly used for CAP and SSTI to achieve pharmacodynamic targets, we chose the administration of other enteral medications, rather than discharge readiness, as an objective indicator of a patient’s readiness to transition to enteral antibiotics.
The study population included patients aged >60 days admitted to HM with CAP or SSTI treated with any antibiotic. We excluded patients with potential complications or significant progression of their disease process, including patients with parapneumonic effusions or chest tubes, patients who underwent bronchoscopy, and patients with osteomyelitis, septic arthritis, or preseptal or orbital cellulitis. Past medical history and clinical status on admission were not used to exclude patients.
Interventions
Our multidisciplinary team, formed in January 2017, included HM attendings, HM fellows, pediatric residents, a critical care attending, a pharmacy resident, and an antimicrobial stewardship pharmacist. Under the guidance of QI coaches, the residents on the HM QI team developed and tested all interventions on their team and then determined which interventions would spread to the other four teams. The nursing director of our primary HM unit disseminated project updates to bedside nurses. A simplified failure mode and effects analysis identified areas for improvement and potential interventions. Interventions focused on the following key drivers (Figure 1): increased prescriber awareness of medication charge, standardization of conversion from IV to enteral antibiotics, clear definition of the patients ready for transition, ongoing evaluation of the antimicrobial plan, timely recognition by prescribers of patients ready for transition, culture shift regarding the appropriate administration route in the inpatient setting, and transparency of data. The team implemented sequential Plan-Do-Study-Act (PDSA) cycles13 to test the interventions.
Charge Table
To raise awareness of the higher charges for commonly used IV medications compared with their enteral formulations, a table comparing relative charges was shared during monthly resident morning conferences and at an HM faculty meeting. The table compared charges for IV ampicillin versus enteral amoxicillin and for IV versus enteral clindamycin.
Standardized Language in Electronic Health Record (EHR) Antibiotic Plan on Rounds
Standardized language to document antibiotic transition plans was added to admission and progress note templates in the EHR. The standard template prompted residents to (1) define clinical transition criteria, (2) discuss attending comfort with transition overnight (based on survey results), and (3) document patient preference for solid or liquid dosage forms. Plans were reviewed and updated daily. We hypothesized that because residents use the information in the daily progress notes, including assessments and plans, to present on rounds, inclusion of the transition criteria in the note would prompt discussion of transition plans.
Communication Bundle
To promote early transition to enteral antibiotics, we standardized the discussion about antibiotic transition between residents and attendings. During a weekly preexisting meeting, the resident QI team reviewed preferences for transitions with the new service attending. By identifying attending preferences early, residents were able to proactively transition patients who met the criteria (eg, antibiotic transition in the evening instead of waiting until morning rounds). This discussion also provided an opportunity to engage service attendings in the QI efforts, which were also shared at HM faculty meetings quarterly.
Because discussion of patient plans may be abbreviated during rounds in times of high census, residents were asked to identify all patients on IV antibiotics while reviewing medication orders before rounds. As part of an existing daily prerounds huddle to discuss rounding logistics, residents listed all patients on IV antibiotics and discussed which patients were ready for transition. If patients could not be transitioned immediately, the team identified the transition criteria.
At preexisting evening huddles between overnight shift HM residents and the evening HM attending, residents identified patients who were prescribed IV antibiotics and discussed readiness for enteral transition. If a patient could be transitioned overnight, enteral antibiotic orders were placed. Overnight residents were also encouraged to review the transition criteria with families upon admission.
Real-time Identification of Failures and Feedback
For two weeks, the EHR was queried daily to identify patients admitted with uncomplicated CAP or SSTI who were receiving antibiotics as well as other enteral medications. A failure was defined as an IV antibiotic dose given to a patient who had been administered any enteral medication. Whenever a failed transition was identified, residents on the QI team approached the residents on the relevant HM team to learn the reasons for the failure.
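As a minimal sketch only, the failure rule can be expressed as a simple predicate; the parameter names below are hypothetical, and in the study this rule was applied through the EHR query rather than code like this:

```python
def is_failed_transition(antibiotic_route: str, received_enteral_med: bool) -> bool:
    """Failure: an IV antibiotic dose given to a patient who received any enteral medication that day."""
    return antibiotic_route == "IV" and received_enteral_med

# Example: an IV dose in a patient already receiving an enteral medication counts as a failure.
assert is_failed_transition("IV", True)
assert not is_failed_transition("enteral", True)
```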
Study of the Interventions
Data for HM patients who met the inclusion criteria were collected weekly from January 2016 through June 2018 via EHR query. We first searched the EHR for diagnoses that fit under the disease categories of pneumonia and SSTI, which generated a list of International Classification of Diseases, Ninth and Tenth Revision (ICD-9 and ICD-10) diagnosis codes (Appendix Figure 1). The query identified patients based on these codes and reported whether they received a dose of any enteral medication, excluding nystatin, sildenafil, tacrolimus, and mouthwashes, which are commonly continued while patients are nil per os because they either do not require systemic absorption or have limited parenteral alternatives. The query also reported the ordered route of administration for the included antibiotics (Appendix Figure 1).
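For illustration only, the logic of such a query can be sketched as follows. This is not the institutional query: the table and column names (encounter_id, admin_date, medication, route, is_antibiotic, icd_code), the two ICD-10 codes, and the treatment of antibiotics as excluded from the enteral-medication flag are all assumptions made for the sketch.

```python
import pandas as pd

# Hypothetical stand-ins; the actual institutional code list and exclusions differ.
CAP_SSTI_ICD_CODES = {"J18.9", "L03.90"}
EXCLUDED_ENTERAL_MEDS = {"nystatin", "sildenafil", "tacrolimus", "mouthwash"}

def build_report(encounters: pd.DataFrame, med_admins: pd.DataFrame) -> pd.DataFrame:
    """Return antibiotic administrations for included encounters, each flagged with whether
    the patient received any other qualifying enteral medication on the same day."""
    included = encounters.loc[
        encounters["icd_code"].isin(CAP_SSTI_ICD_CODES), ["encounter_id"]
    ].drop_duplicates()
    admins = med_admins.merge(included, on="encounter_id")

    # Qualifying enteral medications: enteral route, not an antibiotic, not on the exclusion list.
    enteral_days = admins.loc[
        (admins["route"] == "enteral")
        & (~admins["is_antibiotic"])
        & (~admins["medication"].str.lower().isin(EXCLUDED_ENTERAL_MEDS)),
        ["encounter_id", "admin_date"],
    ].drop_duplicates()
    enteral_days["any_enteral_med"] = True

    antibiotics = admins[admins["is_antibiotic"]]
    report = antibiotics.merge(enteral_days, on=["encounter_id", "admin_date"], how="left")
    report["any_enteral_med"] = report["any_enteral_med"].fillna(False).astype(bool)
    return report[["encounter_id", "admin_date", "medication", "route", "any_enteral_med"]]
```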
The 2016 calendar year established our baseline to account for seasonal variability. Data were reported weekly and reviewed to evaluate the impact of PDSA cycles and inform new interventions.
Measures
Our process measure was the number of enteral antibiotic doses divided by the total number of antibiotic doses given to patients receiving any enteral medication. We reasoned that if patients were well enough to take medications enterally, they could be given an enteral antibiotic that is highly bioavailable or readily achieves concentrations that attain pharmacodynamic targets. This practice change represented a culture shift, decoupling the switch to enteral antibiotics from discharge readiness. Our EHR query reported only the antibiotic doses given to patients who took an enteral medication on the day of antibiotic administration and excluded patients who received only IV medications.
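As a minimal sketch (not the study's actual code), the weekly process measure could be computed from a dose-level report like the hypothetical one sketched above; the column names route, admin_date, and any_enteral_med are assumptions carried over from that sketch.

```python
import pandas as pd

def weekly_process_measure(report: pd.DataFrame) -> pd.Series:
    """Proportion of enterally administered antibiotic doses, among antibiotic doses
    given to patients who received any enteral medication that day."""
    eligible = report[report["any_enteral_med"]].copy()   # denominator: doses in patients on any enteral med
    eligible["week"] = pd.to_datetime(eligible["admin_date"]).dt.to_period("W")
    # numerator / denominator per week: share of eligible doses given by the enteral route
    return eligible.groupby("week")["route"].apply(lambda r: (r == "enteral").mean())
```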
Outcome measures included antimicrobial costs per patient encounter, based on average wholesale prices reported in our EHR query, and LOS. To ensure that transitioning from IV to enteral antibiotics did not negatively affect patient outcomes, readmission within seven days served as a balancing measure.
Analysis
An annotated statistical process control p-chart tracked the impact of interventions on the proportion of antibiotic doses administered enterally during hospitalization. X-bar and s-charts tracked the impact of interventions on antimicrobial costs per patient encounter and on LOS. A p-chart and an encounters-between g-chart were used to evaluate the impact of our interventions on readmissions. Standard control chart rules for identifying special cause variation were used to determine centerline shifts.14
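For readers less familiar with statistical process control, the following is a generic sketch (not the authors' analysis code) of how the centerline and 3-sigma limits of a p-chart are typically computed when subgroup sizes vary, as they do with weekly dose counts; the example numbers are hypothetical.

```python
import math

def p_chart_limits(numerators: list[int], denominators: list[int]) -> list[tuple[float, float, float]]:
    """(LCL, centerline, UCL) per subgroup for a p-chart with varying subgroup sizes."""
    p_bar = sum(numerators) / sum(denominators)          # centerline: overall proportion
    limits = []
    for n in denominators:
        sigma = math.sqrt(p_bar * (1 - p_bar) / n)       # binomial standard error for subgroup of size n
        limits.append((max(0.0, p_bar - 3 * sigma), p_bar, min(1.0, p_bar + 3 * sigma)))
    return limits

# Example with hypothetical weekly counts: enteral doses out of all eligible antibiotic doses.
print(p_chart_limits(numerators=[12, 18, 25], denominators=[30, 32, 31]))
```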
Ethical Considerations
This study was part of a larger study of the residency high-value care curriculum,12 which was deemed exempt by the CCHMC IRB.
RESULTS
The baseline period included 372 patients, and the 2017 postintervention period included 326 patients (Table). Approximately two-thirds of patients had a diagnosis of CAP.
The percentage of antibiotic doses given enterally increased from 44% to 80% within eight months (Figure 2). When studying the impact of interventions, residents on the HM QI team found that the standard EHR template added to daily notes did not consistently prompt residents to discuss antibiotic plans and was therefore abandoned. Initial improvement coincided with standardizing discussions between residents and attendings regarding transitions. Furthermore, discussion of all patients on IV antibiotics during the prerounds huddle allowed for reliable, daily communication about antibiotic plans and was subsequently spread to and adopted by all HM teams. The percentage of enterally administered antibiotic doses increased to >75% after implementation of the evening huddle and of real-time failure identification with provider feedback, both of which involved all HM teams. Despite variability during weeks when the total number of antibiotic doses prescribed was low (<10), we demonstrated sustainability for 11 months (Figure 2), during which the prerounds and evening huddle discussions continued and an updated control chart was shown to residents monthly during their educational conferences.
Residents on the QI team spoke directly with other HM residents when there were missed opportunities for transition. Based on these discussions and intermittent chart reviews, common reasons for failure to transition in patients with CAP included admission for failed outpatient enteral treatment, recent evaluation by critical care physicians for possible transfer to the intensive care unit, and difficulty weaning oxygen. For patients with SSTI, hand abscesses requiring surgical drainage and treatment failure with other antibiotics accounted for many of the IV antibiotic doses given to patients on enteral medications.
Antimicrobial costs per patient encounter decreased by 70% over one year; the shift in costs coincided with the second shift in our process measure (Appendix Figure 2A). Based on an estimate of 350 patients admitted per year for uncomplicated CAP or SSTI, this translates to an annual cost savings of approximately $29,000. The standard deviation of costs per patient encounter decreased by 84% (Appendix Figure 2B), suggesting a decrease in the variability of prescribing practices.
The average LOS in our patient population prior to intervention was 2.1 days and did not change (Appendix Figure 2C), but the standard deviation decreased by >50% (Appendix Figure 2D). There was no shift in the mean seven-day readmission rate or the number of encounters between readmissions (2.6% and 26, respectively; Appendix Figure 3). In addition, the hospital billing department did not identify an increase in insurance denials related to the route of antibiotic administration.
DISCUSSION
Summary
Using improvement science, we promoted earlier transition to enteral antibiotics for children hospitalized with uncomplicated CAP and SSTI by linking the decision for transition to the ability to take other enteral medications, rather than to discharge readiness. We increased the percentage of enterally administered antibiotic doses in this patient population from 44% to 80% in eight months. Although we did not observe the decrease in LOS previously noted in a cost analysis comparing pediatric patients with CAP treated with oral versus IV antibiotics,15 we did find decreases in LOS variability and in antimicrobial costs for our patients. These cost savings did not include potential savings from nursing or pharmacy labor. In addition, we noted a decrease in the variability of antibiotic prescribing practice, which demonstrates provider ability and willingness to couple the antibiotic route transition to an objective characteristic (administration of other enteral medications).
A strength of our study was that residents, the most frequent prescribers of antibiotics on our HM service, were highly involved in the QI initiative, including defining the SMART aim, identifying key drivers, developing interventions, and completing sequential PDSA cycles. Under the guidance of QI-trained coaches, residents developed feasible interventions and assessed their success in real time. Consistent with other studies,16,17 resident buy-in and involvement led to the success of our improvement study.
Interpretation
Despite emerging evidence regarding the timing of transition to enteral antibiotics, several factors impeded early transition at our institution, including physician culture, variable practice habits, and hospital workflow. Evidence supports the use of enteral antibiotics in immunocompetent children hospitalized for uncomplicated CAP who do not have chronic lung disease, are not in shock, and have oxygen saturations >85%.6 For pediatric patients admitted with SSTIs not involving the eye or bone, existing literature suggests that IV antibiotics may be transitioned once clinical improvement, evidenced by a reduction in fever or erythema, is noted;6 moreover, enteral antibiotics that achieve concentrations sufficient to attain pharmacodynamic targets should be as effective as IV antibiotics.9 By using the administration of any enteral medication to identify a patient’s readiness to transition, we were able to overcome practice variation among providers who may have differing opinions of what constitutes clinical improvement. Of note, evidence is emerging on predictors of enteral antibiotic treatment failure in patients with CAP and SSTI to guide transition timing, but these studies have largely focused on adults or were performed in the outpatient and emergency department (ED) settings.18,19 Regardless, the stable number of encounters between readmissions in our patient population suggests that treatment failure in these patients was rare.
Rising healthcare costs have led to concerns about the sustainability of the healthcare system;20,21 tackling overuse in clinical practice, as in our study, is one mitigation strategy. Several studies have used QI methods to facilitate high-value care by decreasing continuous monitor overuse and extraneous electrolyte ordering.22,23 Our QI study adds to the high-value care literature by safely decreasing the use of IV antibiotics. One retrospective study demonstrated that a one-day decrease in the use of IV antibiotics for pneumonia resulted in decreased costs without an increase in readmissions, similar to our findings.24 In adults, QI initiatives aimed at improving early transition of antibiotics have utilized electronic trigger tools.25,26 Fischer et al. used active orders for scheduled enteral medications or an enteral diet as an indication that a patient’s IV medications could be converted to enteral form.26
Our work is not without limitations. The list of ICD-9 and ICD-10 codes used to query the EHR did not capture all diagnoses that could be considered uncomplicated CAP or SSTI. However, we included an extensive list of diagnoses to ensure that the majority of patients meeting our inclusion criteria were captured. Our process measure did not account for patients on IV antibiotics who were not administered other enteral medications but were tolerating an enteral diet; these patients were not identified by our EHR query and were not counted as failures in our process measure. However, in later interventions, residents identified all patients on IV antibiotics, so even patients not captured by our EHR query benefited from our work. Furthermore, this QI study was conducted at a single institution, and several interventions took advantage of preexisting structured huddles and a resident QI curriculum, which may not exist at other institutions. Nonetheless, our study highlights that engaging frontline providers, such as residents, to consistently review antibiotic orders and question the appropriateness of the administration route is key to making incremental changes in prescribing practices.
CONCLUSIONS
Through a partnership between HM and Pharmacy, and with substantial resident involvement, we improved the transition from IV to enteral antibiotics in patients with CAP or SSTI, increasing the percentage of enterally administered antibiotic doses while reducing antimicrobial costs and variability in antibiotic prescribing practices. This work illustrates how reducing overuse of IV antibiotics promotes high-value care and aligns with initiatives to prevent avoidable harm.27 Our work highlights that standardized discussions about medication orders to build consensus around enteral antibiotic transitions, real-time feedback, and challenging the status quo can influence practice habits and effect change.
Next steps include testing automated methods, such as embedded clinical decision support, to notify providers of opportunities to transition from IV to enteral antibiotics, similar to the electronic trigger tools used in adult QI studies.25,26 Because our prerounds huddle already includes identifying all patients on IV antibiotics, studying the transition to enteral antibiotics and its effect on prescribing practices for other diagnoses (eg, urinary tract infection and osteomyelitis) may help spread these efforts. Partnering with our ED colleagues may be an important next step, as many patients admitted to HM on IV antibiotics receive their first dose in the ED.
Acknowledgments
The authors would like to thank the faculty of the James M. Anderson Center for Health Systems Excellence Intermediate Improvement Science Series for their guidance in the planning of this project. The authors would also like to thank Ms. Ursula Bradshaw and Mr. Michael Ponti-Zins for obtaining the hospital data on length of stay and readmissions. The authors acknowledge Dr. Philip Hagedorn for his assistance with the software that queries the electronic health record and Dr. Laura Brower and Dr. Joanna Thomson for their assistance with statistical analysis. The authors are grateful to all the residents and coaches on the QI Hospital Medicine team who contributed ideas on study design and interventions.
1. Dellit TH, Owens RC, McGowan JE, Jr, et al. Infectious Diseases Society of America and the Society for Healthcare Epidemiology of America guidelines for developing an institutional program to enhance antimicrobial stewardship. Clin Infect Dis. 2007;44(2):159-177. https://doi.org/10.1086/510393.
2. Shah SS, Srivastava R, Wu S, et al. Intravenous versus oral antibiotics for postdischarge treatment of complicated pneumonia. Pediatrics. 2016;138(6). https://doi.org/10.1542/peds.2016-1692.
3. Keren R, Shah SS, Srivastava R, et al. Comparative effectiveness of intravenous vs oral antibiotics for postdischarge treatment of acute osteomyelitis in children. JAMA Pediatr. 2015;169(2):120-128. https://doi.org/10.1001/jamapediatrics.2014.2822.
4. Jumani K, Advani S, Reich NG, Gosey L, Milstone AM. Risk factors for peripherally inserted central venous catheter complications in children. JAMA Pediatr. 2013;167(5):429-435. https://doi.org/10.1001/jamapediatrics.2013.775.
5. Zaoutis T, Localio AR, Leckerman K, et al. Prolonged intravenous therapy versus early transition to oral antimicrobial therapy for acute osteomyelitis in children. Pediatrics. 2009;123(2):636-642. https://doi.org/10.1542/peds.2008-0596.
6. McMullan BJ, Andresen D, Blyth CC, et al. Antibiotic duration and timing of the switch from intravenous to oral route for bacterial infections in children: systematic review and guidelines. Lancet Infect Dis. 2016;16(8):e139-e152. https://doi.org/10.1016/S1473-3099(16)30024-X.
7. Bradley JS, Byington CL, Shah SS, et al. The management of community-acquired pneumonia in infants and children older than 3 months of age: clinical practice guidelines by the Pediatric Infectious Diseases Society and the Infectious Diseases Society of America. Clin Infect Dis. 2011;53(7):e25-e76. https://doi.org/10.1093/cid/cir531.
8. Stevens DL, Bisno AL, Chambers HF, et al. Executive summary: practice guidelines for the diagnosis and management of skin and soft tissue infections: 2014 update by the Infectious Diseases Society of America. Clin Infect Dis. 2014;59(2):147-159. https://doi.org/10.1093/cid/ciu444.
9. MacGregor RR, Graziani AL. Oral administration of antibiotics: a rational alternative to the parenteral route. Clin Infect Dis. 1997;24(3):457-467. https://doi.org/10.1093/clinids/24.3.457.
10. Downes KJ, Hahn A, Wiles J, Courter JD, Vinks AA. Dose optimisation of antibiotics in children: application of pharmacokinetics/pharmacodynamics in paediatrics. Int J Antimicrob Agents. 2014;43(3):223-230. https://doi.org/10.1016/j.ijantimicag.2013.11.006.
11. Autmizguine J, Melloni C, Hornik CP, et al. Population pharmacokinetics of trimethoprim-sulfamethoxazole in infants and children. Antimicrob Agents Chemother. 2018;62(1):e01813-e01817. https://doi.org/10.1128/AAC.01813-17.
12. Dewan M, Herrmann LE, Tchou MJ, et al. Development and evaluation of high-value pediatrics: a high-value care pediatric resident curriculum. Hosp Pediatr. 2018;8(12):785-792. https://doi.org/10.1542/hpeds.2018-0115.
13. Langley GJ, Moen RD, Nolan KM, Nolan TW, Norman CL, Provost LP. The Improvement Guide: A Practical Approach to Enhancing Organizational Performance. New Jersey, US: John Wiley & Sons; 2009.
14. Benneyan JC. Use and interpretation of statistical quality control charts. Int J Qual Health Care. 1998;10(1):69-73. https://doi.org/10.1093/intqhc/10.1.69.
15. Lorgelly PK, Atkinson M, Lakhanpaul M, et al. Oral versus i.v. antibiotics for community-acquired pneumonia in children: a cost-minimisation analysis. Eur Respir J. 2010;35(4):858-864. https://doi.org/10.1183/09031936.00087209.
16. Vidyarthi AR, Green AL, Rosenbluth G, Baron RB. Engaging residents and fellows to improve institution-wide quality: the first six years of a novel financial incentive program. Acad Med. 2014;89(3):460-468. https://doi.org/10.1097/ACM.0000000000000159.
17. Stinnett-Donnelly JM, Stevens PG, Hood VL. Developing a high value care programme from the bottom up: a programme of faculty-resident improvement projects targeting harmful or unnecessary care. BMJ Qual Saf. 2016;25(11):901-908. https://doi.org/10.1136/bmjqs-2015-004546.
18. Peterson D, McLeod S, Woolfrey K, McRae A. Predictors of failure of empiric outpatient antibiotic therapy in emergency department patients with uncomplicated cellulitis. Acad Emerg Med. 2014;21(5):526-531. https://doi.org/10.1111/acem.12371.
19. Yadav K, Suh KN, Eagles D, et al. Predictors of oral antibiotic treatment failure for non-purulent skin and soft tissue infections in the emergency department. Acad Emerg Med. 2018;20(S1):S24-S25. https://doi.org/10.1017/cem.2018.114.
20. Organisation for Economic Co-operation and Development. Healthcare costs unsustainable in advanced economies without reform. http://www.oecd.org/health/healthcarecostsunsustainableinadvancedeconomieswithoutreform.htm. Published 2015. Accessed June 28, 2018.
21. Berwick DM, Hackbarth AD. Eliminating waste in US health care. JAMA. 2012;307(14):1513-1516. https://doi.org/10.1001/jama.2012.362.
22. Schondelmeyer AC, Simmons JM, Statile AM, et al. Using quality improvement to reduce continuous pulse oximetry use in children with wheezing. Pediatrics. 2015;135(4):e1044-e1051. https://doi.org/10.1542/peds.2014-2295.
23. Tchou MJ, Tang Girdwood S, Wormser B, et al. Reducing electrolyte testing in hospitalized children by using quality improvement methods. Pediatrics. 2018;141(5). https://doi.org/10.1542/peds.2017-3187.
24. Christensen EW, Spaulding AB, Pomputius WF, Grapentine SP. Effects of hospital practice patterns for antibiotic administration for pneumonia on hospital lengths of stay and costs. J Pediatr Infect Dis Soc. 2019;8(2):115-121. https://doi.org/10.1093/jpids/piy003.
25. Berrevoets MAH, Pot JHLW, Houterman AE, et al. An electronic trigger tool to optimise intravenous to oral antibiotic switch: a controlled, interrupted time series study. Antimicrob Resist Infect Control. 2017;6:81. https://doi.org/10.1186/s13756-017-0239-3.
26. Fischer MA, Solomon DH, Teich JM, Avorn J. Conversion from intravenous to oral medications: assessment of a computerized intervention for hospitalized patients. Arch Intern Med. 2003;163(21):2585-2589. https://doi.org/10.1001/archinte.163.21.2585.
27. Schroeder AR, Harris SJ, Newman TB. Safely doing less: a missing component of the patient safety dialogue. Pediatrics. 2011;128(6):e1596-e1597. https://doi.org/10.1542/peds.2011-2726.
Intravenous (IV) antibiotics are commonly used in hospitalized pediatric patients to treat bacterial infections. Antimicrobial stewardship guidelines published by the Infectious Diseases Society of America (IDSA) recommend institutions develop a systematic plan to convert from IV to enteral antibiotics, as early transition may reduce healthcare costs, decrease length of stay (LOS), and avoid prolonged IV access complications1 such as extravasation, thrombosis, and catheter-associated infections.2-5
Pediatric patients with community-acquired pneumonia (CAP) and mild skin and soft tissue infections (SSTI) may not require IV antibiotics, even if the patient is hospitalized.6 Although national guidelines for pediatric CAP and SSTI recommend IV antibiotics for hospitalized patients, these guidelines state that mild infections may be treated with enteral antibiotics and emphasize discontinuation of IV antibiotics when the patient meets discharge criteria.7,8 Furthermore, several enteral antibiotics used for the treatment of CAP and SSTI, such as cephalexin and clindamycin,9 have excellent bioavailability (>90%) or can achieve sufficient concentrations to attain the pharmacodynamic target (ie, amoxicillin and trimethoprim–sulfamethoxazole).10,11 Nonetheless, the guidelines do not explicitly outline criteria regarding the transition from IV to enteral antibiotics.7,8
At our institution, patients admitted to Hospital Medicine (HM) often remained on IV antibiotics until discharge. Data review revealed that antibiotic treatment of CAP and SSTI posed the greatest opportunity for early conversion to enteral therapy based on the high frequency of admissions and the ability of commonly used enteral antibiotics to attain pharmacodynamic targets. We sought to change practice culture by decoupling transition to enteral antibiotics from discharge and use administration of other enteral medications as an objective indicator for transition. Our aim was to increase the proportion of enterally administered antibiotic doses for HM patients aged >60 days admitted with uncomplicated CAP or SSTI from 44% to 75% in eight months.
METHODS
Context
Cincinnati Children’s Hospital Medical Center (CCHMC) is a large, urban, academic hospital. The HM division has 45 attendings and admits >8,000 general pediatric patients annually. The five HM teams at the main campus consist of attendings, fellows, residents, and medical students. One HM team serves as the resident quality improvement (QI) team where residents collaborate in a longitudinal study under the guidance of QI-trained coaches. The focus of this QI initiative was determined by resident consensus and aligned with a high-value care curriculum.12
To identify the target patient population, we investigated IV antimicrobials frequently used in HM patients. Ampicillin and clindamycin are commonly used IV antibiotics, most frequently corresponding with the diagnoses of CAP and SSTI, respectively, accounting for half of all antibiotic use on the HM service. Amoxicillin, the enteral equivalent of ampicillin, can achieve sufficient concentrations to attain the pharmacodynamic target at infection sites, and clindamycin has high bioavailability, making them ideal options for early transition. Our institution’s robust antimicrobial stewardship program has published local guidelines on using amoxicillin as the enteral antibiotic of choice for uncomplicated CAP, but it does not provide guidance on the timing of transition for either CAP or SSTI; the clinical team makes this decision.
HM attendings were surveyed to determine the criteria used to transition from IV to enteral antibiotics for patients with CAP or SSTI. The survey illustrated practice variability with providers using differing clinical criteria to signal the timing of transition. Additionally, only 49% of respondents (n = 37) rated themselves as “very comfortable” with residents making autonomous decisions to transition to enteral antibiotics. We chose to use the administration of other enteral medications, instead of discharge readiness, as an objective indicator of a patient’s readiness to transition to enteral antibiotics, given the low-risk patient population and the ability of the enteral antibiotics commonly used for CAP and SSTI to achieve pharmacodynamic targets.
The study population included patients aged >60 days admitted to HM with CAP or SSTI treated with any antibiotic. We excluded patients with potential complications or significant progression of their disease process, including patients with parapneumonic effusions or chest tubes, patients who underwent bronchoscopy, and patients with osteomyelitis, septic arthritis, or preseptal or orbital cellulitis. Past medical history and clinical status on admission were not used to exclude patients.
Interventions
Our multidisciplinary team, formed in January 2017, included HM attendings, HM fellows, pediatric residents, a critical care attending, a pharmacy resident, and an antimicrobial stewardship pharmacist. Under the guidance of QI coaches, the residents on the HM QI team developed and tested all interventions on their team and then determined which interventions would spread to the other four teams. The nursing director of our primary HM unit disseminated project updates to bedside nurses. A simplified failure mode and effects analysis identified areas for improvement and potential interventions. Interventions focused on the following key drivers (Figure 1): increased prescriber awareness of medication charge, standardization of conversion from IV to enteral antibiotics, clear definition of the patients ready for transition, ongoing evaluation of the antimicrobial plan, timely recognition by prescribers of patients ready for transition, culture shift regarding the appropriate administration route in the inpatient setting, and transparency of data. The team implemented sequential Plan-Do-Study-Act (PDSA) cycles13 to test the interventions.
Charge Table
To improve knowledge about the increased charge for commonly used IV medications compared with enteral formulations, a table comparing relative charges was shared during monthly resident morning conferences and at an HM faculty meeting. The table included charge comparisons between ampicillin and amoxicillin and IV and enteral clindamycin.
Standardized Language in Electronic Health Record (EHR) Antibiotic Plan on Rounds
Standardized language to document antibiotic transition plans was added to admission and progress note templates in the EHR. The standard template prompted residents to (1) define clinical transition criteria, (2) discuss attending comfort with transition overnight (based on survey results), and (3) document patient preference of solid or liquid dosage forms. Plans were reviewed and updated daily. We hypothesized that since residents use the information in the daily progress notes, including assessments and plans, to present on rounds, inclusion of the transition criteria in the note would prompt transition plan discussions.
Communication Bundle
To promote early transition to enteral antibiotics, we standardized the discussion about antibiotic transition between residents and attendings. During a weekly preexisting meeting, the resident QI team reviewed preferences for transitions with the new service attending. By identifying attending preferences early, residents were able to proactively transition patients who met the criteria (eg, antibiotic transition in the evening instead of waiting until morning rounds). This discussion also provided an opportunity to engage service attendings in the QI efforts, which were also shared at HM faculty meetings quarterly.
Recognizing that in times of high census, discussion of patient plans may be abbreviated during rounds, residents were asked to identify all patients on IV antibiotics while reviewing patient medication orders prior to rounds. As part of an existing daily prerounds huddle to discuss rounding logistics, residents listed all patients on IV antibiotics and discussed which patients were ready for transition. If patients could not be transitioned immediately, the team identified the transition criteria.
At preexisting evening huddles between overnight shift HM residents and the evening HM attending, residents identified patients who were prescribed IV antibiotics and discussed readiness for enteral transition. If a patient could be transitioned overnight, enteral antibiotic orders were placed. Overnight residents were also encouraged to review the transition criteria with families upon admission.
Real-time Identification of Failures and Feedback
For two weeks, the EHR was queried daily to identify patients admitted for uncomplicated CAP and SSTI who were on antibiotics as well as other enteral medications. A failure was defined as an IV antibiotic dose given to a patient who was administered any enteral medication. Residents on the QI team approached residents on other HM teams whenever patients were identified as a failed transition to learn about failure reasons.
Study of the Interventions
Data for HM patients who met the inclusion criteria were collected weekly from January 2016 through June 2018 via EHR query. We initially searched for diagnoses that fit under the disease categories of pneumonia and SSTI in the EHR, which generated a list of International Classification of Disease-9 and -10 Diagnosis codes (Appendix Figure 1). The query identified patients based on these codes and reported whether the identified patients took a dose of any enteral medication, excluding nystatin, sildenafil, tacrolimus, and mouthwashes, which are commonly continued during NPO status due to no need for absorption or limited parenteral options. It also reported the ordered route of administration for the queried antibiotics (Appendix Figure 1).
The 2016 calendar year established our baseline to account for seasonal variability. Data were reported weekly and reviewed to evaluate the impact of PDSA cycles and inform new interventions.
Measures
Our process measure was the total number of enteral antibiotic doses divided by all antibiotic doses in patients receiving any enteral medication. We reasoned that if patients were well enough to take medications enterally, they could be given an enteral antibiotic that is highly bioavailable or readily achieves concentrations that attain pharmacodynamic targets. This practice change was a culture shift, decoupling the switch to enteral antibiotics from discharge readiness. Our EHR query reported only the antibiotic doses given to patients who took an enteral medication on the day of antibiotic administration and excluded patients who received only IV medications.
Outcome measures included antimicrobial costs per patient encounter using average wholesale prices, which were reported in our EHR query, and LOS. To ensure that transitions of IV to enteral antibiotics were not negatively impacting patient outcomes, patient readmissions within seven days served as a balancing measure.
Analysis
An annotated statistical process control p-chart tracked the impact of interventions on the proportion of antibiotic doses that were enterally administered during hospitalization. An x-bar and an s-chart tracked the impact of interventions on antimicrobial costs per patient encounter and on LOS. A p-chart and an encounters-between g-chart were used to evaluate the impact of our interventions on readmissions. Control chart rules for identifying special cause were used for center line shifts.14
Ethical Considerations
This study was part of a larger study of the residency high-value care curriculum,12 which was deemed exempt by the CCHMC IRB.
RESULTS
The baseline data collected included 372 patients and the postintervention period in 2017 included 326 patients (Table). Approximately two-thirds of patients had a diagnosis of CAP.
The percentage of antibiotic doses given enterally increased from 44% to 80% within eight months (Figure 2). When studying the impact of interventions, residents on the HM QI team found that the standard EHR template added to daily notes did not consistently prompt residents to discuss antibiotic plans and thus was abandoned. Initial improvement coincided with standardizing discussions between residents and attendings regarding transitions. Furthermore, discussion of all patients on IV antibiotics during the prerounds huddle allowed for reliable, daily communication about antibiotic plans and was subsequently spread to and adopted by all HM teams. The percentage of enterally administered antibiotic doses increased to >75% after the evening huddle, which involved all HM teams, and real-time identification of failures on all HM teams with provider feedback. Despite variability when the total number of antibiotic doses prescribed per week was low (<10), we demonstrated sustainability for 11 months (Figure 2), during which the prerounds and evening huddle discussions were continued and an updated control chart was shown monthly to residents during their educational conferences.
Residents on the QI team spoke directly with other HM residents when there were missed opportunities for transition. Based on these discussions and intermittent chart reviews, common reasons for failure to transition in patients with CAP included admission for failed outpatient enteral treatment, recent evaluation by critical care physicians for possible transfer to the intensive care unit, and difficulty weaning oxygen. For patients with SSTI, hand abscesses requiring drainage by surgery and treatment failure with other antibiotics constituted many of the IV antibiotic doses given to patients on enteral medications.
Antimicrobial costs per patient encounter decreased by 70% over one year; the shift in costs coincided with the second shift in our process measure (Appendix Figure 2A). Based on an estimate of 350 patients admitted per year for uncomplicated CAP or SSTI, this translates to an annual cost savings of approximately $29,000. The standard deviation of costs per patient encounter decreased by 84% (Appendix Figure 2B), suggesting a decrease in the variability of prescribing practices.
The average LOS in our patient population prior to intervention was 2.1 days and did not change (Appendix Figure 2C), but the standard deviation decreased by >50% (Appendix Figure 2D). There was no shift in the mean seven-day readmission rate or the number of encounters between readmissions (2.6% and 26, respectively; Appendix Figure 3). In addition, the hospital billing department did not identify an increase in insurance denials related to the route of antibiotic administration.
DISCUSSION
Summary
Using improvement science, we promoted earlier transition to enteral antibiotics for children hospitalized with uncomplicated CAP and SSTI by linking the decision for transition to the ability to take other enteral medications, rather than to discharge readiness. We increased the percentage of enterally administered antibiotic doses in this patient population from 44% to 80% in eight months. Although we did not observe a decrease in LOS as previously noted in a cost analysis study comparing pediatric patients with CAP treated with oral antibiotics versus those treated with IV antibiotics,15 we did find a decrease in LOS variability and in antimicrobial costs to our patients. These cost savings did not include potential savings from nursing or pharmacy labor. In addition, we noted a decrease in the variability in antibiotic prescribing practice, which demonstrates provider ability and willingness to couple antibiotic route transition to an objective characteristic (administration of other enteral medications).
A strength of our study was that residents, the most frequent prescribers of antibiotics on our HM service, were highly involved in the QI initiative, including defining the SMART aim, identifying key drivers, developing interventions, and completing sequential PDSA cycles. Under the guidance of QI-trained coaches, residents developed feasible interventions and assessed their success in real time. Consistent with other studies,16,17 resident buy-in and involvement led to the success of our improvement study.
Interpretation
Despite emerging evidence regarding the timing of transition to enteral antibiotics, several factors impeded early transition at our institution, including physician culture, variable practice habits, and hospital workflow. Evidence supports the use of enteral antibiotics in immunocompetent children hospitalized for uncomplicated CAP who do not have chronic lung disease, are not in shock, and have oxygen saturations >85%.6 Although existing literature suggests that in pediatric patients admitted for SSTIs not involving the eye or bone, IV antibiotics may be transitioned when clinical improvement, evidenced by a reduction in fever or erythema, is noted,6 enteral antibiotics that achieve appropriate concentrations to attain pharmacodynamic targets should have the same efficacy as that of IV antibiotics.9 Using the criterion of administration of any medication enterally to identify a patient’s readiness to transition, we were able to overcome practice variation among providers who may have differing opinions of what constitutes clinical improvement. Of note, new evidence is emerging on predictors of enteral antibiotic treatment failure in patients with CAP and SSTI to guide transition timing, but these studies have largely focused on the adult population or were performed in the outpatient and emergency department (ED) settings.18,19 Regardless, the stable number of encounters between readmissions in our patient population likely indicates that treatment failure in these patients was rare.
Rising healthcare costs have led to concerns around sustainability of the healthcare system;20,21 tackling overuse in clinical practice, as in our study, is one mitigation strategy. Several studies have used QI methods to facilitate the provision of high-value care through the decrease of continuous monitor overuse and extraneous ordering of electrolytes.22,23 Our QI study adds to the high-value care literature by safely decreasing the use of IV antibiotics. One retrospective study demonstrated that a one-day decrease in the use of IV antibiotics in pneumonia resulted in decreased costs without an increase in readmissions, similar to our findings.24 In adults, QI initiatives aimed at improving early transition of antibiotics utilized electronic trigger tools.25,26 Fischer et al. used active orders for scheduled enteral medications or an enteral diet as indication that a patient’s IV medications could be converted to enteral form.26
Our work is not without limitations. The list of ICD-9 and -10 codes used to query the EHR did not capture all diagnoses that would be considered as uncomplicated CAP or SSTI. However, we included an extensive list of diagnoses to ensure that the majority of patients meeting our inclusion criteria were captured. Our process measure did not account for patients on IV antibiotics who were not administered other enteral medications but tolerating an enteral diet. These patients were not identified in our EHR query and were not included in our process measure as a failure. However, in latter interventions, residents identified all patients on IV antibiotics, so that patients not identified by our EHR query benefited from our work. Furthermore, this QI study was conducted at a single institution and several interventions took advantage of preexisting structured huddles and a resident QI curriculum, which may not exist at other institutions. Our study does highlight that engaging frontline providers, such as residents, to review antibiotic orders consistently and question the appropriateness of the administration route is key to making incremental changes in prescribing practices.
CONCLUSIONS
Through a partnership between HM and Pharmacy and with substantial resident involvement, we improved the transition of IV antibiotics in patients with CAP or SSTI by increasing the percentage of enterally administered antibiotic doses and reducing antimicrobial costs and variability in antibiotic prescribing practices. This work illustrates how reducing overuse of IV antibiotics promotes high-value care and aligns with initiatives to prevent avoidable harm.27 Our work highlights that standardized discussions about medication orders to create consensus around enteral antibiotic transitions, real-time feedback, and challenging the status quo can influence practice habits and effect change.
Next steps include testing automated methods to notify providers of opportunities for transition from IV to enteral antibiotics through embedded clinical decision support, a method similar to the electronic trigger tools used in adult QI studies.25,26 Since our prerounds huddle includes identifying all patients on IV antibiotics, studying the transition to enteral antibiotics and its effect on prescribing practices in other diagnoses (ie, urinary tract infection and osteomyelitis) may contribute to spreading these efforts. Partnering with our ED colleagues may be an important next step, as several patients admitted to HM on IV antibiotics are given their first dose in the ED.
Acknowledgments
The authors would like to thank the faculty of the James M. Anderson Center for Health Systems Excellence Intermediate Improvement Science Series for their guidance in the planning of this project. The authors would also like to thank Ms. Ursula Bradshaw and Mr. Michael Ponti-Zins for obtaining the hospital data on length of stay and readmissions. The authors acknowledge Dr. Philip Hagedorn for his assistance with the software that queries the electronic health record and Dr. Laura Brower and Dr. Joanna Thomson for their assistance with statistical analysis. The authors are grateful to all the residents and coaches on the QI Hospital Medicine team who contributed ideas on study design and interventions.
Intravenous (IV) antibiotics are commonly used in hospitalized pediatric patients to treat bacterial infections. Antimicrobial stewardship guidelines published by the Infectious Diseases Society of America (IDSA) recommend institutions develop a systematic plan to convert from IV to enteral antibiotics, as early transition may reduce healthcare costs, decrease length of stay (LOS), and avoid prolonged IV access complications1 such as extravasation, thrombosis, and catheter-associated infections.2-5
Pediatric patients with community-acquired pneumonia (CAP) and mild skin and soft tissue infections (SSTI) may not require IV antibiotics, even if the patient is hospitalized.6 Although national guidelines for pediatric CAP and SSTI recommend IV antibiotics for hospitalized patients, these guidelines state that mild infections may be treated with enteral antibiotics and emphasize discontinuation of IV antibiotics when the patient meets discharge criteria.7,8 Furthermore, several enteral antibiotics used for the treatment of CAP and SSTI, such as cephalexin and clindamycin,9 have excellent bioavailability (>90%) or can achieve sufficient concentrations to attain the pharmacodynamic target (ie, amoxicillin and trimethoprim–sulfamethoxazole).10,11 Nonetheless, the guidelines do not explicitly outline criteria regarding the transition from IV to enteral antibiotics.7,8
At our institution, patients admitted to Hospital Medicine (HM) often remained on IV antibiotics until discharge. Data review revealed that antibiotic treatment of CAP and SSTI posed the greatest opportunity for early conversion to enteral therapy based on the high frequency of admissions and the ability of commonly used enteral antibiotics to attain pharmacodynamic targets. We sought to change practice culture by decoupling transition to enteral antibiotics from discharge and use administration of other enteral medications as an objective indicator for transition. Our aim was to increase the proportion of enterally administered antibiotic doses for HM patients aged >60 days admitted with uncomplicated CAP or SSTI from 44% to 75% in eight months.
METHODS
Context
Cincinnati Children’s Hospital Medical Center (CCHMC) is a large, urban, academic hospital. The HM division has 45 attendings and admits >8,000 general pediatric patients annually. The five HM teams at the main campus consist of attendings, fellows, residents, and medical students. One HM team serves as the resident quality improvement (QI) team where residents collaborate in a longitudinal study under the guidance of QI-trained coaches. The focus of this QI initiative was determined by resident consensus and aligned with a high-value care curriculum.12
To identify the target patient population, we investigated IV antimicrobials frequently used in HM patients. Ampicillin and clindamycin are commonly used IV antibiotics, most frequently corresponding with the diagnoses of CAP and SSTI, respectively, accounting for half of all antibiotic use on the HM service. Amoxicillin, the enteral equivalent of ampicillin, can achieve sufficient concentrations to attain the pharmacodynamic target at infection sites, and clindamycin has high bioavailability, making them ideal options for early transition. Our institution’s robust antimicrobial stewardship program has published local guidelines on using amoxicillin as the enteral antibiotic of choice for uncomplicated CAP, but it does not provide guidance on the timing of transition for either CAP or SSTI; the clinical team makes this decision.
HM attendings were surveyed to determine the criteria used to transition from IV to enteral antibiotics for patients with CAP or SSTI. The survey illustrated practice variability with providers using differing clinical criteria to signal the timing of transition. Additionally, only 49% of respondents (n = 37) rated themselves as “very comfortable” with residents making autonomous decisions to transition to enteral antibiotics. We chose to use the administration of other enteral medications, instead of discharge readiness, as an objective indicator of a patient’s readiness to transition to enteral antibiotics, given the low-risk patient population and the ability of the enteral antibiotics commonly used for CAP and SSTI to achieve pharmacodynamic targets.
The study population included patients aged >60 days admitted to HM with CAP or SSTI treated with any antibiotic. We excluded patients with potential complications or significant progression of their disease process, including patients with parapneumonic effusions or chest tubes, patients who underwent bronchoscopy, and patients with osteomyelitis, septic arthritis, or preseptal or orbital cellulitis. Past medical history and clinical status on admission were not used to exclude patients.
Interventions
Our multidisciplinary team, formed in January 2017, included HM attendings, HM fellows, pediatric residents, a critical care attending, a pharmacy resident, and an antimicrobial stewardship pharmacist. Under the guidance of QI coaches, the residents on the HM QI team developed and tested all interventions on their team and then determined which interventions would spread to the other four teams. The nursing director of our primary HM unit disseminated project updates to bedside nurses. A simplified failure mode and effects analysis identified areas for improvement and potential interventions. Interventions focused on the following key drivers (Figure 1): increased prescriber awareness of medication charge, standardization of conversion from IV to enteral antibiotics, clear definition of the patients ready for transition, ongoing evaluation of the antimicrobial plan, timely recognition by prescribers of patients ready for transition, culture shift regarding the appropriate administration route in the inpatient setting, and transparency of data. The team implemented sequential Plan-Do-Study-Act (PDSA) cycles13 to test the interventions.
Charge Table
To improve knowledge about the increased charge for commonly used IV medications compared with enteral formulations, a table comparing relative charges was shared during monthly resident morning conferences and at an HM faculty meeting. The table included charge comparisons between ampicillin and amoxicillin and IV and enteral clindamycin.
Standardized Language in Electronic Health Record (EHR) Antibiotic Plan on Rounds
Standardized language to document antibiotic transition plans was added to admission and progress note templates in the EHR. The standard template prompted residents to (1) define clinical transition criteria, (2) discuss attending comfort with transition overnight (based on survey results), and (3) document patient preference of solid or liquid dosage forms. Plans were reviewed and updated daily. We hypothesized that since residents use the information in the daily progress notes, including assessments and plans, to present on rounds, inclusion of the transition criteria in the note would prompt transition plan discussions.
Communication Bundle
To promote early transition to enteral antibiotics, we standardized the discussion about antibiotic transition between residents and attendings. During a weekly preexisting meeting, the resident QI team reviewed preferences for transitions with the new service attending. By identifying attending preferences early, residents were able to proactively transition patients who met the criteria (eg, antibiotic transition in the evening instead of waiting until morning rounds). This discussion also provided an opportunity to engage service attendings in the QI efforts, which were also shared quarterly at HM faculty meetings.
Recognizing that in times of high census, discussion of patient plans may be abbreviated during rounds, residents were asked to identify all patients on IV antibiotics while reviewing patient medication orders prior to rounds. As part of an existing daily prerounds huddle to discuss rounding logistics, residents listed all patients on IV antibiotics and discussed which patients were ready for transition. If patients could not be transitioned immediately, the team identified the transition criteria.
At preexisting evening huddles between overnight shift HM residents and the evening HM attending, residents identified patients who were prescribed IV antibiotics and discussed readiness for enteral transition. If a patient could be transitioned overnight, enteral antibiotic orders were placed. Overnight residents were also encouraged to review the transition criteria with families upon admission.
Real-time Identification of Failures and Feedback
For two weeks, the EHR was queried daily to identify patients admitted for uncomplicated CAP and SSTI who were on antibiotics as well as other enteral medications. A failure was defined as an IV antibiotic dose given to a patient who was administered any enteral medication. Residents on the QI team approached residents on other HM teams whenever patients were identified as a failed transition to learn about failure reasons.
Study of the Interventions
Data for HM patients who met the inclusion criteria were collected weekly from January 2016 through June 2018 via EHR query. We initially searched the EHR for diagnoses that fit under the disease categories of pneumonia and SSTI, which generated a list of International Classification of Diseases, Ninth and Tenth Revision (ICD-9 and ICD-10) diagnosis codes (Appendix Figure 1). The query identified patients based on these codes and reported whether the identified patients took a dose of any enteral medication, excluding nystatin, sildenafil, tacrolimus, and mouthwashes, which are commonly continued while patients are nil per os (NPO) because they do not require systemic absorption or have limited parenteral alternatives. It also reported the ordered route of administration for the queried antibiotics (Appendix Figure 1).
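To make the query logic concrete, the following is a minimal sketch, in Python with pandas, of how a weekly extract like the one described above might be filtered. The column names (patient_id, dx_code, med_name, route, is_antibiotic, admin_date), the abbreviated code list, and the data layout are illustrative assumptions, not the institution's actual query.

```python
import pandas as pd

# Illustrative subset only; the full ICD-9/ICD-10 list is in Appendix Figure 1.
CAP_SSTI_CODES = {"486", "J18.9", "682.9", "L03.90"}
# Enteral medications excluded from the "any enteral medication" definition.
EXCLUDED_ENTERAL = {"nystatin", "sildenafil", "tacrolimus", "mouthwash"}

def flag_antibiotic_doses(med_admins: pd.DataFrame, diagnoses: pd.DataFrame) -> pd.DataFrame:
    """Return antibiotic doses for included patients, each flagged with whether the
    patient also received another qualifying enteral medication that calendar day."""
    included = diagnoses.loc[diagnoses["dx_code"].isin(CAP_SSTI_CODES), "patient_id"]
    admins = med_admins[med_admins["patient_id"].isin(included)].copy()

    other_enteral = (
        admins[(admins["route"] == "enteral")
               & ~admins["is_antibiotic"]
               & ~admins["med_name"].str.lower().isin(EXCLUDED_ENTERAL)]
        [["patient_id", "admin_date"]]
        .drop_duplicates()
        .assign(any_enteral_med=True)
    )

    antibiotics = admins[admins["is_antibiotic"]]
    flagged = antibiotics.merge(other_enteral, on=["patient_id", "admin_date"], how="left")
    flagged["any_enteral_med"] = flagged["any_enteral_med"].fillna(False)
    return flagged
```

The resulting frame pairs each antibiotic dose with a flag indicating whether the patient received another qualifying enteral medication on the same day, mirroring the report described above.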
The 2016 calendar year established our baseline to account for seasonal variability. Data were reported weekly and reviewed to evaluate the impact of PDSA cycles and inform new interventions.
Measures
Our process measure was the total number of enteral antibiotic doses divided by all antibiotic doses in patients receiving any enteral medication. We reasoned that if patients were well enough to take medications enterally, they could be given an enteral antibiotic that is highly bioavailable or readily achieves concentrations that attain pharmacodynamic targets. This practice change was a culture shift, decoupling the switch to enteral antibiotics from discharge readiness. Our EHR query reported only the antibiotic doses given to patients who took an enteral medication on the day of antibiotic administration and excluded patients who received only IV medications.
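Continuing the hypothetical frame from the sketch above, the weekly process measure could be computed as follows; again, this is an illustration under assumed column names rather than the study's actual code.

```python
import pandas as pd

def weekly_process_measure(flagged_doses: pd.DataFrame) -> pd.DataFrame:
    """Weekly proportion of antibiotic doses given enterally, restricted (as in the text)
    to doses given to patients who also received another enteral medication that day."""
    eligible = flagged_doses[flagged_doses["any_enteral_med"]].copy()
    eligible["week"] = eligible["admin_date"].dt.to_period("W")  # assumes datetime dtype
    grouped = eligible.groupby("week")
    return pd.DataFrame({
        "total_doses": grouped.size(),
        "pct_enteral": grouped["route"].apply(lambda r: (r == "enteral").mean() * 100),
    })
```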
Outcome measures included antimicrobial costs per patient encounter using average wholesale prices, which were reported in our EHR query, and LOS. To ensure that transitions of IV to enteral antibiotics were not negatively impacting patient outcomes, patient readmissions within seven days served as a balancing measure.
Analysis
An annotated statistical process control p-chart tracked the impact of interventions on the proportion of antibiotic doses that were enterally administered during hospitalization. An x-bar chart and an s-chart tracked the impact of interventions on antimicrobial costs per patient encounter and on LOS. A p-chart and an encounters-between g-chart were used to evaluate the impact of our interventions on readmissions. Standard control chart rules for identifying special cause variation were used to determine centerline shifts.14
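For reference, the control limits of a p-chart take the standard textbook form shown below, where $\bar{p}$ is the centerline proportion of enterally administered doses and $n_i$ is the number of antibiotic doses in week $i$; this is the general formula, not a reproduction of the study's charting software.

$$\mathrm{CL} = \bar{p}, \qquad \mathrm{UCL}_i,\ \mathrm{LCL}_i = \bar{p} \pm 3\sqrt{\frac{\bar{p}\,(1-\bar{p})}{n_i}}$$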
Ethical Considerations
This study was part of a larger study of the residency high-value care curriculum,12 which was deemed exempt by the CCHMC IRB.
RESULTS
The baseline period included 372 patients, and the postintervention period in 2017 included 326 patients (Table). Approximately two-thirds of patients had a diagnosis of CAP.
The percentage of antibiotic doses given enterally increased from 44% to 80% within eight months (Figure 2). When studying the impact of interventions, residents on the HM QI team found that the standard EHR template added to daily notes did not consistently prompt residents to discuss antibiotic plans and thus was abandoned. Initial improvement coincided with standardizing discussions between residents and attendings regarding transitions. Furthermore, discussion of all patients on IV antibiotics during the prerounds huddle allowed for reliable, daily communication about antibiotic plans and was subsequently spread to and adopted by all HM teams. The percentage of enterally administered antibiotic doses rose above 75% after the evening huddle and real-time identification of failures with provider feedback were extended to all HM teams. Despite variability when the total number of antibiotic doses prescribed per week was low (<10), we demonstrated sustainability for 11 months (Figure 2), during which the prerounds and evening huddle discussions were continued and an updated control chart was shown monthly to residents during their educational conferences.
Residents on the QI team spoke directly with other HM residents when there were missed opportunities for transition. Based on these discussions and intermittent chart reviews, common reasons for failure to transition in patients with CAP included admission for failed outpatient enteral treatment, recent evaluation by critical care physicians for possible transfer to the intensive care unit, and difficulty weaning oxygen. For patients with SSTI, hand abscesses requiring drainage by surgery and treatment failure with other antibiotics constituted many of the IV antibiotic doses given to patients on enteral medications.
Antimicrobial costs per patient encounter decreased by 70% over one year; the shift in costs coincided with the second shift in our process measure (Appendix Figure 2A). Based on an estimate of 350 patients admitted per year for uncomplicated CAP or SSTI, this translates to an annual cost savings of approximately $29,000. The standard deviation of costs per patient encounter decreased by 84% (Appendix Figure 2B), suggesting a decrease in the variability of prescribing practices.
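As a rough check on these figures (an editorial illustration derived from the numbers above, not an analysis from the study), the reported annual savings imply a per-encounter reduction of approximately

$$\frac{\$29{,}000\ \text{per year}}{350\ \text{encounters per year}} \approx \$83\ \text{per encounter}.$$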
The average LOS in our patient population prior to intervention was 2.1 days and did not change (Appendix Figure 2C), but the standard deviation decreased by >50% (Appendix Figure 2D). There was no shift in the mean seven-day readmission rate or the number of encounters between readmissions (2.6% and 26, respectively; Appendix Figure 3). In addition, the hospital billing department did not identify an increase in insurance denials related to the route of antibiotic administration.
DISCUSSION
Summary
Using improvement science, we promoted earlier transition to enteral antibiotics for children hospitalized with uncomplicated CAP and SSTI by linking the decision for transition to the ability to take other enteral medications, rather than to discharge readiness. We increased the percentage of enterally administered antibiotic doses in this patient population from 44% to 80% in eight months. Although we did not observe a decrease in LOS as previously noted in a cost analysis study comparing pediatric patients with CAP treated with oral antibiotics versus those treated with IV antibiotics,15 we did find a decrease in LOS variability and in antimicrobial costs to our patients. These cost savings did not include potential savings from nursing or pharmacy labor. In addition, we noted a decrease in the variability in antibiotic prescribing practice, which demonstrates provider ability and willingness to couple antibiotic route transition to an objective characteristic (administration of other enteral medications).
A strength of our study was that residents, the most frequent prescribers of antibiotics on our HM service, were highly involved in the QI initiative, including defining the SMART aim, identifying key drivers, developing interventions, and completing sequential PDSA cycles. Under the guidance of QI-trained coaches, residents developed feasible interventions and assessed their success in real time. Consistent with other studies,16,17 resident buy-in and involvement led to the success of our improvement study.
Interpretation
Despite emerging evidence regarding the timing of transition to enteral antibiotics, several factors impeded early transition at our institution, including physician culture, variable practice habits, and hospital workflow. Evidence supports the use of enteral antibiotics in immunocompetent children hospitalized for uncomplicated CAP who do not have chronic lung disease, are not in shock, and have oxygen saturations >85%.6 Although existing literature suggests that, in pediatric patients admitted for SSTIs not involving the eye or bone, IV antibiotics may be transitioned once clinical improvement (eg, a reduction in fever or erythema) is noted,6 enteral antibiotics that achieve concentrations sufficient to attain pharmacodynamic targets should be as effective as IV antibiotics.9 Using the criterion of administration of any medication enterally to identify a patient’s readiness to transition, we were able to overcome practice variation among providers who may have differing opinions of what constitutes clinical improvement. Of note, new evidence is emerging on predictors of enteral antibiotic treatment failure in patients with CAP and SSTI to guide transition timing, but these studies have largely focused on the adult population or were performed in the outpatient and emergency department (ED) settings.18,19 Regardless, the stable number of encounters between readmissions in our patient population likely indicates that treatment failure in these patients was rare.
Rising healthcare costs have led to concerns around sustainability of the healthcare system;20,21 tackling overuse in clinical practice, as in our study, is one mitigation strategy. Several studies have used QI methods to facilitate the provision of high-value care by decreasing continuous monitor overuse and extraneous electrolyte ordering.22,23 Our QI study adds to the high-value care literature by safely decreasing the use of IV antibiotics. One retrospective study demonstrated that a one-day decrease in the use of IV antibiotics in pneumonia resulted in decreased costs without an increase in readmissions, similar to our findings.24 In adults, QI initiatives aimed at improving early transition of antibiotics utilized electronic trigger tools.25,26 Fischer et al. used active orders for scheduled enteral medications or an enteral diet as an indication that a patient’s IV medications could be converted to enteral form.26
Our work is not without limitations. The list of ICD-9 and ICD-10 codes used to query the EHR did not capture all diagnoses that would be considered uncomplicated CAP or SSTI. However, we included an extensive list of diagnoses to ensure that the majority of patients meeting our inclusion criteria were captured. Our process measure did not account for patients on IV antibiotics who were not administered other enteral medications but were tolerating an enteral diet. These patients were not identified in our EHR query and were not counted as failures in our process measure. However, with later interventions, residents identified all patients on IV antibiotics, so patients not identified by our EHR query also benefited from our work. Furthermore, this QI study was conducted at a single institution, and several interventions took advantage of preexisting structured huddles and a resident QI curriculum, which may not exist at other institutions. Our study does highlight that engaging frontline providers, such as residents, to consistently review antibiotic orders and question the appropriateness of the administration route is key to making incremental changes in prescribing practices.
CONCLUSIONS
Through a partnership between HM and Pharmacy and with substantial resident involvement, we improved the transition from IV to enteral antibiotics in patients with CAP or SSTI by increasing the percentage of enterally administered antibiotic doses and reducing antimicrobial costs and variability in antibiotic prescribing practices. This work illustrates how reducing overuse of IV antibiotics promotes high-value care and aligns with initiatives to prevent avoidable harm.27 Our work highlights that standardized discussions about medication orders to create consensus around enteral antibiotic transitions, real-time feedback, and challenging the status quo can influence practice habits and effect change.
Next steps include testing automated methods to notify providers of opportunities for transition from IV to enteral antibiotics through embedded clinical decision support, a method similar to the electronic trigger tools used in adult QI studies.25,26 Since our prerounds huddle includes identifying all patients on IV antibiotics, studying the transition to enteral antibiotics and its effect on prescribing practices in other diagnoses (eg, urinary tract infection and osteomyelitis) may contribute to spreading these efforts. Partnering with our ED colleagues may be an important next step, as several patients admitted to HM on IV antibiotics are given their first dose in the ED.
Acknowledgments
The authors would like to thank the faculty of the James M. Anderson Center for Health Systems Excellence Intermediate Improvement Science Series for their guidance in the planning of this project. The authors would also like to thank Ms. Ursula Bradshaw and Mr. Michael Ponti-Zins for obtaining the hospital data on length of stay and readmissions. The authors acknowledge Dr. Philip Hagedorn for his assistance with the software that queries the electronic health record and Dr. Laura Brower and Dr. Joanna Thomson for their assistance with statistical analysis. The authors are grateful to all the residents and coaches on the QI Hospital Medicine team who contributed ideas on study design and interventions.
1. Dellit TH, Owens RC, McGowan JE, Jr, et al. Infectious diseases society of America and the society for healthcare epidemiology of America guidelines for developing an institutional program to enhance antimicrobial stewardship. Clin Infect Dis. 2007;44(2):159-177. https://doi.org/10.1086/510393.
2. Shah SS, Srivastava R, Wu S, et al. Intravenous versus oral antibiotics for postdischarge treatment of complicated pneumonia. Pediatrics. 2016;138(6). https://doi.org/10.1542/peds.2016-1692.
3. Keren R, Shah SS, Srivastava R, et al. Comparative effectiveness of intravenous vs oral antibiotics for postdischarge treatment of acute osteomyelitis in children. JAMA Pediatr. 2015;169(2):120-128. https://doi.org/10.1001/jamapediatrics.2014.2822.
4. Jumani K, Advani S, Reich NG, Gosey L, Milstone AM. Risk factors for peripherally inserted central venous catheter complications in children. JAMA Pediatr. 2013;167(5):429-435. https://doi.org/10.1001/jamapediatrics.2013.775.
5. Zaoutis T, Localio AR, Leckerman K, et al. Prolonged intravenous therapy versus early transition to oral antimicrobial therapy for acute osteomyelitis in children. Pediatrics. 2009;123(2):636-642. https://doi.org/10.1542/peds.2008-0596.
6. McMullan BJ, Andresen D, Blyth CC, et al. Antibiotic duration and timing of the switch from intravenous to oral route for bacterial infections in children: systematic review and guidelines. Lancet Infect Dis. 2016;16(8):e139-e152. https://doi.org/10.1016/S1473-3099(16)30024-X.
7. Bradley JS, Byington CL, Shah SS, et al. The management of community-acquired pneumonia in infants and children older than 3 months of age: clinical practice guidelines by the Pediatric Infectious Diseases Society and the Infectious Diseases Society of America. Clin Infect Dis. 2011;53(7):e25-e76. https://doi.org/10.1093/cid/cir531.
8. Stevens DL, Bisno AL, Chambers HF, et al. Executive summary: practice guidelines for the diagnosis and management of skin and soft tissue infections: 2014 update by the infectious diseases society of America. Clin Infect Dis. 2014;59(2):147-159. https://doi.org/10.1093/cid/ciu444.
9. MacGregor RR, Graziani AL. Oral administration of antibiotics: a rational alternative to the parenteral route. Clin Infect Dis. 1997;24(3):457-467. https://doi.org/10.1093/clinids/24.3.457.
10. Downes KJ, Hahn A, Wiles J, Courter JD, Vinks AA. Dose optimisation of antibiotics in children: application of pharmacokinetics/pharmacodynamics in paediatrics. Int J Antimicrob Agents. 2014;43(3):223-230. https://doi.org/10.1016/j.ijantimicag.2013.11.006.
11. Autmizguine J, Melloni C, Hornik CP, et al. Population pharmacokinetics of trimethoprim-sulfamethoxazole in infants and children. Antimicrob Agents Chemother. 2018;62(1):e01813-e01817. https://doi.org/10.1128/AAC.01813-17.
12. Dewan M, Herrmann LE, Tchou MJ, et al. Development and evaluation of high-value pediatrics: a high-value care pediatric resident curriculum. Hosp Pediatr. 2018;8(12):785-792. https://doi.org/10.1542/hpeds.2018-0115.
13. Langley GJ, Moen RD, Nolan KM, Nolan TW, Norman CL, Provost LP. The Improvement Guide: A Practical Approach to Enhancing Organizational Performance. New Jersey, US: John Wiley & Sons; 2009.
14. Benneyan JC. Use and interpretation of statistical quality control charts. Int J Qual Health Care. 1998;10(1):69-73. https://doi.org/10.1093/intqhc/10.1.69.
15. Lorgelly PK, Atkinson M, Lakhanpaul M, et al. Oral versus i.v. antibiotics for community-acquired pneumonia in children: a cost-minimisation analysis. Eur Respir J. 2010;35(4):858-864. https://doi.org/10.1183/09031936.00087209.
16. Vidyarthi AR, Green AL, Rosenbluth G, Baron RB. Engaging residents and fellows to improve institution-wide quality: the first six years of a novel financial incentive program. Acad Med. 2014;89(3):460-468. https://doi.org/10.1097/ACM.0000000000000159.
17. Stinnett-Donnelly JM, Stevens PG, Hood VL. Developing a high value care programme from the bottom up: a programme of faculty-resident improvement projects targeting harmful or unnecessary care. BMJ Qual Saf. 2016;25(11):901-908. https://doi.org/10.1136/bmjqs-2015-004546.
18. Peterson D, McLeod S, Woolfrey K, McRae A. Predictors of failure of empiric outpatient antibiotic therapy in emergency department patients with uncomplicated cellulitis. Acad Emerg Med. 2014;21(5):526-531. https://doi.org/10.1111/acem.12371.
19. Yadav K, Suh KN, Eagles D, et al. Predictors of oral antibiotic treatment failure for non-purulent skin and soft tissue infections in the emergency department. Acad Emerg Med. 2018;20(S1):S24-S25. https://doi.org/10.1017/cem.2018.114.
20. Organisation for Economic Co-operation and Development. Healthcare costs unsustainable in advanced economies without reform. http://www.oecd.org/health/healthcarecostsunsustainableinadvancedeconomieswithoutreform.htm. Published 2015. Accessed June 28, 2018.
21. Berwick DM, Hackbarth AD. Eliminating waste in US health care. JAMA. 2012;307(14):1513-1516. https://doi.org/10.1001/jama.2012.362.
22. Schondelmeyer AC, Simmons JM, Statile AM, et al. Using quality improvement to reduce continuous pulse oximetry use in children with wheezing. Pediatrics. 2015;135(4):e1044-e1051. https://doi.org/10.1542/peds.2014-2295.
23. Tchou MJ, Tang Girdwood S, Wormser B, et al. Reducing electrolyte testing in hospitalized children by using quality improvement methods. Pediatrics. 2018;141(5). https://doi.org/10.1542/peds.2017-3187.
24. Christensen EW, Spaulding AB, Pomputius WF, Grapentine SP. Effects of hospital practice patterns for antibiotic administration for pneumonia on hospital lengths of stay and costs. J Pediatr Infect Dis Soc. 2019;8(2):115-121. https://doi.org/10.1093/jpids/piy003.
25. Berrevoets MAH, Pot JHLW, Houterman AE, et al. An electronic trigger tool to optimise intravenous to oral antibiotic switch: a controlled, interrupted time series study. Antimicrob Resist Infect Control. 2017;6:81. https://doi.org/10.1186/s13756-017-0239-3.
26. Fischer MA, Solomon DH, Teich JM, Avorn J. Conversion from intravenous to oral medications: assessment of a computerized intervention for hospitalized patients. Arch Intern Med. 2003;163(21):2585-2589. https://doi.org/10.1001/archinte.163.21.2585.
27. Schroeder AR, Harris SJ, Newman TB. Safely doing less: a missing component of the patient safety dialogue. Pediatrics. 2011;128(6):e1596-e1597. https://doi.org/10.1542/peds.2011-2726.
Examining the “Repletion Reflex”: The Association between Serum Potassium and Outcomes in Hospitalized Patients with Heart Failure
Heart failure (HF) is a leading cause of hospital admission and mortality, accounting for approximately 900,000 discharges in 2014.1 One-year all-cause mortality risk has been estimated at 17% after hospitalization,2 and roughly 50% of deaths are related to sudden cardiac death, mostly due to ventricular arrhythmia.
The principles underlying potassium management in acute HF are complex. Both low and high values have been linked to fatal arrhythmias, notably ventricular fibrillation, and small serum changes often reflect large total body potassium fluctuations.11 Recent literature links hypokalemia to general membrane hypoexcitability, skeletal muscle hyporeflexia, and arrhythmias initiated by reduced sodium-potassium adenosine triphosphatase activity, leading to increased intracellular calcium and regional variations in action potential duration.12 Potassium abnormalities are common at admission and may be exacerbated by both acute illness and treatments given during hospitalization; contributing factors include baseline potassium level, acute kidney injury, aggressive diuretic therapy, and other potassium-related treatments and conditions.13 The success of potassium repletion may also be affected by the choice of HF therapies.14
The belief that patients with HF must maintain a serum potassium level >4.0 mEq/L remains pervasive; at least one family medicine guideline explicitly recommends this threshold.
METHODS
Data Sources and Cohort Definition
The Institutional Review Board at Baystate Medical Center approved this study. We identified patients with HF who were admitted for more than 72 hours between January 2010 and December 2012 to hospitals contributing to the HealthFacts database, a multihospital dataset derived from the comprehensive electronic health records of 116 geographically and structurally diverse hospitals throughout the United States (Cerner Corp.). HealthFacts—which includes date-stamped pharmacy, laboratory, and billing information—contains records of more than 84 million acute admissions, emergency room visits, and ambulatory visits. We limited the sample to hospitals that contributed to the pharmacy, laboratory, and diagnosis segments.
We included patients who had a principal International Classification of Disease (ICD-9-CM) diagnosis of HF or a principal diagnosis of respiratory failure with secondary diagnosis of HF (ICD-9-CM codes for HF: 402.01, 402.11, 402.91, 404.01, 404.03, 404.11, 404.13, 404.91, 404.93, 428.xx16 and for respiratory failure: 518.81, 518.82, 518.84) and were 18 years or older. We ensured that patients were treated for acute decompensated HF during the hospitalization by restricting the cohort to patients in whom at least one HF therapy (eg, loop diuretics, metolazone, inotropes, and intra-aortic balloon pump) was initiated within the first two days of hospitalization. We excluded patients with a pediatric or psychiatric attending physician, those with elective admissions, and those who were transferred from or to another acute care facility because we could not accurately determine the onset or subsequent course of their illness.
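A minimal sketch of these inclusion criteria in Python is shown below; the dictionary keys, the therapy list, and the code handling are illustrative assumptions, and the exclusions described above (elective admissions, transfers, pediatric or psychiatric attendings, stays of 72 hours or less) are omitted for brevity.

```python
HF_CODES = {"402.01", "402.11", "402.91", "404.01", "404.03", "404.11",
            "404.13", "404.91", "404.93"}           # any 428.xx code also qualifies
RESP_FAILURE_CODES = {"518.81", "518.82", "518.84"}
HF_THERAPIES = {"furosemide", "bumetanide", "torsemide", "metolazone",
                "milrinone", "dobutamine"}          # illustrative examples only

def meets_inclusion_criteria(encounter: dict) -> bool:
    """Apply the diagnosis and early-HF-therapy criteria to one encounter."""
    def is_hf(code: str) -> bool:
        return code in HF_CODES or code.startswith("428")

    principal, secondary = encounter["principal_dx"], encounter["secondary_dx"]
    diagnosis_ok = is_hf(principal) or (
        principal in RESP_FAILURE_CODES and any(is_hf(c) for c in secondary)
    )
    # At least one HF therapy initiated within the first two hospital days.
    early_hf_therapy = any(
        med["name"].lower() in HF_THERAPIES and med["hospital_day"] <= 2
        for med in encounter["medications"]
    )
    return encounter["age"] >= 18 and diagnosis_ok and early_hf_therapy
```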
Definition of Variables Describing Serum Potassium Levels
We limited the sample to patients hospitalized for longer than 72 hours in order to observe how initial potassium values influenced outcomes over the course of hospitalization. We chose an exposure window of 72 hours because this allowed, on average, three potential observations of serum potassium per patient. We further restricted the sample to those who had a normal potassium value (3.5-5.0 mEq/L) at admission (defined as 24 hours prior to admission through midnight of the day of admission) to ensure that the included patients did not have abnormal potassium values upon presentation. We identified the period of time from 24 hours prior to admission through 72 hours following admission as “the exposure window” (the time during which patients were eligible to be classified into average serum potassium levels of <4.0, 4.0-4.5, or >4.5 mEq/L). We excluded patients who, during this window, had fewer than three serum potassium levels drawn (“exposure” levels could be disproportionately influenced by a single value) or received sodium polystyrene (as this would indicate that the physicians felt the potassium was dangerously high). For patients with repeated hospitalizations, we randomly selected one visit for inclusion to reduce the risk of survivor bias. We calculated the mean of all serum potassium levels during the exposure window, including the admission value, and then evaluated two different categorizations of mean serum potassium, based on categories of risk previously reported in the literature8,17,18: (1) <4.0, 4.0-4.5, or >4.5 mEq/L and (2) <4.0 versus ≥4.0 mEq/L.
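The exposure definition can be sketched as follows in Python with pandas; the column names and function signature are hypothetical, and the sketch assumes the admission-value, sodium polystyrene, and repeat-visit exclusions have already been applied.

```python
from typing import Optional
import pandas as pd

def mean_potassium_category(labs: pd.DataFrame, admit_time: pd.Timestamp) -> Optional[str]:
    """Average all serum potassium draws from 24 hours before admission through
    72 hours after admission, then assign the three-level exposure category.
    Returns None when the encounter should be excluded (fewer than three draws)."""
    window = labs[
        (labs["draw_time"] >= admit_time - pd.Timedelta(hours=24))
        & (labs["draw_time"] <= admit_time + pd.Timedelta(hours=72))
    ]
    if len(window) < 3:
        return None  # a single value could disproportionately drive the exposure
    mean_k = window["potassium_meq_l"].mean()
    if mean_k < 4.0:
        return "<4.0"
    if mean_k <= 4.5:
        return "4.0-4.5"
    return ">4.5"
```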
Outcomes
We assessed three outcomes: in-hospital mortality, transfer to an intensive care unit (ICU), and length of stay (LOS). Admission to the ICU was defined as any evidence, after the exposure window, that the patient received care in the ICU. We excluded patients with ICU admissions during the exposure window from the analysis of this outcome. We calculated LOS as the difference between discharge date/time and the admission date/time.
Covariates and Comorbidity Adjustment
We obtained information on patient demographics (age and race) and identified the presence of comorbid conditions using previously derived and validated models.19,20 We then further quantified these conditions into a single combined score to adjust for differences in presenting illness severity (including kidney disease) and help reduce confounding.21 To account for presenting severity of illness, we calculated the Laboratory-based Acute Physiology Score (LAPS-2).22,23 LAPS-2 was developed for predicting mortality risk in general medical patients, but we previously externally validated it against other published clinical HF models in a cohort of patients hospitalized with acute decompensated HF.5
Potassium Repletion
Analysis
We evaluated the differences in patient characteristics across serum potassium categories. Categorical variables are presented as frequencies and percentages, whereas continuous variables are presented as means and standard deviations. For binary outcomes, we used generalized estimating equations (with a binomial family and logit link and clustering by hospital) to estimate incidence and calculate unadjusted and adjusted odds ratios (ORs) and 95% confidence intervals (CIs). For LOS, we estimated the median and 95% CIs using quantile regression with clustered standard errors.24 We calculated all models using both a binary exposure (<4.0 versus ≥4.0 mEq/L) and a three-level categorization (<4.0, 4.0-4.5, and >4.5 mEq/L) to explore the effects at the highest potassium level. We adjusted all models for age, race, LAPS-2 score, and combined comorbidity score. We conducted two sensitivity analyses. First, we restricted our sample to those who never received potassium during the exposure window, as these patients may be different from patients who required potassium repletion. Second, we stratified our findings by the presence or absence of acute or chronic renal insufficiency (defined as an admission creatinine >1 mg/dL or the presence of a diagnostic code for renal insufficiency, as defined by Elixhauser et al.).19,21 Statistical significance was set at an alpha of 0.05. Analysis was completed using Stata v15.1 (StataCorp LP, College Station, Texas).
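A rough sketch of these models using the Python statsmodels package is shown below (the study itself used Stata); the data frame columns are assumed names, and the quantile regression shown does not reproduce the hospital-clustered standard errors of the cited method.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Assumed columns in df (one row per encounter): died (0/1), los_days,
# k_cat ("<4.0", "4.0-4.5", ">4.5"), age, race, laps2, comorbidity, hospital_id.
RHS = "C(k_cat, Treatment(reference='<4.0')) + age + C(race) + laps2 + comorbidity"

def fit_models(df: pd.DataFrame):
    # Binary outcome: GEE with binomial family, logit link, and an exchangeable
    # working correlation, clustering encounters within hospitals.
    gee_fit = smf.gee(
        f"died ~ {RHS}",
        groups="hospital_id",
        data=df,
        family=sm.families.Binomial(),
        cov_struct=sm.cov_struct.Exchangeable(),
    ).fit()
    adjusted_ors = np.exp(gee_fit.params)

    # LOS: median (quantile) regression. A cluster bootstrap over hospital_id
    # would be one way to approximate hospital-clustered standard errors.
    median_fit = smf.quantreg(f"los_days ~ {RHS}", data=df).fit(q=0.5)
    return adjusted_ors, median_fit.params
```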
RESULTS
Cohort Description
We identified patients from 56 geographically diverse US hospitals, although most were located in either the northeast (n = 21; 38%) or south (n = 18; 32%). A total of 59% of the hospitals were teaching hospitals, and nearly 95% were in an urban setting. We identified 13,163 patients with HF, of whom 4,995 (38.0%) met the inclusion criteria. We excluded 3,744 (28.4%) patients with LOS < 72 hours, 2,210 (16.8%) with admission potassium values outside of the defined range, and 896 (6.8%) with fewer than three potassium values during the exposure window. Of the patients who met the inclusion criteria, 2,080 (41.6%), 2,326 (46.6%), and 589 (11.8%) were categorized in the <4.0, 4.0-4.5, and >4.5 mEq/L groups, respectively (Table 1). The groups were clinically similar in terms of age, sex, illness severity (LAPS-2), and comorbidity score. Compared with other racial groups, black patients had higher potassium values. While the <4.0 and 4.0-4.5 mEq/L groups were relatively similar, the group with mean potassium >4.5 mEq/L had higher admission creatinine and a greater prevalence of chronic kidney disease, deficiency anemias, and chronic obstructive pulmonary disease (Table 1).
Serum Potassium Values
Individuals’ mean serum potassium within the 72-hour exposure window ranged from 2.9 to 5.8 mEq/L (Table 2). In the <4.0, 4.0-4.5, and >4.5 mEq/L cohorts, respectively, patients had a median serum potassium of 3.8 mEq/L (2.9-3.9), 4.2 mEq/L (4.0-4.5), and 4.7 mEq/L (4.5-5.8) during the exposure window. Approximately half of the patients in the <4.0 mEq/L group had a serum potassium <3.5 mEq/L at some point during the exposure window. In contrast, fewer than 10% of patients in the other groups had a value this low during the exposure window.
Potassium Repletion
Patients in the <4.0 mEq/L group were much more likely to receive potassium repletion during the exposure window when compared with the 4.0-4.5 mEq/L (71.5% vs 40.5%) and >4.5 mEq/L (71.5% vs 26.7%) groups. On days that they were eligible for repletion (defined as a daily potassium value <4.0 mEq/L), patients with mean serum potassium >4.0 mEq/L were less likely to receive potassium repletion compared with those with values <4.0 mEq/L. There were 592 (28.5%), 1,383 (59.5%), and 432 (73.3%) patients in the <4.0, 4.0-4.5, and >4.5 mEq/L groups, respectively, who did not receive potassium repletion therapy during the exposure window.
Relationship of Serum Potassium Levels and Outcomes
Overall, 3.7% (n = 187) of patients died during the hospitalization, 2.4% (n = 98) were admitted to the ICU after the exposure window, and the median LOS was 5.6 days. We did not observe a significant association between mean serum potassium of <4.0 or 4.0-4.5 mEq/L and increased risk of mortality, ICU transfer, or LOS (Table 3). Our unadjusted analysis showed that patients with values >4.5 mEq/L had worse outcomes, including more deaths (5.3%; OR = 1.55; 95% CI: 1.01 to 2.39) and ICU admission (3.8%; OR = 2.10; 95% CI: 1.16 to 3.80) compared with those with values <4.0 mEq/L (Table 3). We also found that, compared with the <4.0 mEq/L group, the >4.5 mEq/L group showed just over a half-day longer LOS (0.6 days; 95% CI: 0.0 to 1.0; Table 3). However, we found that mortality and ICU admission results were attenuated after adjustment for age, race, comorbidity score, and LAPS-2 and were no longer statistically significant, whereas the association with LOS was consistent after adjustment. When using a binary exposure (<4.0 versus ≥4.0 mEq/L), we observed no association between mean potassium value and increased risk of mortality, ICU transfer, or LOS both before and after adjustment for age, race, LAPS-2, and comorbidity score (data not shown).
Sensitivity Analyses
In the sensitivity analysis restricted to those who did not receive potassium repletion during the exposure window, we continued to observe no association between the <4.0 and 4.0-4.5 mEq/L groups and outcomes (Table 3). In adjusted models for the >4.5 versus <4.0 mEq/L groups, risk estimates for mortality were similar to the full sample, but statistical significance was lost (OR = 1.56; 95% CI: 0.81 to 3.01). Adjusted risk estimates for ICU transfer were attenuated and not statistically significant (OR = 1.40; 95% CI: 0.60 to 3.26). However, LOS estimates were very similar to that observed in the full dataset (0.6 days; 95% CI: 0.1 to 1.2).
When stratifying our results by the presence or absence of acute or chronic renal insufficiency, we continued to observe no increased risk of any outcome in the 4.0-4.5 mEq/L compared with the <4.0 mEq/L groups across all strata (Table 4). Interestingly, even after adjustment, we did find that most of the increased risk of mortality and ICU admission in the >4.5 versus <4.0 mEq/L groups was among those without renal insufficiency (mortality OR = 3.03; ICU admission OR = 3.00) and was not statistically significant in those with renal insufficiency (mortality OR = 1.27; ICU admission OR = 1.63). Adjusted LOS estimates remained relatively similar in this stratified analysis.
DISCUSSION
The best approach to mild serum potassium value abnormalities in patients hospitalized with HF remains unclear. Many physicians reflexively replete potassium to ensure all patients maintain a serum value of >4.0 mEq/L.15 Yet, in this large observational study of patients hospitalized with an acute HF exacerbation, we found little evidence of association between serum potassium <4.0 mEq/L and negative outcomes.
Compared with those with mean potassium values <4.0 mEq/L (in unadjusted models), there was an association between potassium values of >4.5 mEq/L and increased risk of mortality and ICU transfer. This association was attenuated after adjustment, suggesting that factors beyond potassium values influenced the observed relationship. These findings suggest that unobserved differences in the >4.5 mEq/L group, rather than average potassium value, account for the observed differences in outcomes; the presence of observed differences in this group (eg, greater presenting severity and higher comorbidity scores) makes it likely that unobserved differences existed as well. However, we cannot rule out the possibility that potassium >4.5 mEq/L has some associated increased risk compared with mean potassium values of <4.0 mEq/L for patients hospitalized with acute decompensated HF.
Patients in our study routinely received exogenous potassium: more than 70% of patients received repletion at least once, although it is notable that the majority of patients in the 4.0-4.5 and >4.5 mEq/L groups did not receive repletion. Despite this practice, the data supporting this approach to potassium management for patients hospitalized with HF remain mixed. A serum potassium decline of >15% during an acute HF hospital stay has been reported as a predictor of all-cause mortality after controlling for disease severity and associated comorbidities, including renal function.25 However, this study was focused on decline in admission potassium rather than an absolute cut-off (eg, >4.0 mEq/L). Additionally, potassium levels <3.9 mEq/L were associated with increased mortality in patients with acute HF following a myocardial infarction, but that study was restricted to a post-infarction population rather than patients hospitalized primarily for HF.26 Most of the prior literature in patients with HF was conducted in patients in outpatient settings and examined patients who were not experiencing acute exacerbations. MacDonald and Struthers advocate that patients with HF have their potassium maintained above 4.0 mEq/L but did not specify whether this included patients with acute HF exacerbations.10 Additionally, many studies evaluating potassium repletion were conducted before widespread availability of angiotensin-converting enzyme (ACE) inhibitors or potassium-sparing diuretics, including spironolactone. Prior work has consistently reported that hyperkalemia, defined as serum potassium >4.5 mEq/L, is associated with mortality in patients with acute HF over the course of hospitalization (which aligned with the results from our sensitivity analysis), but concurrent medication regimens and underlying impaired renal function likely accounted for most of this association.17 The picture is further complicated as patients with acute HF presenting with hypokalemia may be at risk for subsequent hyperkalemia, and potassium repletion can stimulate aldosterone secretion, potentially exacerbating underlying HF.27,28
These data are observational and are unlikely to change practice on their own. However, daily potassium repletion represents a substantial cost in time, money, and effort to the health system. Furthermore, the greatest burden falls on patients, who have labs drawn and values checked routinely and potassium administered orally or parenterally. While randomized clinical trials (RCTs) would best define the benefits of repletion, pragmatic trials could, in their absence, attempt to disentangle the associated risks and benefits of potassium repletion. Additionally, such studies could better take into account the role of concurrent medication use (such as ACE inhibitors or angiotensin II receptor blockers), as well as assess the role of chronic renal insufficiency, acute kidney injury, and magnesium levels.29
This study has limitations. Its retrospective design makes it subject to unmeasured confounding; however, we adjusted for multiple variables (including LAPS-2) that reflect the severity of disease and underlying kidney function at presentation, as well as other comorbid conditions. In addition, the cohort data extend only to 2012, so more recent changes in practice may not be fully reflected. The nature of the data did not allow us to directly investigate the relationship between serum potassium and arrhythmias, although ICU transfer and mortality were used as surrogates.
In conclusion, the benefit of a serum potassium level >4.0 mEq/L in patients admitted with HF remains unclear. We did not observe that mean potassium values <4.0 mEq/L were associated with worse outcomes, and, more concerning, there may be some risk for patients with mean values >4.5 mEq/L.
Acknowledgments
Dr. Lagu had full access to all the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.
Disclosures
The authors report no potential conflicts of interest. Dr. Lagu has served as a consultant for the Yale Center for Outcomes Research and Evaluation, under contract to the Centers for Medicare and Medicaid Services, for which she has provided clinical and methodological expertise and input on the development, reevaluation, and implementation of hospital outcome and efficiency measures.
Funding
Dr. Lagu is supported by the National Heart, Lung, and Blood Institute of the National Institutes of Health under Award Number K01HL114745 and R01 HL139985-01A1. Dr. Stefan is supported by the National Heart, Lung, and Blood Institute of the National Institutes of Health under Award Number K01HL114631-01A1. Dr. Pack is supported by NHLBI 1K23HL135440. Dr. Lindenauer is supported by the National Heart, Lung, and Blood Institute of the National Institutes of Health under Award Number 1K24HL132008.
Disclaimer
The views expressed in this manuscript do not necessarily reflect those of the Yale Center for Outcomes Research and Evaluation or the Centers for Medicare and Medicaid Services.
1. Benjamin EJ, Virani SS, Callaway CW, et al. Heart disease and stroke statistics–2018 update: a report from the American Heart Association. Circulation. 2018;137(12):e67-e492. https://doi.org/10.1161/CIR.0000000000000558.
2. Maggioni AP, Dahlström U, Filippatos G, et al. EURObservational Research Programme: regional differences and 1-year follow-up results of the Heart Failure Pilot Survey (ESC-HF Pilot). Eur J Heart Fail. 2013;15(7):808-817. https://doi.org/10.1093/eurjhf/hft050.
3. Tomaselli GF, Zipes DP. What causes sudden death in heart failure? Circ Res. 2004;95(8):754-763. https://doi.org/10.1161/01.RES.0000145047.
4. Bowen GS, Diop MS, Jiang L, Wu W-C, Rudolph JL. A multivariable prediction model for mortality in individuals admitted for heart failure. J Am Geriatr Soc. 2018;66(5):902-908. https://doi.org/10.1111/jgs.15319.
5. Lagu T, Pekow PS, Shieh M-S, et al. Validation and comparison of seven mortality prediction models for hospitalized patients with acute decompensated heart failure. Circ Heart Fail. 2016;9(8). https://doi.org/10.1161/CIRCHEARTFAILURE.115.002912.
6. Núñez J, Bayés-Genís A, Zannad F, et al. Long-term potassium monitoring and dynamics in heart failure and risk of mortality. Circulation. 2018;137(13):1320-1330. https://doi.org/10.1161/CIRCULATIONAHA.117.030576.
7. Vardeny O, Claggett B, Anand I, et al. Incidence, predictors, and outcomes related to hypo- and hyperkalemia in patients with severe heart failure treated with a mineralocorticoid receptor antagonist. Circ Heart Fail. 2014;7(4):573-579. https://doi.org/10.1161/CIRCHEARTFAILURE.114.00110.
8. Aldahl M, Jensen A-SC, Davidsen L, et al. Associations of serum potassium levels with mortality in chronic heart failure patients. Eur Heart J. 2017;38(38):2890-2896. https://doi.org/10.1093/eurheartj/ehx460.
9. Hoppe LK, Muhlack DC, Koenig W, Carr PR, Brenner H, Schöttker B. Association of abnormal serum potassium levels with arrhythmias and cardiovascular mortality: a systematic review and meta-analysis of observational studies. Cardiovasc Drugs Ther. 2018;32(2):197-212. https://doi.org/10.1007/s10557-018-6783-0.
10. Macdonald JE, Struthers AD. What is the optimal serum potassium level in cardiovascular patients? J Am Coll Cardiol. 2004;43(2):155-161. https://doi.org/10.1016/j.jacc.2003.06.021.
11. Hulting J. In-hospital ventricular fibrillation and its relation to serum potassium. Acta Med Scand Suppl. 1981;647(647):109-116. https://doi.org/10.1111/j.0954-6820.1981.tb02646.x.
12. Skogestad J, Aronsen JM. Hypokalemia-induced arrhythmias and heart failure: new insights and implications for therapy. Front Physiol. 2018;9:1500. https://doi.org/10.3389/fphys.2018.01500.
13. Tromp J, Ter Maaten JM, Damman K, et al. Serum potassium levels and outcome in acute heart failure (data from the PROTECT and COACH trials). Am J Cardiol. 2017;119(2):290-296. https://doi.org/10.1016/j.amjcard.2016.09.038.
14. Khan SS, Campia U, Chioncel O, et al. Changes in serum potassium levels during hospitalization in patients with worsening heart failure and reduced ejection fraction (from the EVEREST trial). Am J Cardiol. 2015;115(6):790-796. https://doi.org/10.1016/j.amjcard.2014.12.045.
15. Viera AJ, Wouk N. Potassium disorders: hypokalemia and hyperkalemia. Am Fam Physician. 2015;92(6):487-495.
16. Krumholz HM, Wang Y, Mattera JA, et al. An administrative claims model suitable for profiling hospital performance based on 30-day mortality rates among patients with heart failure. Circulation. 2006;113(13):1693-1701. https://doi.org/10.1161/CIRCULATIONAHA.105.611194.
17. Legrand M, Ludes P-O, Massy Z, et al. Association between hypo- and hyperkalemia and outcome in acute heart failure patients: the role of medications. Clin Res Cardiol. 2018;107(3):214-221. https://doi.org/10.1007/s00392-017-1173-3.
18. Kok W, Salah K, Stienen S. Are changes in serum potassium levels during admissions for acute decompensated heart failure irrelevant for prognosis: the end of the story? Am J Cardiol. 2015;116(5):825. https://doi.org/10.1016/j.amjcard.2015.05.059.
19. Elixhauser A, Steiner C, Harris DR, Coffey RM. Comorbidity measures for use with administrative data. Med Care. 1998;36(1):8-27. https://doi.org/10.1097/00005650-199801000-00004.
20. Quan H, Parsons GA, Ghali WA. Validity of information on comorbidity derived from ICD-9-CCM administrative data. Med Care. 2002;40(8):675-685. https://doi.org/10.1097/01.MLR.0000020927.46398.5D.
21. Gagne JJ, Glynn RJ, Avorn J, Levin R, Schneeweiss S. A combined comorbidity score predicted mortality in elderly patients better than existing scores. J Clin Epidemiol. 2011;64(7):749-759. https://doi.org/10.1016/j.jclinepi.2010.10.004.
22. Escobar GJ, Gardner MN, Greene JD, Draper D, Kipnis P. Risk-adjusting hospital mortality using a comprehensive electronic record in an integrated health care delivery system. Med Care. 2013;51(5):446-453. https://doi.org/10.1097/MLR.0b013e3182881c8e.
23. Escobar GJ, Greene JD, Scheirer P, Gardner MN, Draper D, Kipnis P. Risk-adjusting hospital inpatient mortality using automated inpatient, outpatient, and laboratory databases. Med Care. 2008;46(3):232-239. https://doi.org/10.1097/MLR.0b013e3181589bb6.
24. Parente PMDC, Santos Silva JMC. Quantile regression with clustered data. J Econom Method. 2016;5(1):1-15. https://doi.org/10.1515/jem-2014-0011.
25. Salah K, Pinto YM, Eurlings LW, et al. Serum potassium decline during hospitalization for acute decompensated heart failure is a predictor of 6-month mortality, independent of N-terminal pro-B-type natriuretic peptide levels: An individual patient data analysis. Am Heart J. 2015;170(3):531-542.e1. https://doi.org/10.1016/j.ahj.2015.06.003.
26. Krogager ML, Eggers-Kaas L, Aasbjerg K, et al. Short-term mortality risk of serum potassium levels in acute heart failure following myocardial infarction. Eur Heart J Cardiovasc Pharmacother. 2015;1(4):245-251. https://doi.org/10.1093/ehjcvp/pvv026.
27. Crop MJ, Hoorn EJ, Lindemans J, Zietse R. Hypokalaemia and subsequent hyperkalaemia in hospitalized patients. Nephrol Dial Transplant. 2007;22(12):3471-3477. https://doi.org/10.1093/ndt/gfm471.
28. Kok W, Salah K, Stienen S. Serum potassium levels during admissions for acute decompensated heart failure: identifying possible threats to outcome. Am J Cardiol. 2018;121(1):141. https://doi.org/10.1016/j.amjcard.2017.09.032.
29. Freda BJ, Knee AB, Braden GL, Visintainer PF, Thakar CV. Effect of transient and sustained acute kidney injury on readmissions in acute decompensated heart failure. Am J Cardiol. 2017;119(11):1809-1814. https://doi.org/10.1016/j.amjcard.2017.02.044.
Heart failure (HF) is a leading cause of hospital admission and mortality, accounting for approximately 900,000 discharges in 2014.1 One-year all-cause mortality risk has been estimated at 17% after hospitalization,2 and roughly 50% of deaths are related to sudden cardiac death, mostly due to ventricular arrhythmia.
The principles underlying potassium management in acute HF are complex. Both low and high values have been linked to fatal arrhythmias, notably ventricular fibrillation, and small serum changes often reflect large total body potassium fluctuations.11 Recent literature links hypokalemia to general membrane hypoexcitability, skeletal muscle hyporeflexia, and arrhythmias initiated by reduced sodium-potassium adenosine triphosphatase activity, leading to increased intracellular calcium and regional variations in action potential duration.12 Potassium abnormalities are common at admission and may be exacerbated by both acute illness and treatments given during hospitalization, including baseline potassium, acute kidney injury, aggressive diuretic therapy, or other potassium-related treatments and conditions.13 The success of potassium repletion may also be affected by the choice of HF therapies.14
The belief that patients with HF must maintain a potassium >4.0 mEq/L remains pervasive, with at least one family medicine guideline recommending that patients with HF maintain a serum potassium level >4.0 mEq/L.
METHODS
Data Sources and Cohort Definition
The Institutional Review Board at Baystate Medical Center approved this study. We identified patients with HF who were admitted for more than 72 hours between January 2010 and December 2012 to hospitals contributing to the HealthFacts database, a multihospital dataset derived from the comprehensive electronic health records of 116 geographically and structurally diverse hospitals throughout the United States (Cerner Corp.). HealthFacts—which includes date-stamped pharmacy, laboratory, and billing information—contains records of more than 84 million acute admissions, emergency room visits, and ambulatory visits. We limited the sample to hospitals that contributed to the pharmacy, laboratory, and diagnosis segments.
We included patients who had a principal International Classification of Disease (ICD-9-CM) diagnosis of HF or a principal diagnosis of respiratory failure with secondary diagnosis of HF (ICD-9-CM codes for HF: 402.01, 402.11, 402.91, 404.01, 404.03, 404.11, 404.13, 404.91, 404.93, 428.xx16 and for respiratory failure: 518.81, 518.82, 518.84) and were 18 years or older. We ensured that patients were treated for acute decompensated HF during the hospitalization by restricting the cohort to patients in whom at least one HF therapy (eg, loop diuretics, metolazone, inotropes, and intra-aortic balloon pump) was initiated within the first two days of hospitalization. We excluded patients with a pediatric or psychiatric attending physician, those with elective admissions, and those who were transferred from or to another acute care facility because we could not accurately determine the onset or subsequent course of their illness.
Definition of Variables Describing Serum Potassium Levels
We limited the sample to patients hospitalized for longer than 72 hours in order to observe how initial potassium values influenced outcomes over the course of hospitalization. We chose an exposure window of 72 hours because this allowed, on average, three potential observations of serum potassium per patient. We further restricted the sample to those who had a normal potassium value (3.5-5.0 mEq/L) at admission (defined as 24 hours prior to admission through midnight of the day of admission) to ensure that the included patients did not have abnormal potassium values upon presentation. We identified the period from 24 hours prior to admission through 72 hours following admission as “the exposure window” (the time during which patients were eligible to be classified into average serum potassium levels of <4.0, 4.0-4.5, or >4.5 mEq/L). We excluded patients who, during this window, had fewer than three serum potassium levels drawn (“exposure” levels could be disproportionately influenced by a single value) or received sodium polystyrene sulfonate (as this would indicate that the physicians felt the potassium was dangerously high). For patients with repeated hospitalizations, we randomly selected one visit for inclusion to reduce the risk of survivor bias. We calculated the mean of all serum potassium levels during the exposure window, including the admission value, and then evaluated two different categorizations of mean serum potassium, based on categories of risk previously reported in the literature8,17,18: (1) <4.0, 4.0-4.5, or >4.5 mEq/L and (2) <4.0 versus ≥4.0 mEq/L.
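A minimal sketch of this exposure classification, assuming each patient's potassium draws are stored with hours relative to admission (the field names are hypothetical):

```python
import pandas as pd

def classify_mean_potassium(mean_k: float) -> str:
    """Three-level exposure categories used in the analysis."""
    if mean_k < 4.0:
        return "<4.0"
    if mean_k <= 4.5:
        return "4.0-4.5"
    return ">4.5"

# hours_from_admission: negative values are draws taken before admission
draws = pd.DataFrame({
    "patient_id": [1, 1, 1, 2, 2, 2],
    "hours_from_admission": [-6, 20, 60, -2, 30, 70],
    "potassium": [3.6, 3.8, 3.9, 4.4, 4.6, 4.8],
})

# Exposure window: 24 hours before admission through 72 hours after admission
window = draws[(draws["hours_from_admission"] >= -24) & (draws["hours_from_admission"] <= 72)]
summary = window.groupby("patient_id")["potassium"].agg(["mean", "count"])
summary = summary[summary["count"] >= 3]                  # drop patients with <3 draws
summary["exposure_group"] = summary["mean"].apply(classify_mean_potassium)
print(summary)
```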
Outcomes
We assessed three outcomes: in-hospital mortality, transfer to an intensive care unit (ICU), and length of stay (LOS). Admission to the ICU was defined as any evidence, after the exposure window, that the patient received care in the ICU. We excluded patients with ICU admissions during the exposure window from the analysis of this outcome. We calculated LOS as the difference between discharge date/time and the admission date/time.
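For illustration only, the LOS and post-window ICU outcome could be derived as in the brief sketch below; the timestamps and the `icu_start` variable are assumed for the example, not drawn from the dataset.

```python
from datetime import datetime, timedelta

admit = datetime(2011, 3, 1, 14, 30)
discharge = datetime(2011, 3, 7, 11, 0)
icu_start = datetime(2011, 3, 5, 2, 0)   # None if the patient never received ICU care

# LOS: difference between discharge and admission date/time, expressed in days
los_days = (discharge - admit).total_seconds() / 86400

# ICU-transfer outcome: any ICU care beginning after the 72-hour exposure window
exposure_end = admit + timedelta(hours=72)
icu_after_window = icu_start is not None and icu_start > exposure_end

print(round(los_days, 1), icu_after_window)  # 5.9 True
```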
Covariates and Comorbidity Adjustment
We obtained information on patient demographics (age and race) and identified the presence of comorbid conditions using previously derived and validated models.19,20 We then further quantified these conditions into a single combined score to adjust for differences in presenting illness severity (including kidney disease) and help reduce confounding.21 To account for presenting severity of illness, we calculated the Laboratory-based Acute Physiology Score (LAPS-2).22,23 LAPS-2 was developed for predicting mortality risk in general medical patients, but we previously externally validated it against other published clinical HF models in a cohort of patients hospitalized with acute decompensated HF.5
Potassium Repletion
Analysis
We evaluated the differences in patient characteristics across serum potassium categories. Categorical variables are presented as frequencies and percentages, whereas continuous variables are presented as means and standard deviations. For binary outcomes, we used generalized estimating equations (with a binomial family and logit link and clustering by hospital) to estimate incidence and calculate unadjusted and adjusted odds ratios (ORs) and 95% confidence intervals (CIs). For LOS, we estimated the median and 95% CIs using quantile regression with clustered standard errors.24 We calculated all models using both a binary exposure (<4.0 versus ≥4.0 mEq/L) and a three-level categorization (<4.0, 4.0-4.5, and >4.5 mEq/L) to explore the effects at the highest potassium level. We adjusted all models for age, race, LAPS-2 score, and combined comorbidity score. We conducted two sensitivity analyses. First, we restricted our sample to those who never received potassium during the exposure window, as these patients may differ from patients who required potassium repletion. Second, we stratified our findings by the presence or absence of acute or chronic renal insufficiency (defined as an admission creatinine >1 mg/dL or the presence of a diagnostic code for renal insufficiency, as defined by Elixhauser et al.).19,21 Statistical significance was set at an alpha of 0.05. Analysis was completed using Stata v15.1 (StataCorp LP, College Station, Texas).
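The analysis itself was run in Stata; the sketch below is a loose Python analogue of the two model types described above (a GEE with a binomial family, logit link, and hospital-level clustering for binary outcomes, and median regression for LOS), fit to simulated data. The data frame and its column names are hypothetical, and the quantile regression here omits the cluster-robust standard errors of Parente and Santos Silva, so treat it as an approximation of the approach rather than the authors' implementation.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Simulated stand-in data; column names are hypothetical
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "died": rng.integers(0, 2, n),
    "los_days": rng.gamma(3.0, 2.0, n),
    "k_group": rng.choice(["<4.0", "4.0-4.5", ">4.5"], n),
    "age": rng.normal(75, 10, n),
    "race": rng.choice(["white", "black", "other"], n),
    "laps2": rng.normal(60, 20, n),
    "comorbidity": rng.normal(2, 1, n),
    "hospital_id": rng.integers(1, 57, n),
})

# Adjusted odds ratios: GEE with binomial family (logit link), exchangeable correlation within hospital
gee = smf.gee(
    "died ~ C(k_group, Treatment(reference='<4.0')) + age + C(race) + laps2 + comorbidity",
    groups="hospital_id", data=df,
    family=sm.families.Binomial(), cov_struct=sm.cov_struct.Exchangeable(),
).fit()
print(np.exp(gee.params))          # odds ratios
print(np.exp(gee.conf_int()))      # 95% CIs

# Median LOS: quantile regression at the 50th percentile (cluster-robust SEs not shown)
qr = smf.quantreg(
    "los_days ~ C(k_group, Treatment(reference='<4.0')) + age + C(race) + laps2 + comorbidity",
    data=df,
).fit(q=0.5)
print(qr.params)
```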
RESULTS
Cohort Description
We identified patients from 56 geographically diverse US hospitals, although most were located in either the northeast (n = 21; 38%) or south (n = 18; 32%). A total of 59% of the hospitals were teaching hospitals, and nearly 95% were in an urban setting. We identified 13,163 patients with HF, of whom 4,995 (38.0%) met the inclusion criteria. We excluded 3,744 (28.4%) patients with LOS < 72 hours, 2,210 (16.8%) with admission potassium values outside of the defined range, and 896 (6.8%) with fewer than three potassium values during the exposure window. Of the patients who met the inclusion criteria, 2,080 (41.6%), 2,326 (46.6%), and 589 (11.8%) were categorized in the <4.0, 4.0-4.5, and >4.5 mEq/L groups, respectively (Table 1). The groups were clinically similar in terms of age, sex, illness severity (LAPS-2), and comorbidity score. Compared with other racial groups, black patients had higher potassium values. While the <4.0 and 4.0-4.5 mEq/L groups were relatively similar, the group with mean potassium >4.5 mEq/L had higher admission creatinine and a greater prevalence of chronic kidney disease, deficiency anemias, and chronic obstructive pulmonary disease (Table 1).
Serum Potassium Values
Individuals’ mean serum potassium within the 72-hour exposure window ranged from 2.9 to 5.8 mEq/L (Table 2). In the <4.0, 4.0-4.5, and >4.5 mEq/L groups, respectively, patients had a median serum potassium of 3.8 mEq/L (range, 2.9-3.9), 4.2 mEq/L (range, 4.0-4.5), and 4.7 mEq/L (range, 4.5-5.8) during the exposure window. Approximately half of the patients in the <4.0 mEq/L group had a serum potassium <3.5 mEq/L at some point during the exposure window. In contrast, <10% of patients in the other groups had a value this low during the exposure window.
Potassium Repletion
Patients in the <4.0 mEq/L group were much more likely to receive potassium repletion during the exposure window than those in the 4.0-4.5 mEq/L (71.5% vs 40.5%) and >4.5 mEq/L (71.5% vs 26.7%) groups. On days that they were eligible for repletion (defined as a daily potassium value <4.0 mEq/L), patients with mean serum potassium >4.0 mEq/L were less likely to receive potassium repletion compared with those with values <4.0 mEq/L. There were 592 (28.5%), 1,383 (59.5%), and 432 (73.3%) patients in the <4.0, 4.0-4.5, and >4.5 mEq/L groups, respectively, who did not receive potassium repletion therapy during the exposure window.
Relationship of Serum Potassium Levels and Outcomes
Overall, 3.7% (n = 187) of patients died during the hospitalization, 2.4% (n = 98) were admitted to the ICU after the exposure window, and the median LOS was 5.6 days. We did not observe a significant association between mean serum potassium of <4.0 or 4.0-4.5 mEq/L and increased risk of mortality, ICU transfer, or LOS (Table 3). Our unadjusted analysis showed that patients with values >4.5 mEq/L had worse outcomes, including more deaths (5.3%; OR = 1.55; 95% CI: 1.01 to 2.39) and ICU admission (3.8%; OR = 2.10; 95% CI: 1.16 to 3.80) compared with those with values <4.0 mEq/L (Table 3). We also found that, compared with the <4.0 mEq/L group, the >4.5 mEq/L group showed just over a half-day longer LOS (0.6 days; 95% CI: 0.0 to 1.0; Table 3). However, we found that mortality and ICU admission results were attenuated after adjustment for age, race, comorbidity score, and LAPS-2 and were no longer statistically significant, whereas the association with LOS was consistent after adjustment. When using a binary exposure (<4.0 versus ≥4.0 mEq/L), we observed no association between mean potassium value and increased risk of mortality, ICU transfer, or LOS both before and after adjustment for age, race, LAPS-2, and comorbidity score (data not shown).
Sensitivity Analyses
In the sensitivity analysis restricted to those who did not receive potassium repletion during the exposure window, we continued to observe no association between the <4.0 and 4.0-4.5 mEq/L groups and outcomes (Table 3). In adjusted models for the >4.5 versus <4.0 mEq/L groups, risk estimates for mortality were similar to the full sample, but statistical significance was lost (OR = 1.56; 95% CI: 0.81 to 3.01). Adjusted risk estimates for ICU transfer were attenuated and not statistically significant (OR = 1.40; 95% CI: 0.60 to 3.26). However, LOS estimates were very similar to that observed in the full dataset (0.6 days; 95% CI: 0.1 to 1.2).
When stratifying our results by the presence or absence of acute or chronic renal insufficiency, we continued to observe no increased risk of any outcome in the 4.0-4.5 mEq/L compared with the <4.0 mEq/L groups across all strata (Table 4). Interestingly, even after adjustment, we did find that most of the increased risk of mortality and ICU admission in the >4.5 versus <4.0 mEq/L groups was among those without renal insufficiency (mortality OR = 3.03; ICU admission OR = 3.00) and was not statistically significant in those with renal insufficiency (mortality OR = 1.27; ICU admission OR = 1.63). Adjusted LOS estimates remained relatively similar in this stratified analysis.
DISCUSSION
The best approach to mild serum potassium value abnormalities in patients hospitalized with HF remains unclear. Many physicians reflexively replete potassium to ensure all patients maintain a serum value of >4.0 mEq/L.15 Yet, in this large observational study of patients hospitalized with an acute HF exacerbation, we found little evidence of association between serum potassium <4.0 mEq/L and negative outcomes.
In unadjusted models, mean potassium values >4.5 mEq/L were associated with an increased risk of mortality and ICU transfer compared with values <4.0 mEq/L. This association was attenuated after adjustment, suggesting that factors beyond potassium values influenced the observed relationship. These findings suggest that the observed differences in outcomes were driven not by average potassium value but by other characteristics of the >4.5 mEq/L group; the differences we could observe (eg, greater presenting severity and higher comorbidity scores) make it likely that unobserved differences existed as well. However, we cannot rule out the possibility that potassium >4.5 mEq/L has some associated increased risk compared with mean potassium values of <4.0 mEq/L for patients hospitalized with acute decompensated HF.
Patients in our study routinely received exogenous potassium: more than 70% of patients in the <4.0 mEq/L group received repletion at least once, although it is notable that the majority of patients in the 4.0-4.5 and >4.5 mEq/L groups did not receive repletion. Despite this practice, the data supporting this approach to potassium management for patients hospitalized with HF remain mixed. A serum potassium decline of >15% during an acute HF hospital stay has been reported as a predictor of all-cause mortality after controlling for disease severity and associated comorbidities, including renal function.25 However, that study focused on the decline from the admission potassium value rather than an absolute cut-off (eg, >4.0 mEq/L). Additionally, potassium levels <3.9 mEq/L were associated with increased mortality in patients with acute HF following a myocardial infarction, but that study was not focused on patients with HF.26 Most of the prior literature on potassium in HF was conducted in outpatient settings and examined patients who were not experiencing acute exacerbations. Macdonald and Struthers advocated that patients with HF have their potassium maintained above 4.0 mEq/L but did not specify whether this recommendation included patients with acute HF exacerbations.10 Additionally, many studies evaluating potassium repletion were conducted before the widespread availability of angiotensin-converting enzyme (ACE) inhibitors or potassium-sparing diuretics, including spironolactone. Prior work has consistently reported that hyperkalemia, defined as serum potassium >4.5 mEq/L, is associated with mortality in patients with acute HF over the course of hospitalization (which aligns with the results of our sensitivity analysis), but concurrent medication regimens and underlying impaired renal function likely accounted for most of this association.17 The picture is further complicated because patients with acute HF presenting with hypokalemia may be at risk for subsequent hyperkalemia, and potassium repletion can stimulate aldosterone secretion, potentially exacerbating underlying HF.27,28
These data are observational and, on their own, are unlikely to change practice. However, daily potassium repletion represents a substantial cost in time, money, and effort to the health system. The greatest burden falls on patients, who have labs drawn and values checked repeatedly and potassium administered orally or parenterally. Randomized clinical trials (RCTs) would best define the benefits of repletion; in their absence, pragmatic trials could attempt to disentangle the risks and benefits of potassium repletion. Such studies could also better account for concurrent medication use (such as ACE inhibitors or angiotensin II receptor blockers) and assess the role of chronic renal insufficiency, acute kidney injury, and magnesium levels.29
This study has limitations. Its retrospective design raises the possibility of unmeasured confounding; however, we adjusted for multiple variables (including LAPS-2) that reflect the severity of disease and underlying kidney function at presentation, as well as other comorbid conditions. In addition, data from the cohort extend only to 2012, so more recent changes in practice may not be fully reflected. The nature of the data did not allow us to directly investigate the relationship between serum potassium and arrhythmias, although ICU transfer and mortality served as surrogates.
In conclusion, the benefit of a serum potassium level >4.0 mEq/L in patients admitted with HF remains unclear. We did not observe that mean potassium values <4.0 mEq/L were associated with worse outcomes, and, more concerning, there may be some risk for patients with mean values >4.5 mEq/L.
Acknowledgments
Dr. Lagu had full access to all the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.
Disclosures
The authors report no potential conflicts of interest. Dr. Lagu has served as a consultant for the Yale Center for Outcomes Research and Evaluation, under contract to the Centers for Medicare and Medicaid Services, for which she has provided clinical and methodological expertise and input on the development, reevaluation, and implementation of hospital outcome and efficiency measures.
Funding
Dr. Lagu is supported by the National Heart, Lung, and Blood Institute of the National Institutes of Health under Award Number K01HL114745 and R01 HL139985-01A1. Dr. Stefan is supported by the National Heart, Lung, and Blood Institute of the National Institutes of Health under Award Number K01HL114631-01A1. Dr. Pack is supported by NHLBI 1K23HL135440. Dr. Lindenauer is supported by the National Heart, Lung, and Blood Institute of the National Institutes of Health under Award Number 1K24HL132008.
Disclaimer
The views expressed in this manuscript do not necessarily reflect those of the Yale Center for Outcomes Research and Evaluation or the Centers for Medicare and Medicaid Services.
1. Benjamin EJ, Virani SS, Callaway CW, et al. Heart disease and stroke statistics–2018 update: a report from the American Heart Association. Circulation. 2018;137(12):e67-e492. https://doi.org/10.1161/CIR.0000000000000558.
2. Maggioni AP, Dahlström U, Filippatos G, et al. EURObservational Research Programme: regional differences and 1-year follow-up results of the Heart Failure Pilot Survey (ESC-HF Pilot). Eur J Heart Fail. 2013;15(7):808-817. https://doi.org/10.1093/eurjhf/hft050.
3. Tomaselli GF, Zipes DP. What causes sudden death in heart failure? Circ Res. 2004;95(8):754-763. https://doi.org/10.1161/01.RES.0000145047.
4. Bowen GS, Diop MS, Jiang L, Wu W-C, Rudolph JL. A multivariable prediction model for mortality in individuals admitted for heart failure. J Am Geriatr Soc. 2018;66(5):902-908. https://doi.org/10.1111/jgs.15319.
5. Lagu T, Pekow PS, Shieh M-S, et al. Validation and comparison of seven mortality prediction models for hospitalized patients with acute decompensated heart failure. Circ Heart Fail. 2016;9(8). https://doi.org/10.1161/CIRCHEARTFAILURE.115.002912.
6. Núñez J, Bayés-Genís A, Zannad F, et al. Long-term potassium monitoring and dynamics in heart failure and risk of mortality. Circulation. 2018;137(13):1320-1330. https://doi.org/10.1161/CIRCULATIONAHA.117.030576.
7. Vardeny O, Claggett B, Anand I, et al. Incidence, predictors, and outcomes related to hypo- and hyperkalemia in patients with severe heart failure treated with a mineralocorticoid receptor antagonist. Circ Heart Fail. 2014;7(4):573-579. https://doi.org/10.1161/CIRCHEARTFAILURE.114.00110.
8. Aldahl M, Jensen A-SC, Davidsen L, et al. Associations of serum potassium levels with mortality in chronic heart failure patients. Eur Heart J. 2017;38(38):2890-2896. https://doi.org/10.1093/eurheartj/ehx460.
9. Hoppe LK, Muhlack DC, Koenig W, Carr PR, Brenner H, Schöttker B. Association of abnormal serum potassium levels with arrhythmias and cardiovascular mortality: a systematic review and meta-analysis of observational studies. Cardiovasc Drugs Ther. 2018;32(2):197-212. https://doi.org/10.1007/s10557-018-6783-0.
10. Macdonald JE, Struthers AD. What is the optimal serum potassium level in cardiovascular patients? J Am Coll Cardiol. 2004;43(2):155-161. https://doi.org/10.1016/j.jacc.2003.06.021.
11. Hulting J. In-hospital ventricular fibrillation and its relation to serum potassium. Acta Med Scand Suppl. 1981;647(647):109-116. https://doi.org/10.1111/j.0954-6820.1981.tb02646.x.
12. Skogestad J, Aronsen JM. Hypokalemia-induced arrhythmias and heart failure: new insights and implications for therapy. Front Physiol. 2018;9:1500. https://doi.org/10.3389/fphys.2018.01500.
13. Tromp J, Ter Maaten JM, Damman K, et al. Serum potassium levels and outcome in acute heart failure (data from the PROTECT and COACH trials). Am J Cardiol. 2017;119(2):290-296. https://doi.org/10.1016/j.amjcard.2016.09.038.
14. Khan SS, Campia U, Chioncel O, et al. Changes in serum potassium levels during hospitalization in patients with worsening heart failure and reduced ejection fraction (from the EVEREST trial). Am J Cardiol. 2015;115(6):790-796. https://doi.org/10.1016/j.amjcard.2014.12.045.
15. Viera AJ, Wouk N. Potassium disorders: hypokalemia and hyperkalemia. Am Fam Physician. 2015;92(6):487-495.
16. Krumholz HM, Wang Y, Mattera JA, et al. An administrative claims model suitable for profiling hospital performance based on 30-day mortality rates among patients with heart failure. Circulation. 2006;113(13):1693-1701. https://doi.org/10.1161/CIRCULATIONAHA.105.611194.
17. Legrand M, Ludes P-O, Massy Z, et al. Association between hypo- and hyperkalemia and outcome in acute heart failure patients: the role of medications. Clin Res Cardiol. 2018;107(3):214-221. https://doi.org/10.1007/s00392-017-1173-3.
18. Kok W, Salah K, Stienen S. Are changes in serum potassium levels during admissions for acute decompensated heart failure irrelevant for prognosis: the end of the story? Am J Cardiol. 2015;116(5):825. https://doi.org/10.1016/j.amjcard.2015.05.059.
19. Elixhauser A, Steiner C, Harris DR, Coffey RM. Comorbidity measures for use with administrative data. Med Care. 1998;36(1):8-27. https://doi.org/10.1097/00005650-199801000-00004.
20. Quan H, Parsons GA, Ghali WA. Validity of information on comorbidity derived from ICD-9-CCM administrative data. Med Care. 2002;40(8):675-685. https://doi.org/10.1097/01.MLR.0000020927.46398.5D.
21. Gagne JJ, Glynn RJ, Avorn J, Levin R, Schneeweiss S. A combined comorbidity score predicted mortality in elderly patients better than existing scores. J Clin Epidemiol. 2011;64(7):749-759. https://doi.org/10.1016/j.jclinepi.2010.10.004.
22. Escobar GJ, Gardner MN, Greene JD, Draper D, Kipnis P. Risk-adjusting hospital mortality using a comprehensive electronic record in an integrated health care delivery system. Med Care. 2013;51(5):446-453. https://doi.org/10.1097/MLR.0b013e3182881c8e.
23. Escobar GJ, Greene JD, Scheirer P, Gardner MN, Draper D, Kipnis P. Risk-adjusting hospital inpatient mortality using automated inpatient, outpatient, and laboratory databases. Med Care. 2008;46(3):232-239. https://doi.org/10.1097/MLR.0b013e3181589bb6.
24. Parente PMDC, Santos Silva JMC. Quantile regression with clustered data. J Econom Method. 2016;5(1):1-15. https://doi.org/10.1515/jem-2014-0011.
25. Salah K, Pinto YM, Eurlings LW, et al. Serum potassium decline during hospitalization for acute decompensated heart failure is a predictor of 6-month mortality, independent of N-terminal pro-B-type natriuretic peptide levels: An individual patient data analysis. Am Heart J. 2015;170(3):531-542.e1. https://doi.org/10.1016/j.ahj.2015.06.003.
26. Krogager ML, Eggers-Kaas L, Aasbjerg K, et al. Short-term mortality risk of serum potassium levels in acute heart failure following myocardial infarction. Eur Heart J Cardiovasc Pharmacother. 2015;1(4):245-251. https://doi.org/10.1093/ehjcvp/pvv026.
27. Crop MJ, Hoorn EJ, Lindemans J, Zietse R. Hypokalaemia and subsequent hyperkalaemia in hospitalized patients. Nephrol Dial Transplant. 2007;22(12):3471-3477. https://doi.org/10.1093/ndt/gfm471.
28. Kok W, Salah K, Stienen S. Serum potassium levels during admissions for acute decompensated heart failure: identifying possible threats to outcome. Am J Cardiol. 2018;121(1):141. https://doi.org/10.1016/j.amjcard.2017.09.032.
29. Freda BJ, Knee AB, Braden GL, Visintainer PF, Thakar CV. Effect of transient and sustained acute kidney injury on readmissions in acute decompensated heart failure. Am J Cardiol. 2017;119(11):1809-1814. https://doi.org/10.1016/j.amjcard.2017.02.044.
© 2019 Society of Hospital Medicine
Collaboration, Not Calculation: A Qualitative Study of How Hospital Executives Value Hospital Medicine Groups
The field of hospital medicine has expanded rapidly since its inception in the late 1990s, and currently, most hospitals in the United States employ or contract with hospital medicine groups (HMGs).1-4 This dramatic growth began in response to several factors: primary care physicians (PCPs) opting out of inpatient care, the increasing acuity and complexity of inpatient care, and cost pressures on hospitals.5,6 Recent studies associate greater use of hospitalists with increased hospital revenues and modest improvements in hospital financial performance.7 However, funding the hospitalist delivery model required hospitals to share the savings hospitalists generate through facility billing and quality incentives.
Hospitalists’ professional fee revenues alone generally do not fund their salaries. An average HMG serving adult patients requires $176,658 from the hospital to support a full-time physician.8 Determining the appropriate level of HMG support typically occurs through negotiation with hospital executives. During the last 10 years, HMG size and hospitalist compensation have risen steadily, combining to increase the hospitalist labor costs borne by hospitals.4,8 Accordingly, hospital executives in challenging economic environments may pressure HMG leaders to accept diminished support or to demonstrate a better return on the hospital’s investment.
These negotiations are influenced by the beliefs of hospital executives about the value of the hospitalist labor model. Little is known about how hospital and health system executive leadership assess the value of hospitalists. A deeper understanding of executive attitudes and beliefs could inform HMG leaders seeking integrative (“win-win”) outcomes in contract and compensation negotiations. Members of the Society of Hospital Medicine (SHM) Practice Management Committee surveyed hospital executives to guide SHM program development. We sought to analyze transcripts from these interviews to describe how executives assess HMGs and to test the hypothesis that hospital executives apply specific financial models when determining the return on investment (ROI) from subsidizing an HMG.
METHODS
Study Design, Setting, and Participants
Members of the SHM Practice Management Committee conducted interviews with a convenience sample of 24 key informants representing the following stakeholders at hospitals employing hospitalists: Chief Executive Officers (CEOs), Presidents, Vice Presidents, Chief Medical Officers (CMOs), and Chief Financial Officers (CFOs). Participants were recruited from 17 fee-for-service healthcare organizations, including rural, suburban, urban, community, and academic medical centers. The semi-structured interviews, designed to guide SHM program and product development, occurred in person between January and March 2018 and lasted an average of 45 minutes each. Twenty-eight executives were recruited by e-mail; four did not complete the interview because of scheduling difficulties. All the participants provided informed consent. The University of Washington Institutional Review Board approved the secondary analysis of deidentified transcripts.
Interview Guide and Data Collection
All interviews followed a guide with eight demographic questions and 10 open-ended questions (Appendix). Cognitive interviews were performed with two hospital executives outside the study cohort, resulting in the addition of one question and the rewording of another for clarity. One-on-one interviews were performed by 10 committee members (range, 1-3 interviews each). All interviews were audio-recorded, and no field notes were kept. The goal of the interviews was to understand how hospital executives value the contributions and costs of hospitalist groups.
The interviews began with questions about the informant’s current interactions with hospitalists and the origin of the hospitalist group at their facility. Informants then described the value they feel hospitalists bring to their hospital and occasions when they were surprised by or dissatisfied with the clinical or financial value delivered by the hospitalists. Participants also described how they calculate ROI for their hospitalist group, the nonfinancial benefits and disadvantages of hospitalists, and how they believe hospitalists should participate in risk-sharing contracts.
Data Analysis
The interview audiotapes were transcribed and deidentified. A sample of eight transcripts was verified by participants to ensure accuracy. Three investigators (AAW, RC, CC) reviewed a random sample of five transcripts to identify and codify preliminary themes. We applied a general inductive framework with a content analysis approach. Two investigators (TM and MC) read all transcripts independently, coding the presence of each theme and quotations exemplifying these themes using qualitative analysis software (Dedoose Version 7.0.23, SocioCultural Research Consultants). A third investigator (AAW) read all the transcripts and resolved differences of opinion. Themes and code application were discussed among the study team after the second and fifth transcripts to add or clarify codes. No new codes were identified after the first review of the preliminary codebook, although investigators intermittently used an “unknown” code through the 20th transcript. After discussion to reach consensus, excerpts initially coded “unknown” were assigned existing codes; the 20th transcript represents the approximate point of reaching thematic saturation.
RESULTS
Of the 24 participants, 18 (75%) were male, and they represented a variety of roles: 7 (29.2%) CMOs, 5 (20.8%) Presidents, 5 (20.8%) CFOs, 4 (16.7%) CEOs, and 3 (12.5%) Vice Presidents. The participants represented all regions (Midwest, 12 [50%]; South, 6 [25%]; West, 4 [16.7%]; East, 2 [8.3%]), community sizes (urban, 11 [45.8%]; suburban, 8 [33.3%]; rural, 5 [20.8%]), and hospital types (community, 11 [45.8%]; multihospital system, 5 [20.8%]; academic, 5 [20.8%]; safety net, 2 [8.3%]; critical access, 1 [4.2%]). We present specific themes below and supporting quotations in Tables 1 and 2.
Current Value of the HMG at the Respondent’s Hospital
Most executives reported that their hospital’s HMG had operated for over a decade and that the value framework developed at its inception was now outdated. Interviewees described an initial mix of financial pressures, shifts in physician work preferences, increasing patient acuity, resident labor shortages, and unsolved hospital throughput needs that triggered a reactive conversion from community PCP staffing to hospitalist care teams, followed by refinements to realize value.
“I think initially here it was to deal with the resident caps, right? So, at that moment, the solution that was put in place probably made a lot of sense. If that’s all someone came in with, now I’d be scratching my head and said, what are you thinking?” (President, #2)
Respondents perceived that HMGs provide value in many domains, including financial contributions, high-quality care, organizational efficiency, academics, leadership of interprofessional teams, effective communication, system improvement, and beneficial influence on the care environment and other employees. Regarding the measurable generation of financial benefit, documentation for improved billing accuracy, increased hospital efficiency (eg, lower length of stay, early discharges), and comanagement arrangements were commonly identified.
“I don’t want a urologist with a stethoscope, so I’m happy to have the hospitalists say, ‘Look, I’ll take care of the patient. You do the procedure.’ Well, that’s inherently valuable, whether we measure it or whether we don’t.” (CMO, #21)
Executives generally expressed satisfaction with their HMG’s quality of care and the related pay-for-performance financial benefits from payers, attributing success to hospitalists’ familiarity with inpatient systems and willingness to standardize.
“I just think it’s having one structure, one group to go to, a standard rather than trying to push it through the medical staff.” (VP, #18)
Executives reported that HMGs generate substantial value that is difficult to measure financially. For example, a large share of excerpts centered on communication with patients, nurses, and other providers.
“If we have the right hospitalist staff, to engage them with the nursing staff would help to reduce my turnover rate…and create a very positive morale within the nursing units. That’s huge. That’s nonfinancial” (President, #15)
Executives particularly appreciated hospitalists’ work to aggregate input from multiple specialists and present a cohesive explanation to patients. Executives also felt that HMGs create significant unmeasured value by improving processes and outcomes on service lines beyond hospital medicine, achieving this through culture change, involvement in leadership, hospital-wide process redesign, and running rapid response teams. Some executives expressed a desire for hospitalists to assume this global quality responsibility more explicitly as a job expectation.
Executives described how they would evaluate a de novo proposal for hospitalist services, usually enumerating key general domains without explaining specifically how they would measure each element. The following priorities emerged: clinical excellence, capacity to collaborate with hospital leadership, the scope of services provided, cultural fit/alignment, financial performance, contract cost, pay-for-performance measures, and turnover. Regarding financial performance, respondents expected to know the cost of the proposal but lacked a specific price threshold. Instead, they sought to understand the total value of the proposal through its effect on metrics such as facility fees or resource use. Nonetheless, cultural fit was a critical, overriding driver of the hypothetical decision, despite difficulty defining beyond estimates of teamwork, alignment with hospital priorities, and qualities of the group leader.
“For us, it usually ends being how do we mix personally, do we like them?” (CMO, #5)
Alignment and Collaboration
The related concepts of “collaboration” and “alignment” emerged as prominent themes during all interviews. Executives highly valued hospitalist groups that could demonstrate alignment with hospital priorities and often used this concept to summarize the HMG’s success or failure across a group of value domains.
“If you’re just coming in to fill a shift and see 10 patients, you have much less value than somebody who’s going to play that active partnership role… hospitalist services need to partner with hospitals and be intimately involved with the success of the hospital.” (CMO, #20)
Alignment sometimes manifested in a quantified, explicit way, through incentive plans or shared savings plans. However, it most often manifested as a broader sense that the hospitalists’ work targeted the same priorities as the executive leaders and that hospitalists genuinely cared about those priorities. A “shift-work mentality” was expressed by some as the antithesis of alignment. Incorporating hospitalist leaders in hospital leadership and frequent communication arose as mechanisms to increase alignment.
Ways HMGs Fail to Meet Expectations
Respondents described unresolved disadvantages to the hospitalist care model.
“I mean, OPPE, how do you do that for a hospitalist? How can you do it? It’s hard to attribute a patient to someone….it is a weakness and I think we all know it.” (CMO, #21)
Executives also worried about inconsistent handoffs with primary care providers and the field’s demographics, finding it disproportionately comprised of junior or transient physicians. They also hoped that hospitalist innovators would solve clinician burnout and the high cost of inpatient care. Disappointments specific to the local HMG revolved around difficulty developing shared models of value and mechanisms to achieve them.
“I would like to have more dialog between the hospital leadership team and the hospitalist group…I would like to see a little bit more collaboration.” (President, #13)
These challenges emerged not as a deficiency with hospital medicine as a specialty, but a failure at their specific facility to achieve the goal of alignment through joint strategic planning.
Calculating Value
When asked if their hospital had a formal process to evaluate ROI for their HMG, two dominant answers emerged: (1) the executive lacked a formal process for determining ROI and was unaware of one used at their facility or (2) the executive evaluated HMG performance based on multiple measures, including cost, but did not attempt to calculate ROI or a summary value. Several described the financial evaluation process as too difficult or unnecessary.
“No. It’s too difficult to extract that data. I would say the best proxy that we could do it is our case mix index on our medicine service line.” (CMO, #20)
“No, not a formal process, no… I question the value of some of the other things we do with the medical group…but not the value of the hospitalists… I don’t think we’ve done a formal assessment. I appreciate the flexibility, especially in a small hospital.” (President, #10)
Rarely, executives described specific financial calculations that served as a proxy for ROI. These included calculating a contribution margin to compare against the cost of salary support or the application of external survey benchmarking comparisons for productivity and salary to evaluate the appropriateness of a limited set of financial indicators. Twice respondents alluded to more sophisticated measurements conducted by the finance department but lacked familiarity with the process. Several executives described ROI calculations for specific projects and discrete business decisions involving hospitalists, particularly considering hiring an additional hospitalist.
Executives generally struggled to recall specific ways that the nonfinancial contributions of hospitalists were incorporated into executive decisions regarding the hospitalist group. Two related themes emerged: first, the belief that hospitals could not function effectively without hospitalists, making their presence an expected cost of doing business. Second, absent measures of HMG ROI, executives appeared to determine an approximate overall value of hospitalists, rather than parsing the various contributions. A few respondents expressed alarm at the rise in hospitalist salaries, whereas others acknowledged market forces beyond their control.
“… there is going to be more of a demand for hospitalists, which is definitely going to drive up the compensation. So, I don’t worry that the compensation will be driven up so high that there won’t be a return [on investment].” (CFO, #16)
Some urged individual hospitalists to develop a deeper understanding of what supports their salary to avoid strained relationships with executives.
Evolution and Risk-Sharing Contracts
Respondents described an evolving conceptualization of the hospitalist’s value, occurring at both a broad, long-term scale and at an incremental, annual scale through minor modifications to incentive pay schemes. For most executives, hiring hospitalists as replacements for PCPs had become necessary and not a source of novel value; many executives described it as “the cost of doing business.” Some described gradually deemphasizing relative value unit (RVU) production to recognize other contributions. Several reported their general appreciation of hospitalists evolved as specific hospitalists matured and demonstrated new contributions to hospital function. Some leaders tried to speculate about future phases of this evolution, although details were sparse.
Among respondents with greater implementation of risk-sharing contracts or ACOs, executives did not describe significantly different goals for hospitalists; executives emphasized that hospitalists should accelerate existing efforts to reduce inpatient costs, length of stay, healthcare-acquired conditions, unnecessary testing, and readmissions. A theme emerged around hospitalists supporting the continuum of care, through improved communication and increased alignment with health systems.
“Where I see the real benefit…is to figure out a way to use hospitalists and match them up with the primary care physicians on the outpatient side to truly develop an integrated population-based medicine practice for all our patients.” (President, #15)
Executives believed that communication and collaboration with PCPs and postacute care providers would attract more measurement.
DISCUSSION
Our findings provide hospitalists with insight into the approach hospital executives may follow when determining the rationale for and extent of financial support for HMGs. The results did not support our hypothesis that executives commonly determine the appropriate support by summing detailed quantitative models for various HMG contributions. Instead, most hospital executives appear to make decisions about the appropriateness of financial support based on a small number of basic financial or care quality metrics combined with a subjective assessment of the HMG’s broader alignment with hospital priorities. However, we did find substantial evidence that hospital executives’ expectations of hospitalists have evolved in the last decade, creating the potential for dissociation from how hospitalists prioritize and value their own efforts. Together, our findings suggest that enhanced communication, relationship building, and collaboration with hospital leaders may help HMGs to maintain a shared model of value with hospital executives.
The general absence of summary value calculations suggests specific opportunities, benefits, and risks for HMG group leaders (Table 3). An important opportunity relates to the communication agenda about unmeasured or nonfinancial contributions. Although executives recognized many of these, our data suggest a need for HMG leaders to educate hospital leaders about their unmeasured contributions proactively. Although some might recommend doing so by quantifying and financially rewarding key intangible contributions (eg, measuring leadership in culture change9), this entails important risks.10 Some experts propose that the proliferation of physician pay-for-performance schemes threatens medical professionalism, fails patients, and misunderstands what motivates physicians.11 HMG groups that feel undervalued should hesitate before monetizing all aspects of their work, and consider emphasizing relationship-building as a platform for communication about their performance. Achieving better alignment with executives is not just an opportunity for HMG leaders, but for each hospitalist within the group. Although executives wanted to have deeper relationships with group members, this may not be feasible in large organizations. Instead, it is incumbent for HMG leaders to translate executives’ expectations and forge better alignment.
Residency may not adequately prepare hospitalists to meet key expectations hospital executives hold related to system leadership and interprofessional team leadership. For example, hospital leaders particularly valued HMGs’ perceived ability to improve nurse retention and morale. Unfortunately, residency curricula generally lack concerted instruction on the skills required to produce such interprofessional inpatient teams reliably. Similarly, executives strongly wanted HMGs to acknowledge a role as partners in running the quality, stewardship, and safety missions of the entire hospital. While residency training builds clinical competence through the care of individual patients, many residents do not receive experiential education in system design and leadership. This suggests a need for HMGs to provide early career training or mentorship in quality improvement and interprofessional teamwork. Executives and HMG leaders seeking a stable, mature workforce, should allocate resources to retaining mid and late career hospitalists through leadership roles or financial incentives for longevity.
As with many qualitative studies, the generalizability of our findings may be limited, particularly outside the US healthcare system. We invited executives from diverse practice settings but may not have captured all the relevant viewpoints. This study did not include Veterans Affairs hospitals, safety net hospitals were underrepresented, Midwestern hospitals were overrepresented and the participants were predominantly male. We were unable to determine the influence of employment model on participant beliefs about HMGs, nor did we elicit comparisons to other physician specialties that would highlight a distinct approach to negotiating with HMGs. Because we used hospitalists as interviewers, including some from the same institution as the interviewee, respondents may have dampened critiques or descriptions of unmet expectations. Our data do not provide quantitative support for any approach to determining or negotiating appropriate financial support for an HMG.
CONCLUSIONS
This work contributes new understanding of the expectations executives have for HMGs and individual hospitalists. This highlights opportunities for group leaders, hospitalists, medical educators, and quality improvement experts to produce a hospitalist labor force that can engage in productive and mutually satisfying relationships with hospital leaders. Hospitalists should strive to improve alignment and communication with executive groups.
Disclosures
The authors report no potential conflict of interest.
1. Lapps J, Flansbaum B, Leykum L, et al. Updating threshold-based identification of hospitalists in 2012 Medicare pay data. J Hosp Med. 2016;11(1):45-47. https://doi.org/10.1002/jhm.2480.
2. Wachter RM, Goldman L. Zero to 50,000–the 20th Anniversary of the hospitalist. NEJM. 2016;375(11):1009-1011. https://doi.org/10.1056/nejmp1607958.
3. Stevens JP, Nyweide DJ, Maresh S, et al. Comparison of hospital resource use and outcomes among hospitalists, primary care physicians, and other generalists. JAMA Intern Med. 2017;177(12):1781-1787. https://doi.org/10.1001/jamainternmed.2017.5824.
4. American Hospital Association (AHA) (2017), Hospital Statistics, AHA, Chicago, IL.
5. Wachter RM, Goldman L. The emerging role of “hospitalists” in the American health care system. NEJM. 1996;335(7):514-517. https://doi.org/10.1093/ajhp/53.20.2389a.
6. Pham HH, Devers KJ, Kuo S, et al. Health care market trends and the evolution of hospitalist use and roles. J Gen Intern Med. 2005;20(2):101-107. https://doi.org/10.1111/j.1525-1497.2005.40184.x.
7. Epané JP, Weech-Maldonado R, Hearld L, et al. Hospitals’ use of hospitalistas: implications for financial performance. Health Care Manage Rev. 2019;44(1):10-18. https://doi.org/10.1097/hmr.0000000000000170.
8. State of Hospital Medicine: 2018 Report Based on 2017 Data. Society of Hospital Medicine. https://sohm.hospitalmedicine.org/ Accessed December 9, 2018.
9. Carmeli A, Tishler A. The relationships between intangible organizational elements and organizational performance. Strategic Manag J. 2004;25(13):1257-1278. https://doi.org/10.1002/smj.428.
10. Bernard M. Strategic performance management: leveraging and measuring your intangible value drivers. Amsterdam: Butterworth-Heinemann, 2006.
11. Khullar D, Wolfson D, Casalino LP. Professionalism, performance, and the future of physician incentives. JAMA. 2018;320(23):2419-2420. https://doi.org/10.1001/jama.2018.17719.
The field of hospital medicine has expanded rapidly since its inception in the late 1990s, and currently, most hospitals in the United States employ or contract with hospital medicine groups (HMGs).1-4 This dramatic growth began in response to several factors: primary care physicians (PCPs) opting out of inpatient care, the increasing acuity and complexity of inpatient care, and cost pressures on hospitals.5,6 Recent studies associate greater use of hospitalists with increased hospital revenues and modest improvements in hospital financial performance.7 However, funding the hospitalist delivery model has required hospitals to share the savings that hospitalists generate through facility billing and quality incentives.
Hospitalists’ professional fee revenues alone generally do not fund their salaries. An average HMG serving adult patients requires $176,658 from the hospital to support a full-time physician.8 Determining the appropriate level of HMG support typically occurs through negotiation with hospital executives. During the last 10 years, HMG size and hospitalist compensation have risen steadily, combining to increase the hospitalist labor costs borne by hospitals.4,8 Accordingly, hospital executives in challenging economic environments may pressure HMG leaders to accept diminished support or to demonstrate a better return on the hospital’s investment.
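To illustrate the scale of this support, the following minimal sketch (in Python) multiplies the cited per-FTE figure into an annual group subsidy and projects it forward under an assumed growth rate; the group size and growth rate are hypothetical assumptions, not study data.

```python
# Illustrative only: rough arithmetic for the hospital subsidy an HMG requires,
# using the cited SHM median of $176,658 per physician FTE (reference 8).
# The group size and growth rate below are hypothetical, not study data.

SUPPORT_PER_FTE = 176_658  # median hospital support per adult-medicine physician FTE


def annual_subsidy(n_fte: float, support_per_fte: float = SUPPORT_PER_FTE) -> float:
    """Total yearly hospital support for a group of n_fte physicians."""
    return n_fte * support_per_fte


def projected_subsidy(n_fte: float, growth_rate: float, years: int) -> float:
    """Project the subsidy forward assuming compound growth in per-FTE support."""
    return annual_subsidy(n_fte) * (1 + growth_rate) ** years


if __name__ == "__main__":
    # Hypothetical 15-physician group with an assumed 3% annual growth in support.
    print(f"Current subsidy: ${annual_subsidy(15):,.0f}")
    print(f"Subsidy in 5 years at 3% growth: ${projected_subsidy(15, 0.03, 5):,.0f}")
```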
These negotiations are influenced by the beliefs of hospital executives about the value of the hospitalist labor model. Little is known about how hospital and health system executive leadership assess the value of hospitalists. A deeper understanding of executive attitudes and beliefs could inform HMG leaders seeking integrative (“win-win”) outcomes in contract and compensation negotiations. Members of the Society of Hospital Medicine (SHM) Practice Management Committee surveyed hospital executives to guide SHM program development. We sought to analyze transcripts from these interviews to describe how executives assess HMGs and to test the hypothesis that hospital executives apply specific financial models when determining the return on investment (ROI) from subsidizing an HMG.
METHODS
Study Design, Setting, and Participants
Members of the SHM Practice Management Committee conducted interviews with a convenience sample of 24 key informants representing the following stakeholders at hospitals employing hospitalists: Chief Executive Officers (CEOs), Presidents, Vice Presidents, Chief Medical Officers (CMOs), and Chief Financial Officers (CFOs). Participants were recruited from 17 fee-for-service healthcare organizations, including rural, suburban, urban, community, and academic medical centers. The semi-structured interviews occurred in person between January and March 2018; each lasted an average of 45 minutes and was designed to guide SHM program and product development. Twenty-eight executives were recruited by e-mail; four did not complete an interview because of scheduling difficulty. All participants provided informed consent. The University of Washington Institutional Review Board approved the secondary analysis of deidentified transcripts.
Interview Guide and Data Collection
All interviews followed a guide with eight demographic questions and 10 open-ended questions (Appendix). Cognitive interviews were performed with two hospital executives outside the study cohort, resulting in the addition of one question and the rewording of another for clarity. One-on-one interviews were performed by 10 committee members (range, 1-3 interviews per member). All interviews were audio-recorded, and no field notes were kept. The goal of the interviews was to understand how hospital executives value the contributions and costs of hospitalist groups.
The interviews began with questions about the informant’s current interactions with hospitalists and the origin of the hospitalist group at their facility. Informants then described the value they feel hospitalists bring to their hospital and occasions they were surprised or dissatisfied with the clinical or financial value delivered by the hospitalists. Participants described how they calculate a return on investment (ROI) for their hospitalist group, nonfinancial benefits and disadvantages to hospitalists, and how they believe hospitalists should participate in risk-sharing contracts.
Data Analysis
The interview audiotapes were transcribed and deidentified. A sample of eight transcripts was verified by participants to ensure accuracy. Three investigators (AAW, RC, CC) reviewed a random sample of five transcripts to identify and codify preliminary themes. We applied a general inductive framework with a content analysis approach. Two investigators (TM and MC) read all transcripts independently, coding the presence of each theme and quotations exemplifying these themes using qualitative analysis software (Dedoose Version 7.0.23, SocioCultural Research Consultants). A third investigator (AAW) read all the transcripts and resolved differences of opinion. Themes and code application were discussed among the study team after the second and fifth transcripts to add or clarify codes. No new codes were identified after the first review of the preliminary codebook, although investigators intermittently used an “unknown” code through the 20th transcript. After discussion to reach consensus, excerpts initially coded “unknown” were assigned existing codes; the 20th transcript represents the approximate point of reaching thematic saturation.
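As a purely illustrative sketch of the saturation logic described above, and not the study team’s actual workflow (coding was performed in Dedoose), the following Python snippet tallies when each code first appears across transcripts and reports the last transcript to introduce a new code; the example codes and transcripts are invented placeholders.

```python
# Hypothetical sketch of detecting an approximate thematic saturation point:
# the last transcript that contributes at least one code not seen earlier.
from typing import List, Set


def saturation_point(coded_transcripts: List[Set[str]]) -> int:
    """Return the 1-based index of the last transcript that introduced a new code."""
    seen: Set[str] = set()
    last_new = 0
    for i, codes in enumerate(coded_transcripts, start=1):
        if codes - seen:  # any code not observed in earlier transcripts?
            last_new = i
        seen |= codes
    return last_new


if __name__ == "__main__":
    example = [
        {"alignment", "roi"},            # transcript 1
        {"alignment", "nonfinancial"},   # transcript 2 introduces a new code
        {"roi", "alignment"},            # transcript 3 adds nothing new
    ]
    print(f"Approximate saturation after transcript {saturation_point(example)}")
```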
RESULTS
Of the 24 participants, 18 (75%) were male, representing a variety of roles: 7 (29.2%) CMOs, 5 (20.8%) Presidents, 5 (20.8%) CFOs, 4 (16.7%) CEOs, and 3 (12.5%) Vice Presidents. The participants represented all regions (Midwest 12 [50%], South 6 [25%], West 4 [16.7%], and East 2 [8.3%]), community sizes (urban 11 [45.8%], suburban 8 [33.3%], and rural 5 [20.8%]), and hospital types (community 11 [45.8%], multihospital system 5 [20.8%], academic 5 [20.8%], safety net 2 [8.3%], and critical access 1 [4.2%]). We present specific themes below and supporting quotations in Tables 1 and 2.
Current Value of the HMG at the Respondent’s Hospital
Most executives reported their hospital’s HMG had operated for over a decade and had been established under an earlier value framework that is now outdated. Interviewees described an initial mix of financial pressures, shifts in physician work preferences, increasing patient acuity, resident labor shortages, and unsolved hospital throughput needs that triggered a reactive conversion from community PCP staffing to hospitalist care teams, followed by refinements to realize value.
“I think initially here it was to deal with the resident caps, right? So, at that moment, the solution that was put in place probably made a lot of sense. If that’s all someone came in with, now I’d be scratching my head and said, what are you thinking?” (President, #2)
Respondents perceived that HMGs provide value in many domains, including financial contributions, high-quality care, organizational efficiency, academics, leadership of interprofessional teams, effective communication, system improvement, and beneficial influence on the care environment and other employees. Regarding the measurable generation of financial benefit, documentation for improved billing accuracy, increased hospital efficiency (eg, lower length of stay, early discharges), and comanagement arrangements were commonly identified.
“I don’t want a urologist with a stethoscope, so I’m happy to have the hospitalists say, ‘Look, I’ll take care of the patient. You do the procedure.’ Well, that’s inherently valuable, whether we measure it or whether we don’t.” (CMO, #21)
Executives generally expressed satisfaction with their HMG’s quality of care and the related pay-for-performance financial benefits from payers, attributing success to hospitalists’ familiarity with inpatient systems and willingness to standardize.
“I just think it’s having one structure, one group to go to, a standard rather than trying to push it through the medical staff.” (VP, #18)
Executives reported that HMGs generate substantial value that is difficult to measure financially. For example, a large bundle of excerpts was organized around communication with patients, nurses, and other providers.
“If we have the right hospitalist staff, to engage them with the nursing staff would help to reduce my turnover rate…and create a very positive morale within the nursing units. That’s huge. That’s nonfinancial” (President, #15)
Executives particularly appreciated hospitalists’ work to aggregate input from multiple specialists and present a cohesive explanation to patients. Executives also felt that HMGs create significant unmeasured value by improving processes and outcomes on service lines beyond hospital medicine, achieving this through culture change, involvement in leadership, hospital-wide process redesign, and running rapid response teams. Some executives expressed a desire for hospitalists to assume this global quality responsibility more explicitly as a job expectation.
Executives described how they would evaluate a de novo proposal for hospitalist services, usually enumerating key general domains without explaining specifically how they would measure each element. The following priorities emerged: clinical excellence, capacity to collaborate with hospital leadership, the scope of services provided, cultural fit/alignment, financial performance, contract cost, pay-for-performance measures, and turnover. Regarding financial performance, respondents expected to know the cost of the proposal but lacked a specific price threshold. Instead, they sought to understand the total value of the proposal through its effect on metrics such as facility fees or resource use. Nonetheless, cultural fit was a critical, overriding driver of the hypothetical decision, despite the difficulty of defining it beyond estimates of teamwork, alignment with hospital priorities, and the qualities of the group leader.
“For us, it usually ends being how do we mix personally, do we like them?” (CMO, #5)
Alignment and Collaboration
The related concepts of “collaboration” and “alignment” emerged as prominent themes during all interviews. Executives highly valued hospitalist groups that could demonstrate alignment with hospital priorities and often used this concept to summarize the HMG’s success or failure across a group of value domains.
“If you’re just coming in to fill a shift and see 10 patients, you have much less value than somebody who’s going to play that active partnership role… hospitalist services need to partner with hospitals and be intimately involved with the success of the hospital.” (CMO, #20)
Alignment sometimes manifested in a quantified, explicit way, through incentive plans or shared savings plans. However, it most often manifested as a broader sense that the hospitalists’ work targeted the same priorities as the executive leaders and that hospitalists genuinely cared about those priorities. A “shift-work mentality” was expressed by some as the antithesis of alignment. Incorporating hospitalist leaders in hospital leadership and frequent communication arose as mechanisms to increase alignment.
Ways HMGs Fail to Meet Expectations
Respondents described unresolved disadvantages to the hospitalist care model.
“I mean, OPPE, how do you do that for a hospitalist? How can you do it? It’s hard to attribute a patient to someone….it is a weakness and I think we all know it.” (CMO, #21)
Executives also worried about inconsistent handoffs with primary care providers and about the field’s demographics, which they found disproportionately composed of junior or transient physicians. They also hoped that hospitalist innovators would solve clinician burnout and the high cost of inpatient care. Disappointments specific to the local HMG revolved around difficulty developing shared models of value and mechanisms to achieve them.
“I would like to have more dialog between the hospital leadership team and the hospitalist group…I would like to see a little bit more collaboration.” (President, #13)
These challenges emerged not as a deficiency of hospital medicine as a specialty, but as a failure at their specific facility to achieve the goal of alignment through joint strategic planning.
Calculating Value
When asked if their hospital had a formal process to evaluate ROI for their HMG, two dominant answers emerged: (1) the executive lacked a formal process for determining ROI and was unaware of one used at their facility or (2) the executive evaluated HMG performance based on multiple measures, including cost, but did not attempt to calculate ROI or a summary value. Several described the financial evaluation process as too difficult or unnecessary.
“No. It’s too difficult to extract that data. I would say the best proxy that we could do it is our case mix index on our medicine service line.” (CMO, #20)
“No, not a formal process, no… I question the value of some of the other things we do with the medical group…but not the value of the hospitalists… I don’t think we’ve done a formal assessment. I appreciate the flexibility, especially in a small hospital.” (President, #10)
Rarely, executives described specific financial calculations that served as a proxy for ROI. These included calculating a contribution margin to compare against the cost of salary support, or applying external survey benchmarks for productivity and salary to judge the appropriateness of a limited set of financial indicators. Twice, respondents alluded to more sophisticated measurements conducted by the finance department but lacked familiarity with the process. Several executives described ROI calculations for specific projects and discrete business decisions involving hospitalists, particularly when considering whether to hire an additional hospitalist.
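To make the contribution-margin proxy concrete, the following minimal sketch (in Python) compares an attributed service-line margin against the hospital’s salary support; the figures and function names are hypothetical illustrations, not data or methods reported by respondents.

```python
# Illustrative sketch of the contribution-margin proxy a few executives described:
# compare the margin attributed to the hospitalist service against the cost of
# salary support. All figures below are hypothetical; the study reports no numbers.

def roi_proxy(contribution_margin: float, salary_support: float) -> float:
    """Ratio of attributed contribution margin to the hospital's salary support."""
    if salary_support <= 0:
        raise ValueError("salary support must be positive")
    return contribution_margin / salary_support


if __name__ == "__main__":
    # Hypothetical: $4.2M attributed margin against $2.65M in annual support.
    ratio = roi_proxy(4_200_000, 2_650_000)
    print(f"Contribution margin covers {ratio:.2f}x the salary support")
```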
Executives generally struggled to recall specific ways that the nonfinancial contributions of hospitalists were incorporated into executive decisions regarding the hospitalist group. Two related themes emerged: first, the belief that hospitals could not function effectively without hospitalists, making their presence an expected cost of doing business. Second, absent measures of HMG ROI, executives appeared to determine an approximate overall value of hospitalists, rather than parsing the various contributions. A few respondents expressed alarm at the rise in hospitalist salaries, whereas others acknowledged market forces beyond their control.
“… there is going to be more of a demand for hospitalists, which is definitely going to drive up the compensation. So, I don’t worry that the compensation will be driven up so high that there won’t be a return [on investment].” (CFO, #16)
Some urged individual hospitalists to develop a deeper understanding of what supports their salary to avoid strained relationships with executives.
Evolution and Risk-Sharing Contracts
Respondents described an evolving conceptualization of the hospitalist’s value, occurring at both a broad, long-term scale and at an incremental, annual scale through minor modifications to incentive pay schemes. For most executives, hiring hospitalists as replacements for PCPs had become necessary and not a source of novel value; many executives described it as “the cost of doing business.” Some described gradually deemphasizing relative value unit (RVU) production to recognize other contributions. Several reported their general appreciation of hospitalists evolved as specific hospitalists matured and demonstrated new contributions to hospital function. Some leaders tried to speculate about future phases of this evolution, although details were sparse.
Among respondents with greater implementation of risk-sharing contracts or accountable care organizations (ACOs), executives did not describe significantly different goals for hospitalists; executives emphasized that hospitalists should accelerate existing efforts to reduce inpatient costs, length of stay, healthcare-acquired conditions, unnecessary testing, and readmissions. A theme emerged around hospitalists supporting the continuum of care through improved communication and increased alignment with health systems.
“Where I see the real benefit…is to figure out a way to use hospitalists and match them up with the primary care physicians on the outpatient side to truly develop an integrated population-based medicine practice for all our patients.” (President, #15)
Executives believed that communication and collaboration with PCPs and postacute care providers would increasingly become the focus of measurement.
DISCUSSION
Our findings provide hospitalists with insight into the approach hospital executives may follow when determining the rationale for and extent of financial support for HMGs. The results did not support our hypothesis that executives commonly determine the appropriate support by summing detailed quantitative models for various HMG contributions. Instead, most hospital executives appear to make decisions about the appropriateness of financial support based on a small number of basic financial or care quality metrics combined with a subjective assessment of the HMG’s broader alignment with hospital priorities. However, we did find substantial evidence that hospital executives’ expectations of hospitalists have evolved in the last decade, creating the potential for dissociation from how hospitalists prioritize and value their own efforts. Together, our findings suggest that enhanced communication, relationship building, and collaboration with hospital leaders may help HMGs to maintain a shared model of value with hospital executives.
The general absence of summary value calculations suggests specific opportunities, benefits, and risks for HMG leaders (Table 3). An important opportunity relates to the communication agenda about unmeasured or nonfinancial contributions. Although executives recognized many of these, our data suggest a need for HMG leaders to proactively educate hospital leaders about their unmeasured contributions. Although some might recommend doing so by quantifying and financially rewarding key intangible contributions (eg, measuring leadership in culture change9), this entails important risks.10 Some experts propose that the proliferation of physician pay-for-performance schemes threatens medical professionalism, fails patients, and misunderstands what motivates physicians.11 HMGs that feel undervalued should hesitate before monetizing all aspects of their work and consider emphasizing relationship-building as a platform for communication about their performance. Achieving better alignment with executives is an opportunity not just for HMG leaders but for each hospitalist within the group. Although executives wanted to have deeper relationships with group members, this may not be feasible in large organizations. Instead, it is incumbent on HMG leaders to translate executives’ expectations and forge better alignment.
Residency may not adequately prepare hospitalists to meet key expectations hospital executives hold related to system leadership and interprofessional team leadership. For example, hospital leaders particularly valued HMGs’ perceived ability to improve nurse retention and morale. Unfortunately, residency curricula generally lack concerted instruction on the skills required to reliably build such interprofessional inpatient teams. Similarly, executives strongly wanted HMGs to acknowledge a role as partners in running the quality, stewardship, and safety missions of the entire hospital. While residency training builds clinical competence through the care of individual patients, many residents do not receive experiential education in system design and leadership. This suggests a need for HMGs to provide early-career training or mentorship in quality improvement and interprofessional teamwork. Executives and HMG leaders seeking a stable, mature workforce should allocate resources to retaining mid- and late-career hospitalists through leadership roles or financial incentives for longevity.
As with many qualitative studies, the generalizability of our findings may be limited, particularly outside the US healthcare system. We invited executives from diverse practice settings but may not have captured all the relevant viewpoints. This study did not include Veterans Affairs hospitals; safety net hospitals were underrepresented, Midwestern hospitals were overrepresented, and the participants were predominantly male. We were unable to determine the influence of employment model on participant beliefs about HMGs, nor did we elicit comparisons to other physician specialties that would highlight a distinct approach to negotiating with HMGs. Because we used hospitalists as interviewers, including some from the same institution as the interviewee, respondents may have dampened critiques or descriptions of unmet expectations. Our data do not provide quantitative support for any particular approach to determining or negotiating appropriate financial support for an HMG.
CONCLUSIONS
This work contributes new understanding of the expectations executives have for HMGs and individual hospitalists. This highlights opportunities for group leaders, hospitalists, medical educators, and quality improvement experts to produce a hospitalist labor force that can engage in productive and mutually satisfying relationships with hospital leaders. Hospitalists should strive to improve alignment and communication with executive groups.
Disclosures
The authors report no potential conflict of interest.
The field of hospital medicine has expanded rapidly since its inception in the late 1990s, and currently, most hospitals in the United States employ or contract with hospital medicine groups (HMGs).1-4 This dramatic growth began in response to several factors: primary care physicians (PCPs) opting out of inpatient care, the increasing acuity and complexity of inpatient care, and cost pressures on hospitals.5,6 Recent studies associate greater use of hospitalists with increased hospital revenues and modest improvements in hospital financial performance.7 However, funding the hospitalist delivery model required hospitals to share the savings hospitalists generate through facility billing and quality incentives.
Hospitalists’ professional fee revenues alone generally do not fund their salaries. An average HMG serving adult patients requires $176,658 from the hospital to support a full-time physician.8 Determining the appropriate level of HMG support typically occurs through negotiation with hospital executives. During the last 10 years, HMG size and hospitalist compensation have risen steadily, combining to increase the hospitalist labor costs borne by hospitals.4,8 Accordingly, hospital executives in challenging economic environments may pressure HMG leaders to accept diminished support or to demonstrate a better return on the hospital’s investment.
These negotiations are influenced by the beliefs of hospital executives about the value of the hospitalist labor model. Little is known about how hospital and health system executive leadership assess the value of hospitalists. A deeper understanding of executive attitudes and beliefs could inform HMG leaders seeking integrative (“win-win”) outcomes in contract and compensation negotiations. Members of the Society of Hospital Medicine (SHM) Practice Management Committee surveyed hospital executives to guide SHM program development. We sought to analyze transcripts from these interviews to describe how executives assess HMGs and to test the hypothesis that hospital executives apply specific financial models when determining the return on investment (ROI) from subsidizing an HMG.
METHODS
Study Design, Setting, and Participants
Members of the SHM Practice Management Committee conducted interviews with a convenience sample of 24 key informants representing the following stakeholders at hospitals employing hospitalists: Chief Executive Officers (CEOs), Presidents, Vice Presidents, Chief Medical Officers (CMOs), and Chief Financial Officers (CFOs). Participants were recruited from 17 fee-for-service healthcare organizations, including rural, suburban, urban, community, and academic medical centers. The semi-structured interviews occurred in person between January and March 2018; each one lasted an average of 45 minutes and were designed to guide SHM program and product development. Twenty-eight executives were recruited by e-mail, and four did not complete the interview due to scheduling difficulty. All the participants provided informed consent. The University of Washington Institutional Review Board approved the secondary analysis of deidentified transcripts.
Interview Guide and Data Collection
All interviews followed a guide with eight demographic questions and 10 open-ended questions (Appendix). Cognitive interviews were performed with two hospital executives outside the study cohort, resulting in the addition of one question and rewording one question for clarity. One-on-one interviews were performed by 10 committee members (range, 1-3 interviews). All interview audios were recorded, and no field notes were kept. The goal of the interviews was to obtain an understanding of how hospital executives value the contributions and costs of hospitalist groups.
The interviews began with questions about the informant’s current interactions with hospitalists and the origin of the hospitalist group at their facility. Informants then described the value they feel hospitalists bring to their hospital and occasions they were surprised or dissatisfied with the clinical or financial value delivered by the hospitalists. Participants described how they calculate a return on investment (ROI) for their hospitalist group, nonfinancial benefits and disadvantages to hospitalists, and how they believe hospitalists should participate in risk-sharing contracts.
Data Analysis
The interview audiotapes were transcribed and deidentified. A sample of eight transcripts was verified by participants to ensure accuracy. Three investigators (AAW, RC, CC) reviewed a random sample of five transcripts to identify and codify preliminary themes. We applied a general inductive framework with a content analysis approach. Two investigators (TM and MC) read all transcripts independently, coding the presence of each theme and quotations exemplifying these themes using qualitative analysis software (Dedoose Version 7.0.23, SocioCultural Research Consultants). A third investigator (AAW) read all the transcripts and resolved differences of opinion. Themes and code application were discussed among the study team after the second and fifth transcripts to add or clarify codes. No new codes were identified after the first review of the preliminary codebook, although investigators intermittently used an “unknown” code through the 20th transcript. After discussion to reach consensus, excerpts initially coded “unknown” were assigned existing codes; the 20th transcript represents the approximate point of reaching thematic saturation.
RESULTS
Of the 24 participants, 18 (75%) were male, representing a variety of roles: 7 (29.2%) CMOs, 5 (20.8%) Presidents, 5 (20.8%) CFOs, 4 (16.7%) CEOs, and 3 (12.5%) Vice Presidents. The participants represented all regions (Midwest 12 [50%], South 6 [25%], West 4 [16.7%], and East 2 [8.3%], community size (Urban 11 [45.8%], Suburban 8 [33.3%], and Rural 5 [20.8%]), and Hospital Types (Community 11 [45.8%], Multihospital System 5 [20.8%], Academic 5 [20.8%], Safety Net 2 [8.3%], and Critical Access 1 [4.2%]). We present specific themes below and supporting quotations in Tables 1 and 2.
Current Value of the HMG at the Respondent’s Hospital
Most executives reported their hospital’s HMG had operated for over a decade and had developed an earlier, outdated value framework. Interviewees described an initial mix of financial pressures, shifts in physician work preferences, increasing patient acuity, resident labor shortages, and unsolved hospital throughput needs that triggered a reactive conversion from community PCP staffing to hospitalist care teams, followed by refinements to realize value.
“I think initially here it was to deal with the resident caps, right? So, at that moment, the solution that was put in place probably made a lot of sense. If that’s all someone came in with, now I’d be scratching my head and said, what are you thinking?” (President, #2)
Respondents perceived that HMGs provide value in many domains, including financial contributions, high-quality care, organizational efficiency, academics, leadership of interprofessional teams, effective communication, system improvement, and beneficial influence on the care environment and other employees. Regarding the measurable generation of financial benefit, documentation for improved billing accuracy, increased hospital efficiency (eg, lower length of stay, early discharges), and comanagement arrangements were commonly identified.
“I don’t want a urologist with a stethoscope, so I’m happy to have the hospitalists say, ‘Look, I’ll take care of the patient. You do the procedure.’ Well, that’s inherently valuable, whether we measure it or whether we don’t.” (CMO, #21)
Executives generally expressed satisfaction with their HMG’s quality of care and the related pay-for-performance financial benefits from payers, attributing success to hospitalists’ familiarity with inpatient systems and willingness to standardize.
“I just think it’s having one structure, one group to go to, a standard rather than trying to push it through the medical staff.” (VP, #18)
Executives reported that HMGs generate substantial value that is difficult to measure financially. For example, a large bundle of excerpts organized around communication with patients, nurses, and other providers.
“If we have the right hospitalist staff, to engage them with the nursing staff would help to reduce my turnover rate…and create a very positive morale within the nursing units. That’s huge. That’s nonfinancial” (President, #15)
Executives particularly appreciated hospitalists’ work to aggregate input from multiple specialists and present a cohesive explanation to patients. Executives also felt that HMGs create significant unmeasured value by improving processes and outcomes on service lines beyond hospital medicine, achieving this through culture change, involvement in leadership, hospital-wide process redesign, and running rapid response teams. Some executives expressed a desire for hospitalists to assume this global quality responsibility more explicitly as a job expectation.
Executives described how they would evaluate a de novo proposal for hospitalist services, usually enumerating key general domains without explaining specifically how they would measure each element. The following priorities emerged: clinical excellence, capacity to collaborate with hospital leadership, the scope of services provided, cultural fit/alignment, financial performance, contract cost, pay-for-performance measures, and turnover. Regarding financial performance, respondents expected to know the cost of the proposal but lacked a specific price threshold. Instead, they sought to understand the total value of the proposal through its effect on metrics such as facility fees or resource use. Nonetheless, cultural fit was a critical, overriding driver of the hypothetical decision, despite difficulty defining beyond estimates of teamwork, alignment with hospital priorities, and qualities of the group leader.
“For us, it usually ends being how do we mix personally, do we like them?” (CMO, #5)
Alignment and Collaboration
The related concepts of “collaboration” and “alignment” emerged as prominent themes during all interviews. Executives highly valued hospitalist groups that could demonstrate alignment with hospital priorities and often used this concept to summarize the HMG’s success or failure across a group of value domains.
“If you’re just coming in to fill a shift and see 10 patients, you have much less value than somebody who’s going to play that active partnership role… hospitalist services need to partner with hospitals and be intimately involved with the success of the hospital.” (CMO, #20)
Alignment sometimes manifested in a quantified, explicit way, through incentive plans or shared savings plans. However, it most often manifested as a broader sense that the hospitalists’ work targeted the same priorities as the executive leaders and that hospitalists genuinely cared about those priorities. A “shift-work mentality” was expressed by some as the antithesis of alignment. Incorporating hospitalist leaders in hospital leadership and frequent communication arose as mechanisms to increase alignment.
Ways HMGs Fail to Meet Expectations
Respondents described unresolved disadvantages to the hospitalist care model.
“I mean, OPPE, how do you do that for a hospitalist? How can you do it? It’s hard to attribute a patient to someone….it is a weakness and I think we all know it.” (CMO, #21)
Executives also worried about inconsistent handoffs with primary care providers and the field’s demographics, finding it disproportionately comprised of junior or transient physicians. They also hoped that hospitalist innovators would solve clinician burnout and the high cost of inpatient care. Disappointments specific to the local HMG revolved around difficulty developing shared models of value and mechanisms to achieve them.
“I would like to have more dialog between the hospital leadership team and the hospitalist group…I would like to see a little bit more collaboration.” (President, #13)
These challenges emerged not as a deficiency with hospital medicine as a specialty, but a failure at their specific facility to achieve the goal of alignment through joint strategic planning.
Calculating Value
When asked if their hospital had a formal process to evaluate ROI for their HMG, two dominant answers emerged: (1) the executive lacked a formal process for determining ROI and was unaware of one used at their facility or (2) the executive evaluated HMG performance based on multiple measures, including cost, but did not attempt to calculate ROI or a summary value. Several described the financial evaluation process as too difficult or unnecessary.
“No. It’s too difficult to extract that data. I would say the best proxy that we could do it is our case mix index on our medicine service line.” (CMO, #20)
“No, not a formal process, no… I question the value of some of the other things we do with the medical group…but not the value of the hospitalists… I don’t think we’ve done a formal assessment. I appreciate the flexibility, especially in a small hospital.” (President, #10)
Rarely, executives described specific financial calculations that served as a proxy for ROI. These included calculating a contribution margin to compare against the cost of salary support or the application of external survey benchmarking comparisons for productivity and salary to evaluate the appropriateness of a limited set of financial indicators. Twice respondents alluded to more sophisticated measurements conducted by the finance department but lacked familiarity with the process. Several executives described ROI calculations for specific projects and discrete business decisions involving hospitalists, particularly considering hiring an additional hospitalist.
Executives generally struggled to recall specific ways that the nonfinancial contributions of hospitalists were incorporated into executive decisions regarding the hospitalist group. Two related themes emerged: first, the belief that hospitals could not function effectively without hospitalists, making their presence an expected cost of doing business. Second, absent measures of HMG ROI, executives appeared to determine an approximate overall value of hospitalists, rather than parsing the various contributions. A few respondents expressed alarm at the rise in hospitalist salaries, whereas others acknowledged market forces beyond their control.
“… there is going to be more of a demand for hospitalists, which is definitely going to drive up the compensation. So, I don’t worry that the compensation will be driven up so high that there won’t be a return [on investment].” (CFO, #16)
Some urged individual hospitalists to develop a deeper understanding of what supports their salary to avoid strained relationships with executives.
Evolution and Risk-Sharing Contracts
Respondents described an evolving conceptualization of the hospitalist’s value, occurring at both a broad, long-term scale and at an incremental, annual scale through minor modifications to incentive pay schemes. For most executives, hiring hospitalists as replacements for PCPs had become necessary and not a source of novel value; many executives described it as “the cost of doing business.” Some described gradually deemphasizing relative value unit (RVU) production to recognize other contributions. Several reported their general appreciation of hospitalists evolved as specific hospitalists matured and demonstrated new contributions to hospital function. Some leaders tried to speculate about future phases of this evolution, although details were sparse.
Among respondents with greater implementation of risk-sharing contracts or ACOs, executives did not describe significantly different goals for hospitalists; executives emphasized that hospitalists should accelerate existing efforts to reduce inpatient costs, length of stay, healthcare-acquired conditions, unnecessary testing, and readmissions. A theme emerged around hospitalists supporting the continuum of care, through improved communication and increased alignment with health systems.
“Where I see the real benefit…is to figure out a way to use hospitalists and match them up with the primary care physicians on the outpatient side to truly develop an integrated population-based medicine practice for all our patients.” (President, #15)
Executives believed that communication and collaboration with PCPs and postacute care providers would attract more measurement.
DISCUSSION
Our findings provide hospitalists with insight into the approach hospital executives may follow when determining the rationale for and extent of financial support for HMGs. The results did not support our hypothesis that executives commonly determine the appropriate support by summing detailed quantitative models for various HMG contributions. Instead, most hospital executives appear to make decisions about the appropriateness of financial support based on a small number of basic financial or care quality metrics combined with a subjective assessment of the HMG’s broader alignment with hospital priorities. However, we did find substantial evidence that hospital executives’ expectations of hospitalists have evolved in the last decade, creating the potential for dissociation from how hospitalists prioritize and value their own efforts. Together, our findings suggest that enhanced communication, relationship building, and collaboration with hospital leaders may help HMGs to maintain a shared model of value with hospital executives.
The general absence of summary value calculations suggests specific opportunities, benefits, and risks for HMG group leaders (Table 3). An important opportunity relates to the communication agenda about unmeasured or nonfinancial contributions. Although executives recognized many of these, our data suggest a need for HMG leaders to educate hospital leaders about their unmeasured contributions proactively. Although some might recommend doing so by quantifying and financially rewarding key intangible contributions (eg, measuring leadership in culture change9), this entails important risks.10 Some experts propose that the proliferation of physician pay-for-performance schemes threatens medical professionalism, fails patients, and misunderstands what motivates physicians.11 HMG groups that feel undervalued should hesitate before monetizing all aspects of their work, and consider emphasizing relationship-building as a platform for communication about their performance. Achieving better alignment with executives is not just an opportunity for HMG leaders, but for each hospitalist within the group. Although executives wanted to have deeper relationships with group members, this may not be feasible in large organizations. Instead, it is incumbent for HMG leaders to translate executives’ expectations and forge better alignment.
Residency may not adequately prepare hospitalists to meet key expectations hospital executives hold related to system leadership and interprofessional team leadership. For example, hospital leaders particularly valued HMGs’ perceived ability to improve nurse retention and morale. Unfortunately, residency curricula generally lack concerted instruction on the skills required to produce such interprofessional inpatient teams reliably. Similarly, executives strongly wanted HMGs to acknowledge a role as partners in running the quality, stewardship, and safety missions of the entire hospital. While residency training builds clinical competence through the care of individual patients, many residents do not receive experiential education in system design and leadership. This suggests a need for HMGs to provide early career training or mentorship in quality improvement and interprofessional teamwork. Executives and HMG leaders seeking a stable, mature workforce, should allocate resources to retaining mid and late career hospitalists through leadership roles or financial incentives for longevity.
As with many qualitative studies, the generalizability of our findings may be limited, particularly outside the US healthcare system. We invited executives from diverse practice settings but may not have captured all the relevant viewpoints. This study did not include Veterans Affairs hospitals, safety net hospitals were underrepresented, Midwestern hospitals were overrepresented and the participants were predominantly male. We were unable to determine the influence of employment model on participant beliefs about HMGs, nor did we elicit comparisons to other physician specialties that would highlight a distinct approach to negotiating with HMGs. Because we used hospitalists as interviewers, including some from the same institution as the interviewee, respondents may have dampened critiques or descriptions of unmet expectations. Our data do not provide quantitative support for any approach to determining or negotiating appropriate financial support for an HMG.
CONCLUSIONS
This work contributes new understanding of the expectations executives have for HMGs and individual hospitalists. These findings highlight opportunities for group leaders, hospitalists, medical educators, and quality improvement experts to produce a hospitalist workforce that can engage in productive and mutually satisfying relationships with hospital leaders. Hospitalists should strive to improve alignment and communication with executive groups.
Disclosures
The authors report no potential conflict of interest.
1. Lapps J, Flansbaum B, Leykum L, et al. Updating threshold-based identification of hospitalists in 2012 Medicare pay data. J Hosp Med. 2016;11(1):45-47. https://doi.org/10.1002/jhm.2480.
2. Wachter RM, Goldman L. Zero to 50,000–the 20th anniversary of the hospitalist. N Engl J Med. 2016;375(11):1009-1011. https://doi.org/10.1056/nejmp1607958.
3. Stevens JP, Nyweide DJ, Maresh S, et al. Comparison of hospital resource use and outcomes among hospitalists, primary care physicians, and other generalists. JAMA Intern Med. 2017;177(12):1781-1787. https://doi.org/10.1001/jamainternmed.2017.5824.
4. American Hospital Association. Hospital Statistics. Chicago, IL: American Hospital Association; 2017.
5. Wachter RM, Goldman L. The emerging role of “hospitalists” in the American health care system. N Engl J Med. 1996;335(7):514-517.
6. Pham HH, Devers KJ, Kuo S, et al. Health care market trends and the evolution of hospitalist use and roles. J Gen Intern Med. 2005;20(2):101-107. https://doi.org/10.1111/j.1525-1497.2005.40184.x.
7. Epané JP, Weech-Maldonado R, Hearld L, et al. Hospitals’ use of hospitalists: implications for financial performance. Health Care Manage Rev. 2019;44(1):10-18. https://doi.org/10.1097/hmr.0000000000000170.
8. State of Hospital Medicine: 2018 Report Based on 2017 Data. Society of Hospital Medicine. https://sohm.hospitalmedicine.org/. Accessed December 9, 2018.
9. Carmeli A, Tishler A. The relationships between intangible organizational elements and organizational performance. Strategic Manag J. 2004;25(13):1257-1278. https://doi.org/10.1002/smj.428.
10. Marr B. Strategic performance management: leveraging and measuring your intangible value drivers. Amsterdam: Butterworth-Heinemann; 2006.
11. Khullar D, Wolfson D, Casalino LP. Professionalism, performance, and the future of physician incentives. JAMA. 2018;320(23):2419-2420. https://doi.org/10.1001/jama.2018.17719.
Inpatient Communication Barriers and Drivers When Caring for Limited English Proficiency Children
Immigrant children make up the fastest growing segment of the population in the United States.1 While most immigrant children are fluent in English, approximately 40% live with a parent who has limited English proficiency (LEP; ie, speaks English less than “very well”).2,3 In pediatrics, LEP status has been associated with longer hospitalizations,4 higher hospitalization costs,5 increased risk for serious adverse medical events,4,6 and more frequent emergency department reutilization.7 In the inpatient setting, multiple aspects of care present a variety of communication challenges,8 which are amplified by shift work and workflow complexity that result in patients and families interacting with numerous providers over the course of an inpatient stay.
Increasing access to trained professional interpreters when caring for LEP patients improves communication, patient satisfaction, and adherence, and reduces mortality.9-12 However, even when access to interpreter services is established, effective use is not guaranteed.13 Up to 57% of pediatricians report relying on family members to communicate with LEP patients and their caregivers;9 23% of pediatric residents categorized LEP encounters as frustrating, while 78% perceived care of LEP patients to be “misdirected” (eg, delay in diagnosis or discharge) because of associated language barriers.14
Understanding the experiences of frontline inpatient medical providers and interpreters is crucial to identifying challenges and ways to optimize communication for hospitalized LEP patients and families. However, there is a paucity of literature exploring the perspectives of medical providers and interpreters as they relate to communication with hospitalized LEP children and families. In this study, we sought to identify barriers and drivers of effective communication with pediatric patients and families with LEP in the inpatient setting from the perspective of frontline medical providers and interpreters.
METHODS
Study Design
This qualitative study used Group Level Assessment (GLA), a structured participatory methodology that allows diverse groups of stakeholders to generate and evaluate data in interactive sessions.15-18 GLA structure promotes active participation, group problem-solving, and development of actionable plans, distinguishing it from focus groups and in-depth semistructured interviews.15,19 This study received a human subject research exemption from the institutional review board.
Study Setting
Cincinnati Children’s Hospital Medical Center (CCHMC) is a large quaternary care center with approximately 200 patient encounters each day that require interpreter services. Interpreters (in-person, video, and phone) are utilized during admission, formal family-centered rounds, hospital discharge, and other encounters with physicians, nurses, and other healthcare professionals. In-person interpreters are available in-house for Spanish and Arabic, with 18 additional languages available through regional vendors. Despite available resources, there is no standard way in which medical providers and interpreters work with one another.
Study Participants and Recruitment
Medical providers who care for hospitalized general pediatric patients were eligible to participate, including attending physicians, resident physicians, bedside nurses, and inpatient ancillary staff (eg, respiratory therapists, physical therapists). Interpreters employed by CCHMC with experience in the inpatient setting were also eligible. Individuals were recruited based on published recommendations to optimize discussion and group-thinking.15 Each participant was asked to take part in one GLA session. Participants were assigned to specific sessions based on roles (ie, physicians, nurses, and interpreters) to maximize engagement and minimize the impact of hierarchy.
Study Procedure
GLA involves a seven-step structured process (Appendix 1): climate setting, generating, appreciating, reflecting, understanding, selecting, and action.15,18 Qualitative data were generated individually and anonymously by participants on flip charts in response to prompts such as: “I worry that LEP families___,” “The biggest challenge when using interpreter services is___,” and “I find___ works well in providing care for LEP families.” Prompts were developed by study investigators, modified based on input from nursing and interpreter services leadership, and finalized by GLA facilitators. Fifty-one unique prompts were utilized (Appendix 2); the number of prompts used per session (15 to 32) was based on published recommendations.15 During sessions, study investigators took detailed notes, including verbatim transcription of participant quotes. Upon conclusion of the session, each participant completed a demographic survey, including years of experience, languages spoken and perceived fluency,20 and ethnicity.
Data Analysis
Within each session, under the guidance of trained and experienced GLA facilitators (WB, HV), participants distilled and summarized qualitative data into themes, discussed and prioritized themes, and generated action items. Following completion of all sessions, analyzed data were compiled by the research team to determine similarities and differences across groups based on participant roles, consolidate themes into barriers and drivers of communication with LEP families, and determine any overlap of priorities for action. Findings were shared back with each group to ensure accuracy and relevance.
RESULTS
Participants
A total of 64 individuals participated (Table 1): hospital medicine physicians and residents (56%), inpatient nurses and ancillary staff (16%), and interpreters (28%). While 81% of physicians spoke multiple languages, only 25% reported speaking them well; two physicians were certified to communicate medical information without an interpreter present.
Themes Resulting from GLA Sessions
A total of four barriers (Table 2) and four drivers (Table 3) of effective communication with pediatric LEP patients and their families in the inpatient setting were identified by participants. Participants across all groups, despite enthusiasm around improving communication, were concerned about the quality of care LEP families received, noting that the system is “designed to deliver less-good care” and that “we really haven’t figured out how to care for [LEP patients and families] in a [high-]quality and reliable way.” Variation in theme discussion was noted between groups based on participant role: physicians voiced concern about rapport with LEP families, nurses emphasized actionable tasks, and interpreters focused on heightened challenges in times of stress.
Barrier 1: Difficulties Accessing Interpreter Services
Medical providers (physicians and nurses) identified the “opaque process to access [interpreter] services” as one of their biggest challenges when communicating with LEP families. In particular, the process of scheduling interpreters was described as a “black box,” with physicians and nurses expressing difficulty determining if and when in-person interpreters were scheduled and uncertainty about when to use modalities other than in-person interpretation. Participants across groups highlighted the lack of systems knowledge from medical providers and limitations within the system that make predictable, timely, and reliable access to interpreters challenging, especially for uncommon languages. Medical providers desired more in-person interpreters who can “stay as long as clinically indicated,” citing frustration associated with using phone- and video-interpretation (eg, challenges locating technology, unfamiliarity with use, unreliable functionality of equipment). Interpreters voiced wanting to take time to finish each encounter fully without “being in a hurry because the next appointment is coming soon” or “rushing… in [to the next] session sweating.”
Barrier 2: Uncertainty in Communication with LEP Families
Participants across all groups described three areas of uncertainty as detailed in Table 2: (1) what to share and how to prioritize information during encounters with LEP patients and families, (2) what is communicated during interpretation, and (3) what LEP patients and families understand.
Barrier 3: Unclear and Inconsistent Expectations and Roles of Team Members
Given the complexity involved in communication between medical providers, interpreters, and families, participants across all groups reported feeling ill-prepared when navigating hospital encounters with LEP patients and families. Interpreters reported having little to no clinical context, medical providers reported having no knowledge of the assigned interpreter’s style, and both interpreters and medical providers reported that families have little idea of what to expect or how to engage. All groups voiced frustration about the lack of clarity regarding specific roles and scope of practice for each team member during an encounter, where multiple people end up “talking [or] using the interpreter at once.” Interpreters shared their expectations of medical providers to set the pace and lead conversations with LEP families. On the other hand, medical providers expressed a desire for interpreters to provide cultural context to the team without prompting and to interrupt during encounters when necessary to voice concerns or redirect conversations.
Barrier 4: Unmet Family Engagement Expectations
Participants across all groups articulated challenges with establishing rapport with LEP patients and families, sharing concerns that “inadequate communication” due to “cultural or language barriers” ultimately impacts quality of care. Participants reported decreased bidirectional engagement with and from LEP families. Medical providers not only noted difficulty in connecting with LEP families “on a more personal level” and providing frequent medical updates, but also felt that LEP families do not ask questions even when uncertain. Interpreters expressed concerns about medical providers “not [having] enough patience to answer families’ questions” while LEP families “shy away from asking questions.”
Driver 1: Utilizing a Team-Based Approach between Medical Providers and Interpreters
Participants from all groups emphasized that a mutual understanding of roles and shared expectations regarding communication and interpretation style, clinical context, and time constraints would establish a foundation for respect between medical providers and interpreters. They reported that a team-based approach to LEP patient and family encounters was crucial to achieving effective communication.
Driver 2: Understanding the Role of Cultural Context in Providing Culturally Effective Care
Participants across all groups highlighted three different aspects of cultural context that drive effective communication: (1) medical providers’ perception of the family’s culture, (2) LEP families’ knowledge of the culture and healthcare system in the US, and (3) medical providers’ insight into their own preconceived ideas about LEP families.
Driver 3: Practicing Empathy for Patients and Families
All participants reported that respect for diversity and consideration of the backgrounds and perspectives of LEP patients and families are necessary. Furthermore, both medical providers and interpreters articulated a need to remain patient and mindful when interacting with LEP families despite challenges, especially since, as noted by interpreters, encounters may “take longer, but it’s for a reason.”
Driver 4: Using Effective Family-Centered Communication Strategies
Participants identified the use of effective family-centered communication principles as a driver to optimal communication. Many of the principles identified by medical providers and interpreters are generally applicable to all hospitalized patients and families regardless of English proficiency: optimizing verbal communication (eg, using shorter sentences, pausing to allow for interpretation), optimizing nonverbal communication (eg, setting, position, and body language), and assessment of family understanding and engagement (eg, use of teach back).
DISCUSSION
Frontline medical providers and interpreters identified barriers and drivers that impact communication with LEP patients and families during hospitalization. To our knowledge, this is the first study that uses a participatory method to explore the perspectives of medical providers and interpreters who care for LEP children and families in the inpatient setting. Despite existing difficulties and concerns regarding language barriers and their impact on quality of care for hospitalized LEP patients and families, participants were enthusiastic about how identified barriers and drivers may inform future improvement efforts. Notable action steps for future improvement discussed by our participants included: increased use and functionality of technology for timely and predictable access to interpreters, deliberate training for providers focused on delivery of culturally effective care, consistent use of family-centered communication strategies including teach-back, and implementing interdisciplinary expectation setting through “presessions” before encounters with LEP families.
Participants elaborated on several barriers previously described in the literature including time constraints and technical problems.14,21,22 Such barriers may serve as deterrents to consistent and appropriate use of interpreters in healthcare settings.9 A heavy reliance on off-site interpreters (including phone- or video-interpreters) and lack of knowledge regarding resource availability likely amplified frustration for medical providers. Communication with LEP families can be daunting, especially when medical providers do not care for LEP families or work with interpreters on a regular basis.14 Standardizing the education of medical providers regarding available resources, as well as the logistics, process, and parameters for scheduling interpreters and using technology, was an action step identified by our GLA participants. Targeted education about the logistics of accessing interpreter services and standardized ways to make technology use easier (eg, one-touch dialing in hospital rooms) have been associated with increased interpreter use and decreased interpreter-related delays in care.23
Our frontline medical providers expressed added concern about not spending as much time with LEP families. In fact, LEP families in the literature have perceived medical providers to spend less time with their children compared to their English-proficient counterparts.24 Language and cultural barriers, both perceived and real, may limit medical provider rapport with LEP patients and families14 and likely contribute to medical providers relying on their preconceived assumptions instead.25 Cultural competency education for medical providers, as highlighted by our GLA participants as an action item, can be used to provide more comprehensive and effective care.26,27
In addition to enhancing cultural humility through education, our participants emphasized the use of family-centered communication strategies as a driver of optimal family engagement and understanding. Actively inviting questions from families and utilizing teach-back, an established evidence-based strategy28-30 discussed by our participants, can be particularly powerful in assessing family understanding and engagement. While information should be presented in plain language for families in all encounters,31 these evidence-based practices are of particular importance when communicating with LEP families. They promote effective communication, empower families to share concerns in a structured manner, and allow medical providers to address matters in real-time with interpreters present.
Finally, our participants highlighted the need for partnerships between providers and interpreter services, noting unclear roles and expectations among interpreters and medical providers as a major barrier. Specifically, physicians noted confusion regarding the scope of an interpreter’s practice. Participants from GLA sessions discussed the importance of a team-based approach and suggested implementing a “presession” prior to encounters with LEP patients and families. Presessions—a concept well accepted among interpreters and recommended by consensus-based practice guidelines—enable medical providers and interpreters to establish shared expectations about scope of practice, communication, interpretation style, time constraints, and medical context prior to patient encounters.32,33
There are several limitations to our study. First, individuals who chose to participate were likely highly motivated by their clinical experiences with LEP patients and invested in improving communication with LEP families. Second, the study is limited in generalizability, as it was conducted at a single academic institution in a Midwestern city. Despite regional variations in available resources as well as patient and workforce demographics, our findings regarding major themes are in agreement with previously published literature and further add to our understanding of ways to improve communication with this vulnerable population across the care spectrum. Lastly, we were logistically limited in our ability to elicit the perspectives of LEP families due to the participatory nature of GLA; the need for multiple interpreters to simultaneously interact with LEP individuals would have not only hindered active LEP family participation but may have also biased the data generated by patients and families, as the services interpreters provide during the inpatient stay were the focus of our study. Engaging LEP families in their preferred language using participatory methods should be considered for future studies.
In conclusion, frontline providers of medical and language services identified barriers and drivers impacting the effective use of interpreter services when communicating with LEP families during hospitalization. Our enhanced understanding of barriers and drivers, as well as identified actionable interventions, will inform future improvement of communication and interactions with LEP families, contributing to effective and efficient family-centered care. A framework for the development and implementation of organizational strategies aimed at improving communication with LEP families must include a thorough assessment of impact, feasibility, stakeholder involvement, and sustainability of specific interventions. While there is no simple formula to improve language services, health systems should establish and adopt language access policies, standardize communication practices, and develop processes to optimize the use of language services in the hospital. Furthermore, engagement with LEP families to better understand their perceptions and experiences with the healthcare system is crucial to improving communication between medical providers and LEP families in the inpatient setting and should be the subject of future studies.
Disclosures
The authors have no conflicts of interest to disclose.
Funding
No external funding was secured for this study. Dr. Joanna Thomson is supported by the Agency for Healthcare Research and Quality (Grant #K08 HS025138). Dr. Raglin Bignall was supported through a Ruth L. Kirschstein National Research Service Award (T32HP10027) when the study was conducted. The content is solely the responsibility of the authors and does not necessarily represent the official views of the funding organizations. The funding organizations had no role in the design, preparation, review, or approval of this paper.
1. The American Academy of Pediatrics Council on Community Pediatrics. Providing care for immigrant, migrant, and border children. Pediatrics. 2013;131(6):e2028-e2034.
2. Meneses C, Chilton L, Duffee J, et al. Council on Community Pediatrics Immigrant Health Tool Kit. The American Academy of Pediatrics. https://www.aap.org/en-us/Documents/cocp_toolkit_full.pdf. Accessed May 13, 2019.
3. Office for Civil Rights. Guidance to Federal Financial Assistance Recipients Regarding Title VI and the Prohibition Against National Origin Discrimination Affecting Limited English Proficient Persons. https://www.hhs.gov/civil-rights/for-individuals/special-topics/limited-english-proficiency/guidance-federal-financial-assistance-recipients-title-vi/index.html. Accessed May 13, 2019.
4. Lion KC, Rafton SA, Shafii J, et al. Association between language, serious adverse events, and length of stay among hospitalized children. Hosp Pediatr. 2013;3(3):219-225. https://doi.org/10.1542/hpeds.2012-0091.
5. Lion KC, Wright DR, Desai AD, Mangione-Smith R. Costs of care for hospitalized children associated with preferred language and insurance type. Hosp Pediatr. 2017;7(2):70-78. https://doi.org/10.1542/hpeds.2016-0051.
6. Cohen AL, Rivara F, Marcuse EK, McPhillips H, Davis R. Are language barriers associated with serious medical events in hospitalized pediatric patients? Pediatrics. 2005;116(3):575-579. https://doi.org/10.1542/peds.2005-0521.
7. Samuels-Kalow ME, Stack AM, Amico K, Porter SC. Parental language and return visits to the emergency department after discharge. Pediatr Emerg Care. 2017;33(6):402-404. https://doi.org/10.1097/PEC.0000000000000592.
8. Unaka NI, Statile AM, Choe A, Shonna Yin H. Addressing health literacy in the inpatient setting. Curr Treat Options Pediatr. 2018;4(2):283-299. https://doi.org/10.1007/s40746-018-0122-3.
9. DeCamp LR, Kuo DZ, Flores G, O’Connor K, Minkovitz CS. Changes in language services use by US pediatricians. Pediatrics. 2013;132(2):e396-e406. https://doi.org/10.1542/peds.2012-2909.
10. Flores G. The impact of medical interpreter services on the quality of health care: A systematic review. Med Care Res Rev. 2005;62(3):255-299. https://doi.org/10.1177/1077558705275416.
11. Flores G, Abreu M, Barone CP, Bachur R, Lin H. Errors of medical interpretation and their potential clinical consequences: A comparison of professional versus ad hoc versus no interpreters. Ann Emerg Med. 2012;60(5):545-553. https://doi.org/10.1016/j.annemergmed.2012.01.025.
12. Anand KJ, Sepanski RJ, Giles K, Shah SH, Juarez PD. Pediatric intensive care unit mortality among Latino children before and after a multilevel health care delivery intervention. JAMA Pediatr. 2015;169(4):383-390. https://doi.org/10.1001/jamapediatrics.2014.3789.
13. The Joint Commission. Advancing Effective Communication, Cultural Competence, and Patient- and Family-Centered Care: A Roadmap for Hospitals. Oakbrook Terrace, IL: The Joint Commission; 2010.
14. Hernandez RG, Cowden JD, Moon M, et al. Predictors of resident satisfaction in caring for limited English proficient families: a multisite study. Acad Pediatr. 2014;14(2):173-180. https://doi.org/10.1016/j.acap.2013.12.002.
15. Vaughn LM, Lohmueller M. Calling all stakeholders: group-level assessment (GLA)-a qualitative and participatory method for large groups. Eval Rev. 2014;38(4):336-355. https://doi.org/10.1177/0193841X14544903.
16. Vaughn LM, Jacquez F, Zhao J, Lang M. Partnering with students to explore the health needs of an ethnically diverse, low-resource school: an innovative large group assessment approach. Fam Commun Health. 2011;34(1):72-84. https://doi.org/10.1097/FCH.0b013e3181fded12.
17. Gosdin CH, Vaughn L. Perceptions of physician bedside handoff with nurse and family involvement. Hosp Pediatr. 2012;2(1):34-38. https://doi.org/10.1542/hpeds.2011-0008-2.
18. Graham KE, Schellinger AR, Vaughn LM. Developing strategies for positive change: transitioning foster youth to adulthood. Child Youth Serv Rev. 2015;54:71-79. https://doi.org/10.1016/j.childyouth.2015.04.014.
19. Vaughn LM. Group level assessment: a large group method for identifying primary issues and needs within a community. London: SAGE Publications; 2014. http://methods.sagepub.com/case/group-level-assessment-large-group-primary-issues-needs-community. Accessed July 26, 2017.
20. Association of American Medical Colleges Electronic Residency Application Service. ERAS 2018 MyERAS Application Worksheet: Language Fluency. Washington, DC: Association of American Medical Colleges; 2018:5.
21. Brisset C, Leanza Y, Laforest K. Working with interpreters in health care: A systematic review and meta-ethnography of qualitative studies. Patient Educ Couns. 2013;91(2):131-140. https://doi.org/10.1016/j.pec.2012.11.008.
22. Wiking E, Saleh-Stattin N, Johansson SE, Sundquist J. A description of some aspects of the triangular meeting between immigrant patients, their interpreters and GPs in primary health care in Stockholm, Sweden. Fam Pract. 2009;26(5):377-383. https://doi.org/10.1093/fampra/cmp052.
23. Lion KC, Ebel BE, Rafton S, et al. Evaluation of a quality improvement intervention to increase use of telephonic interpretation. Pediatrics. 2015;135(3):e709-e716. https://doi.org/10.1542/peds.2014-2024.
24. Zurca AD, Fisher KR, Flor RJ, et al. Communication with limited English-proficient families in the PICU. Hosp Pediatr. 2017;7(1):9-15. https://doi.org/10.1542/hpeds.2016-0071.
25. Kodjo C. Cultural competence in clinician communication. Pediatr Rev. 2009;30(2):57-64. https://doi.org/10.1542/pir.30-2-57.
26. Britton CV, American Academy of Pediatrics Committee on Pediatric Workforce. Ensuring culturally effective pediatric care: implications for education and health policy. Pediatrics. 2004;114(6):1677-1685. https://doi.org/10.1542/peds.2004-2091.
27. The American Academy of Pediatrics. Culturally Effective Care Toolkit: Providing Culturally Effective Pediatric Care; 2018. https://www.aap.org/en-us/professional-resources/practice-transformation/managing-patients/Pages/effective-care.aspx. Accessed May 13, 2019.
28. Starmer AJ, Spector ND, Srivastava R, et al. Changes in medical errors after implementation of a handoff program. N Engl J Med. 2014;371(19):1803-1812. https://doi.org/10.1056/NEJMsa1405556.
29. Jager AJ, Wynia MK. Who gets a teach-back? Patient-reported incidence of experiencing a teach-back. J Health Commun. 2012;17 Supplement 3:294-302. https://doi.org/10.1080/10810730.2012.712624.
30. Kornburger C, Gibson C, Sadowski S, Maletta K, Klingbeil C. Using “teach-back” to promote a safe transition from hospital to home: an evidence-based approach to improving the discharge process. J Pediatr Nurs. 2013;28(3):282-291. https://doi.org/10.1016/j.pedn.2012.10.007.
31. Abrams MA, Klass P, Dreyer BP. Health literacy and children: recommendations for action. Pediatrics. 2009;124 Supplement 3:S327-S331. https://doi.org/10.1542/peds.2009-1162I.
32. Betancourt JR, Renfrew MR, Green AR, Lopez L, Wasserman M. Improving Patient Safety Systems for Patients with Limited English Proficiency: a Guide for Hospitals. Agency for Healthcare Research and Quality; 2012.
33. The National Council on Interpreting in Health Care. Best Practices for Communicating Through an Interpreter. https://refugeehealthta.org/access-to-care/language-access/best-practices-communicating-through-an-interpreter/. Accessed May 19, 2019.
Immigrant children make up the fastest growing segment of the population in the United States.1 While most immigrant children are fluent in English, approximately 40% live with a parent who has limited English proficiency (LEP; ie, speaks English less than “very well”).2,3 In pediatrics, LEP status has been associated with longer hospitalizations,4 higher hospitalization costs,5 increased risk for serious adverse medical events,4,6 and more frequent emergency department reutilization.7 In the inpatient setting, multiple aspects of care present a variety of communication challenges,8 which are amplified by shift work and workflow complexity that result in patients and families interacting with numerous providers over the course of an inpatient stay.
Increasing access to trained professional interpreters when caring for LEP patients improves communication, patient satisfaction, adherence, and mortality.9-12 However, even when access to interpreter services is established, effective use is not guaranteed.13 Up to 57% of pediatricians report relying on family members to communicate with LEP patients and their caregivers;9 23% of pediatric residents categorized LEP encounters as frustrating while 78% perceived care of LEP patients to be “misdirected” (eg, delay in diagnosis or discharge) because of associated language barriers.14
Understanding experiences of frontline inpatient medical providers and interpreters is crucial in identifying challenges and ways to optimize communication for hospitalized LEP patients and families. However, there is a paucity of literature exploring the perspectives of medical providers and interpreters as it relates to communication with hospitalized LEP children and families. In this study, we sought to identify barriers and drivers of effective communication with pediatric patients and families with LEP in the inpatient setting from the perspective of frontline medical providers and interpreters.
METHODS
Study Design
This qualitative study used Group Level Assessment (GLA), a structured participatory methodology that allows diverse groups of stakeholders to generate and evaluate data in interactive sessions.15-18 GLA structure promotes active participation, group problem-solving, and development of actionable plans, distinguishing it from focus groups and in-depth semistructured interviews.15,19 This study received a human subject research exemption by the institutional review board.
Study Setting
Cincinnati Children’s Hospital Medical Center (CCHMC) is a large quaternary care center with ~200 patient encounters each day who require the use of interpreter services. Interpreters (in-person, video, and phone) are utilized during admission, formal family-centered rounds, hospital discharge, and other encounters with physicians, nurses, and other healthcare professionals. In-person interpreters are available in-house for Spanish and Arabic, with 18 additional languages available through regional vendors. Despite available resources, there is no standard way in which medical providers and interpreters work with one another.
Study Participants and Recruitment
Medical providers who care for hospitalized general pediatric patients were eligible to participate, including attending physicians, resident physicians, bedside nurses, and inpatient ancillary staff (eg, respiratory therapists, physical therapists). Interpreters employed by CCHMC with experience in the inpatient setting were also eligible. Individuals were recruited based on published recommendations to optimize discussion and group-thinking.15 Each participant was asked to take part in one GLA session. Participants were assigned to specific sessions based on roles (ie, physicians, nurses, and interpreters) to maximize engagement and minimize the impact of hierarchy.
Study Procedure
GLA involves a seven-step structured process (Appendix 1): climate setting, generating, appreciating, reflecting, understanding, selecting, and action.15,18 Qualitative data were generated individually and anonymously by participants on flip charts in response to prompts such as: “I worry that LEP families___,” “The biggest challenge when using interpreter services is___,” and “I find___ works well in providing care for LEP families.” Prompts were developed by study investigators, modified based on input from nursing and interpreter services leadership, and finalized by GLA facilitators. Fifty-one unique prompts were utilized (Appendix 2); the number of prompts used (ranging from 15 to 32 prompts) per session was based on published recommendations.15 During sessions, study investigators took detailed notes, including verbatim transcription of participant quotes. Upon conclusion of the session, each participant completed a demographic survey, including years of experience, languages spoken and perceived fluency,20 and ethnicity.
Data Analysis
Within each session, under the guidance of trained and experienced GLA facilitators (WB, HV), participants distilled and summarized qualitative data into themes, discussed and prioritized themes, and generated action items. Following completion of all sessions, analyzed data was compiled by the research team to determine similarities and differences across groups based on participant roles, consolidate themes into barriers and drivers of communication with LEP families, and determine any overlap of priorities for action. Findings were shared back with each group to ensure accuracy and relevance.
RESULTS
Participants
A total of 64 individuals participated (Table 1): hospital medicine physicians and residents (56%), inpatient nurses and ancillary staff (16%), and interpreters (28%). While 81% of physicians spoke multiple languages, only 25% reported speaking them well; two physicians were certified to communicate medical information without an interpreter present.
Themes Resulting from GLA Sessions
A total of four barriers (Table 2) and four drivers (Table 3) of effective communication with pediatric LEP patients and their families in the inpatient setting were identified by participants. Participants across all groups, despite enthusiasm around improving communication, were concerned about quality of care LEP families received, noting that the system is “designed to deliver less-good care” and that “we really haven’t figured out how to care for [LEP patients and families] in a [high-]quality and reliable way.” Variation in theme discussion was noted between groups based on participant role: physicians voiced concern about rapport with LEP families, nurses emphasized actionable tasks, and interpreters focused on heightened challenges in times of stress.
Barrier 1: Difficulties Accessing Interpreter Services
Medical providers (physicians and nurses) identified the “opaque process to access [interpreter] services” as one of their biggest challenges when communicating with LEP families. In particular, the process of scheduling interpreters was described as a “black box,” with physicians and nurses expressing difficulty determining if and when in-person interpreters were scheduled and uncertainty about when to use modalities other than in-person interpretation. Participants across groups highlighted the lack of systems knowledge from medical providers and limitations within the system that make predictable, timely, and reliable access to interpreters challenging, especially for uncommon languages. Medical providers desired more in-person interpreters who can “stay as long as clinically indicated,” citing frustration associated with using phone- and video-interpretation (eg, challenges locating technology, unfamiliarity with use, unreliable functionality of equipment). Interpreters voiced wanting to take time to finish each encounter fully without “being in a hurry because the next appointment is coming soon” or “rushing… in [to the next] session sweating.”
Barrier 2: Uncertainty in Communication with LEP Families
Participants across all groups described three areas of uncertainty as detailed in Table 2: (1) what to share and how to prioritize information during encounters with LEP patients and families, (2) what is communicated during interpretation, and (3) what LEP patients and families understand.
Barrier 3: Unclear and Inconsistent Expectations and Roles of Team Members
Given the complexity involved in communication between medical providers, interpreters, and families, participants across all groups reported feeling ill-prepared when navigating hospital encounters with LEP patients and families. Interpreters reported having little to no clinical context, medical providers reported having no knowledge of the assigned interpreter’s style, and both interpreters and medical providers reported that families have little idea of what to expect or how to engage. All groups voiced frustration about the lack of clarity regarding specific roles and scope of practice for each team member during an encounter, where multiple people end up “talking [or] using the interpreter at once.” Interpreters shared their expectations of medical providers to set the pace and lead conversations with LEP families. On the other hand, medical providers expressed a desire for interpreters to provide cultural context to the team without prompting and to interrupt during encounters when necessary to voice concerns or redirect conversations.
Barrier 4: Unmet Family Engagement Expectations
Participants across all groups articulated challenges with establishing rapport with LEP patients and families, sharing concerns that “inadequate communication” due to “cultural or language barriers” ultimately impacts quality of care. Participants reported decreased bidirectional engagement with and from LEP families. Medical providers not only noted difficulty in connecting with LEP families “on a more personal level” and providing frequent medical updates, but also felt that LEP families do not ask questions even when uncertain. Interpreters expressed concerns about medical providers “not [having] enough patience to answer families’ questions” while LEP families “shy away from asking questions.”
Driver 1: Utilizing a Team-Based Approach between Medical Providers and Interpreters
Participants from all groups emphasized that a mutual understanding of roles and shared expectations regarding communication and interpretation style, clinical context, and time constraints would establish a foundation for respect between medical providers and interpreters. They reported that a team-based approach to LEP patient and family encounters were crucial to achieving effective communication.
Driver 2: Understanding the Role of Cultural Context in Providing Culturally Effective Care.
Participants across all groups highlighted three different aspects of cultural context that drive effective communication: (1) medical providers’ perception of the family’s culture; (2) LEP families’ knowledge about the culture and healthcare system in the US, and (3) medical providers insight into their own preconceived ideas about LEP families.
Driver 3: Practicing Empathy for Patients and Families
All participants reported that respect for diversity and consideration of the backgrounds and perspectives of LEP patients and families are necessary. Furthermore, both medical providers and interpreters articulated a need to remain patient and mindful when interacting with LEP families despite challenges, especially since, as noted by interpreters, encounters may “take longer, but it’s for a reason.”
Driver 4: Using Effective Family-Centered Communication Strategies
Participants identified the use of effective family-centered communication principles as a driver to optimal communication. Many of the principles identified by medical providers and interpreters are generally applicable to all hospitalized patients and families regardless of English proficiency: optimizing verbal communication (eg, using shorter sentences, pausing to allow for interpretation), optimizing nonverbal communication (eg, setting, position, and body language), and assessment of family understanding and engagement (eg, use of teach back).
DISCUSSION
Frontline medical providers and interpreters identified barriers and drivers that impact communication with LEP patients and families during hospitalization. To our knowledge, this is the first study that uses a participatory method to explore the perspectives of medical providers and interpreters who care for LEP children and families in the inpatient setting. Despite existing difficulties and concerns regarding language barriers and its impact on quality of care for hospitalized LEP patients and families, participants were enthusiastic about how identified barriers and drivers may inform future improvement efforts. Notable action steps for future improvement discussed by our participants included: increased use and functionality of technology for timely and predictable access to interpreters, deliberate training for providers focused on delivery of culturally-effective care, consistent use of family-centered communication strategies including teach-back, and implementing interdisciplinary expectation setting through “presessions” before encounters with LEP families.
Participants elaborated on several barriers previously described in the literature including time constraints and technical problems.14,21,22 Such barriers may serve as deterrents to consistent and appropriate use of interpreters in healthcare settings.9 A heavy reliance on off-site interpreters (including phone- or video-interpreters) and lack of knowledge regarding resource availability likely amplified frustration for medical providers. Communication with LEP families can be daunting, especially when medical providers do not care for LEP families or work with interpreters on a regular basis.14 Standardizing the education of medical providers regarding available resources, as well as the logistics, process, and parameters for scheduling interpreters and using technology, was an action step identified by our GLA participants. Targeted education about the logistics of accessing interpreter services and having standardized ways to make technology use easier (ie, one-touch dialing in hospital rooms) has been associated with increased interpreter use and decreased interpreter-related delays in care.23
Our frontline medical providers expressed added concern about not spending as much time with LEP families. In fact, LEP families in the literature have perceived medical providers to spend less time with their children compared to their English-proficient counterparts.24 Language and cultural barriers, both perceived and real, may limit medical provider rapport with LEP patients and families14 and likely contribute to medical providers relying on their preconceived assumptions instead.25 Cultural competency education for medical providers, as highlighted by our GLA participants as an action item, can be used to provide more comprehensive and effective care.26,27
In addition to enhancing cultural humility through education, our participants emphasized the use of family-centered communication strategies as a driver of optimal family engagement and understanding. Actively inviting questions from families and utilizing teach-back, an established evidence-based strategy28-30 discussed by our participants, can be particularly powerful in assessing family understanding and engagement. While information should be presented in plain language for families in all encounters,31 these evidence-based practices are of particular importance when communicating with LEP families. They promote effective communication, empower families to share concerns in a structured manner, and allow medical providers to address matters in real-time with interpreters present.
Finally, our participants highlighted the need for partnerships between providers and interpreter services, noting unclear roles and expectations among interpreters and medical providers as a major barrier. Specifically, physicians noted confusion regarding the scope of an interpreter’s practice. Participants from GLA sessions discussed the importance of a team-based approach and suggested implementing a “presession” prior to encounters with LEP patients and families. Presessions—a concept well accepted among interpreters and recommended by consensus-based practice guidelines—enable medical providers and interpreters to establish shared expectations about scope of practice, communication, interpretation style, time constraints, and medical context prior to patient encounters.32,33
There are several limitations to our study. First, individuals who chose to participate were likely highly motivated by their clinical experiences with LEP patients and invested in improving communication with LEP families. Second, the study is limited in generalizability, as it was conducted at a single academic institution in a Midwestern city. Despite regional variations in available resources as well as patient and workforce demographics, our findings regarding major themes are in agreement with previously published literature and further add to our understanding of ways to improve communication with this vulnerable population across the care spectrum. Lastly, we were logistically limited in our ability to elicit the perspectives of LEP families due to the participatory nature of GLA; the need for multiple interpreters to simultaneously interact with LEP individuals would have not only hindered active LEP family participation but may have also biased the data generated by patients and families, as the services interpreters provide during their inpatient stay was the focus of our study. Engaging LEP families in their preferred language using participatory methods should be considered for future studies.
In conclusion, frontline providers of medical and language services identified barriers and drivers impacting the effective use of interpreter services when communicating with LEP families during hospitalization. Our enhanced understanding of barriers and drivers, as well as identified actionable interventions, will inform future improvement of communication and interactions with LEP families that contributes to effective and efficient family centered care. A framework for the development and implementation of organizational strategies aimed at improving communication with LEP families must include a thorough assessment of impact, feasibility, stakeholder involvement, and sustainability of specific interventions. While there is no simple formula to improve language services, health systems should establish and adopt language access policies, standardize communication practices, and develop processes to optimize the use of language services in the hospital. Furthermore, engagement with LEP families to better understand their perceptions and experiences with the healthcare system is crucial to improve communication between medical providers and LEP families in the inpatient setting and should be the subject of future studies.
Disclosures
The authors have no conflicts of interest to disclose.
Funding
No external funding was secured for this study. Dr. Joanna Thomson is supported by the Agency for Healthcare Research and Quality (Grant #K08 HS025138). Dr. Raglin Bignall was supported through a Ruth L. Kirschstein National Research Service Award (T32HP10027) when the study was conducted. The content is solely the responsibility of the authors and does not necessarily represent the official views of the funding organizations. The funding organizations had no role in the design, preparation, review, or approval of this paper.
Immigrant children make up the fastest growing segment of the population in the United States.1 While most immigrant children are fluent in English, approximately 40% live with a parent who has limited English proficiency (LEP; ie, speaks English less than “very well”).2,3 In pediatrics, LEP status has been associated with longer hospitalizations,4 higher hospitalization costs,5 increased risk for serious adverse medical events,4,6 and more frequent emergency department reutilization.7 In the inpatient setting, multiple aspects of care present a variety of communication challenges,8 which are amplified by shift work and workflow complexity that result in patients and families interacting with numerous providers over the course of an inpatient stay.
Increasing access to trained professional interpreters when caring for LEP patients improves communication, patient satisfaction, adherence, and mortality.9-12 However, even when access to interpreter services is established, effective use is not guaranteed.13 Up to 57% of pediatricians report relying on family members to communicate with LEP patients and their caregivers;9 23% of pediatric residents categorized LEP encounters as frustrating while 78% perceived care of LEP patients to be “misdirected” (eg, delay in diagnosis or discharge) because of associated language barriers.14
Understanding experiences of frontline inpatient medical providers and interpreters is crucial in identifying challenges and ways to optimize communication for hospitalized LEP patients and families. However, there is a paucity of literature exploring the perspectives of medical providers and interpreters as it relates to communication with hospitalized LEP children and families. In this study, we sought to identify barriers and drivers of effective communication with pediatric patients and families with LEP in the inpatient setting from the perspective of frontline medical providers and interpreters.
METHODS
Study Design
This qualitative study used Group Level Assessment (GLA), a structured participatory methodology that allows diverse groups of stakeholders to generate and evaluate data in interactive sessions.15-18 GLA structure promotes active participation, group problem-solving, and development of actionable plans, distinguishing it from focus groups and in-depth semistructured interviews.15,19 This study received a human subject research exemption by the institutional review board.
Study Setting
Cincinnati Children’s Hospital Medical Center (CCHMC) is a large quaternary care center with ~200 patient encounters each day who require the use of interpreter services. Interpreters (in-person, video, and phone) are utilized during admission, formal family-centered rounds, hospital discharge, and other encounters with physicians, nurses, and other healthcare professionals. In-person interpreters are available in-house for Spanish and Arabic, with 18 additional languages available through regional vendors. Despite available resources, there is no standard way in which medical providers and interpreters work with one another.
Study Participants and Recruitment
Medical providers who care for hospitalized general pediatric patients were eligible to participate, including attending physicians, resident physicians, bedside nurses, and inpatient ancillary staff (eg, respiratory therapists, physical therapists). Interpreters employed by CCHMC with experience in the inpatient setting were also eligible. Individuals were recruited based on published recommendations to optimize discussion and group-thinking.15 Each participant was asked to take part in one GLA session. Participants were assigned to specific sessions based on roles (ie, physicians, nurses, and interpreters) to maximize engagement and minimize the impact of hierarchy.
Study Procedure
GLA involves a seven-step structured process (Appendix 1): climate setting, generating, appreciating, reflecting, understanding, selecting, and action.15,18 Qualitative data were generated individually and anonymously by participants on flip charts in response to prompts such as: “I worry that LEP families___,” “The biggest challenge when using interpreter services is___,” and “I find___ works well in providing care for LEP families.” Prompts were developed by study investigators, modified based on input from nursing and interpreter services leadership, and finalized by GLA facilitators. Fifty-one unique prompts were utilized (Appendix 2); the number of prompts used (ranging from 15 to 32 prompts) per session was based on published recommendations.15 During sessions, study investigators took detailed notes, including verbatim transcription of participant quotes. Upon conclusion of the session, each participant completed a demographic survey, including years of experience, languages spoken and perceived fluency,20 and ethnicity.
Data Analysis
Within each session, under the guidance of trained and experienced GLA facilitators (WB, HV), participants distilled and summarized qualitative data into themes, discussed and prioritized themes, and generated action items. Following completion of all sessions, analyzed data was compiled by the research team to determine similarities and differences across groups based on participant roles, consolidate themes into barriers and drivers of communication with LEP families, and determine any overlap of priorities for action. Findings were shared back with each group to ensure accuracy and relevance.
RESULTS
Participants
A total of 64 individuals participated (Table 1): hospital medicine physicians and residents (56%), inpatient nurses and ancillary staff (16%), and interpreters (28%). While 81% of physicians spoke multiple languages, only 25% reported speaking them well; two physicians were certified to communicate medical information without an interpreter present.
Themes Resulting from GLA Sessions
A total of four barriers (Table 2) and four drivers (Table 3) of effective communication with pediatric LEP patients and their families in the inpatient setting were identified by participants. Participants across all groups, despite enthusiasm for improving communication, were concerned about the quality of care LEP families received, noting that the system is “designed to deliver less-good care” and that “we really haven’t figured out how to care for [LEP patients and families] in a [high-]quality and reliable way.” Variation in theme discussion was noted between groups based on participant role: physicians voiced concern about rapport with LEP families, nurses emphasized actionable tasks, and interpreters focused on heightened challenges in times of stress.
Barrier 1: Difficulties Accessing Interpreter Services
Medical providers (physicians and nurses) identified the “opaque process to access [interpreter] services” as one of their biggest challenges when communicating with LEP families. In particular, the process of scheduling interpreters was described as a “black box,” with physicians and nurses expressing difficulty determining if and when in-person interpreters were scheduled and uncertainty about when to use modalities other than in-person interpretation. Participants across groups highlighted the lack of systems knowledge from medical providers and limitations within the system that make predictable, timely, and reliable access to interpreters challenging, especially for uncommon languages. Medical providers desired more in-person interpreters who can “stay as long as clinically indicated,” citing frustration associated with using phone- and video-interpretation (eg, challenges locating technology, unfamiliarity with use, unreliable functionality of equipment). Interpreters voiced wanting to take time to finish each encounter fully without “being in a hurry because the next appointment is coming soon” or “rushing… in [to the next] session sweating.”
Barrier 2: Uncertainty in Communication with LEP Families
Participants across all groups described three areas of uncertainty as detailed in Table 2: (1) what to share and how to prioritize information during encounters with LEP patients and families, (2) what is communicated during interpretation, and (3) what LEP patients and families understand.
Barrier 3: Unclear and Inconsistent Expectations and Roles of Team Members
Given the complexity involved in communication between medical providers, interpreters, and families, participants across all groups reported feeling ill-prepared when navigating hospital encounters with LEP patients and families. Interpreters reported having little to no clinical context, medical providers reported having no knowledge of the assigned interpreter’s style, and both interpreters and medical providers reported that families have little idea of what to expect or how to engage. All groups voiced frustration about the lack of clarity regarding specific roles and scope of practice for each team member during an encounter, where multiple people end up “talking [or] using the interpreter at once.” Interpreters shared their expectations of medical providers to set the pace and lead conversations with LEP families. On the other hand, medical providers expressed a desire for interpreters to provide cultural context to the team without prompting and to interrupt during encounters when necessary to voice concerns or redirect conversations.
Barrier 4: Unmet Family Engagement Expectations
Participants across all groups articulated challenges with establishing rapport with LEP patients and families, sharing concerns that “inadequate communication” due to “cultural or language barriers” ultimately impacts quality of care. Participants reported decreased bidirectional engagement with and from LEP families. Medical providers not only noted difficulty in connecting with LEP families “on a more personal level” and providing frequent medical updates, but also felt that LEP families do not ask questions even when uncertain. Interpreters expressed concerns about medical providers “not [having] enough patience to answer families’ questions” while LEP families “shy away from asking questions.”
Driver 1: Utilizing a Team-Based Approach between Medical Providers and Interpreters
Participants from all groups emphasized that a mutual understanding of roles and shared expectations regarding communication and interpretation style, clinical context, and time constraints would establish a foundation for respect between medical providers and interpreters. They reported that a team-based approach to LEP patient and family encounters was crucial to achieving effective communication.
Driver 2: Understanding the Role of Cultural Context in Providing Culturally Effective Care
Participants across all groups highlighted three different aspects of cultural context that drive effective communication: (1) medical providers’ perception of the family’s culture; (2) LEP families’ knowledge about the culture and healthcare system in the US; and (3) medical providers’ insight into their own preconceived ideas about LEP families.
Driver 3: Practicing Empathy for Patients and Families
All participants reported that respect for diversity and consideration of the backgrounds and perspectives of LEP patients and families are necessary. Furthermore, both medical providers and interpreters articulated a need to remain patient and mindful when interacting with LEP families despite challenges, especially since, as noted by interpreters, encounters may “take longer, but it’s for a reason.”
Driver 4: Using Effective Family-Centered Communication Strategies
Participants identified the use of effective family-centered communication principles as a driver of optimal communication. Many of the principles identified by medical providers and interpreters are generally applicable to all hospitalized patients and families regardless of English proficiency: optimizing verbal communication (eg, using shorter sentences, pausing to allow for interpretation), optimizing nonverbal communication (eg, setting, position, and body language), and assessing family understanding and engagement (eg, use of teach-back).
DISCUSSION
Frontline medical providers and interpreters identified barriers and drivers that impact communication with LEP patients and families during hospitalization. To our knowledge, this is the first study that uses a participatory method to explore the perspectives of medical providers and interpreters who care for LEP children and families in the inpatient setting. Despite existing difficulties and concerns regarding language barriers and their impact on quality of care for hospitalized LEP patients and families, participants were enthusiastic about how identified barriers and drivers may inform future improvement efforts. Notable action steps for future improvement discussed by our participants included: increased use and functionality of technology for timely and predictable access to interpreters, deliberate training for providers focused on delivery of culturally effective care, consistent use of family-centered communication strategies including teach-back, and implementation of interdisciplinary expectation setting through “presessions” before encounters with LEP families.
Participants elaborated on several barriers previously described in the literature, including time constraints and technical problems.14,21,22 Such barriers may serve as deterrents to consistent and appropriate use of interpreters in healthcare settings.9 A heavy reliance on off-site interpreters (including phone- or video-interpreters) and lack of knowledge regarding resource availability likely amplified frustration for medical providers. Communication with LEP families can be daunting, especially when medical providers do not care for LEP families or work with interpreters on a regular basis.14 Standardizing the education of medical providers regarding available resources, as well as the logistics, process, and parameters for scheduling interpreters and using technology, was an action step identified by our GLA participants. Targeted education about the logistics of accessing interpreter services and standardized ways to make technology use easier (eg, one-touch dialing in hospital rooms) have been associated with increased interpreter use and decreased interpreter-related delays in care.23
Our frontline medical providers expressed added concern about not spending as much time with LEP families. In fact, LEP families in the literature have perceived medical providers to spend less time with their children compared to their English-proficient counterparts.24 Language and cultural barriers, both perceived and real, may limit medical provider rapport with LEP patients and families14 and likely contribute to medical providers relying on their preconceived assumptions instead.25 Cultural competency education for medical providers, as highlighted by our GLA participants as an action item, can be used to provide more comprehensive and effective care.26,27
In addition to enhancing cultural humility through education, our participants emphasized the use of family-centered communication strategies as a driver of optimal family engagement and understanding. Actively inviting questions from families and utilizing teach-back, an established evidence-based strategy28-30 discussed by our participants, can be particularly powerful in assessing family understanding and engagement. While information should be presented in plain language for families in all encounters,31 these evidence-based practices are of particular importance when communicating with LEP families. They promote effective communication, empower families to share concerns in a structured manner, and allow medical providers to address matters in real-time with interpreters present.
Finally, our participants highlighted the need for partnerships between providers and interpreter services, noting unclear roles and expectations among interpreters and medical providers as a major barrier. Specifically, physicians noted confusion regarding the scope of an interpreter’s practice. Participants from GLA sessions discussed the importance of a team-based approach and suggested implementing a “presession” prior to encounters with LEP patients and families. Presessions—a concept well accepted among interpreters and recommended by consensus-based practice guidelines—enable medical providers and interpreters to establish shared expectations about scope of practice, communication, interpretation style, time constraints, and medical context prior to patient encounters.32,33
There are several limitations to our study. First, individuals who chose to participate were likely highly motivated by their clinical experiences with LEP patients and invested in improving communication with LEP families. Second, the study is limited in generalizability, as it was conducted at a single academic institution in a Midwestern city. Despite regional variations in available resources as well as patient and workforce demographics, our findings regarding major themes are in agreement with previously published literature and further add to our understanding of ways to improve communication with this vulnerable population across the care spectrum. Lastly, we were logistically limited in our ability to elicit the perspectives of LEP families due to the participatory nature of GLA; the need for multiple interpreters to simultaneously interact with LEP individuals would have not only hindered active LEP family participation but may have also biased the data generated by patients and families, as the services interpreters provide during their inpatient stay were the focus of our study. Engaging LEP families in their preferred language using participatory methods should be considered for future studies.
In conclusion, frontline providers of medical and language services identified barriers and drivers impacting the effective use of interpreter services when communicating with LEP families during hospitalization. Our enhanced understanding of these barriers and drivers, as well as the actionable interventions identified, will inform future efforts to improve communication and interactions with LEP families, contributing to effective and efficient family-centered care. A framework for the development and implementation of organizational strategies aimed at improving communication with LEP families must include a thorough assessment of impact, feasibility, stakeholder involvement, and sustainability of specific interventions. While there is no simple formula to improve language services, health systems should establish and adopt language access policies, standardize communication practices, and develop processes to optimize the use of language services in the hospital. Furthermore, engagement with LEP families to better understand their perceptions of and experiences with the healthcare system is crucial to improving communication between medical providers and LEP families in the inpatient setting and should be the subject of future studies.
Disclosures
The authors have no conflicts of interest to disclose.
Funding
No external funding was secured for this study. Dr. Joanna Thomson is supported by the Agency for Healthcare Research and Quality (Grant #K08 HS025138). Dr. Raglin Bignall was supported through a Ruth L. Kirschstein National Research Service Award (T32HP10027) when the study was conducted. The content is solely the responsibility of the authors and does not necessarily represent the official views of the funding organizations. The funding organizations had no role in the design, preparation, review, or approval of this paper.
1. The American Academy of Pediatrics Council on Community Pediatrics. Providing care for immigrant, migrant, and border children. Pediatrics. 2013;131(6):e2028-e2034.
2. Meneses C, Chilton L, Duffee J, et al. Council on Community Pediatrics Immigrant Health Tool Kit. The American Academy of Pediatrics. https://www.aap.org/en-us/Documents/cocp_toolkit_full.pdf. Accessed May 13, 2019.
3. Office for Civil Rights. Guidance to Federal Financial Assistance Recipients Regarding Title VI and the Prohibition Against National Origin Discrimination Affecting Limited English Proficient Persons. https://www.hhs.gov/civil-rights/for-individuals/special-topics/limited-english-proficiency/guidance-federal-financial-assistance-recipients-title-vi/index.html. Accessed May 13, 2019.
4. Lion KC, Rafton SA, Shafii J, et al. Association between language, serious adverse events, and length of stay among hospitalized children. Hosp Pediatr. 2013;3(3):219-225. https://doi.org/10.1542/hpeds.2012-0091.
5. Lion KC, Wright DR, Desai AD, Mangione-Smith R. Costs of care for hospitalized children associated with preferred language and insurance type. Hosp Pediatr. 2017;7(2):70-78. https://doi.org/10.1542/hpeds.2016-0051.
6. Cohen AL, Rivara F, Marcuse EK, McPhillips H, Davis R. Are language barriers associated with serious medical events in hospitalized pediatric patients? Pediatrics. 2005;116(3):575-579. https://doi.org/10.1542/peds.2005-0521.
7. Samuels-Kalow ME, Stack AM, Amico K, Porter SC. Parental language and return visits to the emergency department after discharge. Pediatr Emerg Care. 2017;33(6):402-404. https://doi.org/10.1097/PEC.0000000000000592.
8. Unaka NI, Statile AM, Choe A, Shonna Yin H. Addressing health literacy in the inpatient setting. Curr Treat Options Pediatr. 2018;4(2):283-299. https://doi.org/10.1007/s40746-018-0122-3.
9. DeCamp LR, Kuo DZ, Flores G, O’Connor K, Minkovitz CS. Changes in language services use by US pediatricians. Pediatrics. 2013;132(2):e396-e406. https://doi.org/10.1542/peds.2012-2909.
10. Flores G. The impact of medical interpreter services on the quality of health care: A systematic review. Med Care Res Rev. 2005;62(3):255-299. https://doi.org/10.1177/1077558705275416.
11. Flores G, Abreu M, Barone CP, Bachur R, Lin H. Errors of medical interpretation and their potential clinical consequences: A comparison of professional versus ad hoc versus no interpreters. Ann Emerg Med. 2012;60(5):545-553. https://doi.org/10.1016/j.annemergmed.2012.01.025.
12. Anand KJ, Sepanski RJ, Giles K, Shah SH, Juarez PD. Pediatric intensive care unit mortality among Latino children before and after a multilevel health care delivery intervention. JAMA Pediatr. 2015;169(4):383-390. https://doi.org/10.1001/jamapediatrics.2014.3789.
13. The Joint Commission. Advancing Effective Communication, Cultural Competence, and Patient- and Family-Centered Care: A Roadmap for Hospitals. Oakbrook Terrace, IL: The Joint Commission; 2010.
14. Hernandez RG, Cowden JD, Moon M, et al. Predictors of resident satisfaction in caring for limited English proficient families: a multisite study. Acad Pediatr. 2014;14(2):173-180. https://doi.org/10.1016/j.acap.2013.12.002.
15. Vaughn LM, Lohmueller M. Calling all stakeholders: group-level assessment (GLA)-a qualitative and participatory method for large groups. Eval Rev. 2014;38(4):336-355. https://doi.org/10.1177/0193841X14544903.
16. Vaughn LM, Jacquez F, Zhao J, Lang M. Partnering with students to explore the health needs of an ethnically diverse, low-resource school: an innovative large group assessment approach. Fam Commun Health. 2011;34(1):72-84. https://doi.org/10.1097/FCH.0b013e3181fded12.
17. Gosdin CH, Vaughn L. Perceptions of physician bedside handoff with nurse and family involvement. Hosp Pediatr. 2012;2(1):34-38. https://doi.org/10.1542/hpeds.2011-0008-2.
18. Graham KE, Schellinger AR, Vaughn LM. Developing strategies for positive change: transitioning foster youth to adulthood. Child Youth Serv Rev. 2015;54:71-79. https://doi.org/10.1016/j.childyouth.2015.04.014.
19. Vaughn LM. Group Level Assessment: A Large Group Method for Identifying Primary Issues and Needs Within a Community. London: SAGE Publications; 2014. http://methods.sagepub.com/case/group-level-assessment-large-group-primary-issues-needs-community. Accessed July 26, 2017.
20. Association of American Medical Colleges Electronic Residency Application Service. ERAS 2018 MyERAS Application Worksheet: Language Fluency. Washington, DC: Association of American Medical Colleges; 2018:5.
21. Brisset C, Leanza Y, Laforest K. Working with interpreters in health care: A systematic review and meta-ethnography of qualitative studies. Patient Educ Couns. 2013;91(2):131-140. https://doi.org/10.1016/j.pec.2012.11.008.
22. Wiking E, Saleh-Stattin N, Johansson SE, Sundquist J. A description of some aspects of the triangular meeting between immigrant patients, their interpreters and GPs in primary health care in Stockholm, Sweden. Fam Pract. 2009;26(5):377-383. https://doi.org/10.1093/fampra/cmp052.
23. Lion KC, Ebel BE, Rafton S, et al. Evaluation of a quality improvement intervention to increase use of telephonic interpretation. Pediatrics. 2015;135(3):e709-e716. https://doi.org/10.1542/peds.2014-2024.
24. Zurca AD, Fisher KR, Flor RJ, et al. Communication with limited English-proficient families in the PICU. Hosp Pediatr. 2017;7(1):9-15. https://doi.org/10.1542/hpeds.2016-0071.
25. Kodjo C. Cultural competence in clinician communication. Pediatr Rev. 2009;30(2):57-64. https://doi.org/10.1542/pir.30-2-57.
26. Britton CV, American Academy of Pediatrics Committee on Pediatric Workforce. Ensuring culturally effective pediatric care: implications for education and health policy. Pediatrics. 2004;114(6):1677-1685. https://doi.org/10.1542/peds.2004-2091.
27. The American Academy of Pediatrics. Culturally Effective Care Toolkit: Providing Culturally Effective Pediatric Care; 2018. https://www.aap.org/en-us/professional-resources/practice-transformation/managing-patients/Pages/effective-care.aspx. Accessed May 13, 2019.
28. Starmer AJ, Spector ND, Srivastava R, et al. Changes in medical errors after implementation of a handoff program. N Engl J Med. 2014;371(19):1803-1812. https://doi.org/10.1056/NEJMsa1405556.
29. Jager AJ, Wynia MK. Who gets a teach-back? Patient-reported incidence of experiencing a teach-back. J Health Commun. 2012;17 Supplement 3:294-302. https://doi.org/10.1080/10810730.2012.712624.
30. Kornburger C, Gibson C, Sadowski S, Maletta K, Klingbeil C. Using “teach-back” to promote a safe transition from hospital to home: an evidence-based approach to improving the discharge process. J Pediatr Nurs. 2013;28(3):282-291. https://doi.org/10.1016/j.pedn.2012.10.007.
31. Abrams MA, Klass P, Dreyer BP. Health literacy and children: recommendations for action. Pediatrics. 2009;124 Supplement 3:S327-S331. https://doi.org/10.1542/peds.2009-1162I.
32. Betancourt JR, Renfrew MR, Green AR, Lopez L, Wasserman M. Improving Patient Safety Systems for Patients with Limited English Proficiency: a Guide for Hospitals. Agency for Healthcare Research and Quality; 2012.
33. The National Council on Interpreting in Health Care. Best Practices for Communicating Through an Interpreter. https://refugeehealthta.org/access-to-care/language-access/best-practices-communicating-through-an-interpreter/. Accessed May 19, 2019.
© 2019 Society of Hospital Medicine
The Role of Adolescent Acne Treatment in Formation of Scars Among Patients With Persistent Adult Acne: Evidence From an Observational Study
In the last 20 years, the incidence of acne lesions in adults has markedly increased.1 Acne affects adults (individuals older than 25 years) and is no longer a condition limited to adolescents and young adults (individuals younger than 25 years). According to Dreno et al,2 the accepted age threshold for the onset of adult acne is 25 years.1-3 In 2013, the term adult acne was defined.2 Among patients with adult acne, there are 2 subtypes: (1) persistent adult acne, which is a continuation or recurrence of adolescent acne, affecting approximately 80% of patients, and (2) late-onset acne, affecting approximately 20% of patients.4
Clinical symptoms of adult acne and available treatment modalities have been explored in the literature. Daily clinical experience shows that additional difficulties involved in the management of adult acne patients are related mainly to a high therapeutic failure rate in acne patients older than 25 years.5 Persistent adult acne seems to be noteworthy because it causes long-term symptoms, and patients experience uncontrollable recurrences.
It is believed that adult acne often is resistant to treatment.2 Adult skin is more sensitive to topical agents, leading to more irritation by medications intended for external use and cosmetics.6 Scars in these patients are a frequent and undesirable consequence.3
Effective treatment of acne encompasses oral antibiotics, topical and systemic retinoids, and oral contraceptive pills (OCPs). For years, oral subantimicrobial doses of cyclines have been recommended for acne treatment. Topical and oral retinoids have been successfully used for more than 30 years as important therapeutic options.7 More recent evidence-based guidelines for acne issued by the American Academy of Dermatology8 and the European Dermatology Forum9 also show that retinoids play an important role in acne therapy. Their anti-inflammatory activity acts against comedones and their precursors (microcomedones). Successful antiacne therapy not only achieves a smooth face without comedones but also minimizes scar formation, postinflammatory discoloration, and long-lasting postinflammatory erythema.10 Oral contraceptives have a mainly antiseborrheic effect.11
Our study sought to analyze the potential influence of therapy during adolescent acne on patients who later developed adult acne. Particular attention was given to the use of oral antibiotics, isotretinoin, and topical retinoids for adolescent acne and their potential role in diminishing scar formation in adult acne.
Materials and Methods
Patient Demographics and Selection
A population-based study of Polish patients with adult acne was conducted. Patients were included in the study group on a consecutive basis from among those who visited our outpatient dermatology center from May 2015 to January 2016. A total of 111 patients (101 women [90.99%] and 10 men [9.01%]) were examined. The study group comprised patients aged 25 years and older who were treated for adult acne (20 patients [18.02%] were aged 25–29 years, 61 [54.95%] were aged 30–39 years, and 30 [27.02%] were 40 years or older).
The following inclusion criteria were used: observation period of at least 6 months in our dermatologic center for patients diagnosed with adult acne, at least 2 dermatologic visits for adult acne prior to the study, written informed consent for study participation and data processing (the aim of the study was explained to each participant by a dermatologist), and age 25 years or older. Exclusion criteria included those who were younger than 25 years, those who had only 1 dermatologic visit at our dermatology center, and those who were unwilling to participate or did not provide written informed consent. Our study was conducted according to Good Clinical Practice.
Data Collection
To obtain data with the highest degree of reliability, 3 sources of information were used: (1) a detailed medical interview conducted by one experienced dermatologist (E.C.) at our dermatology center at the first visit for all study participants, (2) a clinical examination that yielded results necessary for the assessment of scars using a method outlined by Jacob et al,12 and (3) information included in available medical records. These data were then statistically analyzed.
Statistical Analysis
The results were presented as frequency plots, and a Fisher exact test was conducted to obtain a statistical comparison of the distributions of analyzed data. Unless otherwise indicated, 5% was adopted as the significance level. The statistical analysis was performed using Stata 14 software (StataCorp LLC, College Station, Texas).
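For readers less familiar with the test, the sketch below illustrates how a Fisher exact test compares proportions in a 2 × 2 contingency table of the kind reported in the Results. It is a minimal example in Python (using scipy); the counts, group labels, and library choice are assumptions made purely for demonstration, not the study's data, and the study's actual analysis was performed in Stata 14.

```python
# Illustrative sketch only: a Fisher exact test on a hypothetical 2 x 2 table.
# The counts below are invented for demonstration and are NOT the study's data;
# the authors performed their analysis in Stata 14.
from scipy.stats import fisher_exact

# Rows: adolescent treatment group (hypothetical: untreated vs antibiotics only)
# Columns: scarring after 25 years of age (scarred vs not scarred)
table = [
    [12, 8],   # hypothetical untreated group
    [16, 9],   # hypothetical antibiotics-only group
]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"Odds ratio: {odds_ratio:.2f}, P = {p_value:.3f}")

# With the 5% significance level adopted in the study, P < .05 would indicate
# a statistically significant difference in proportions between the two groups.
```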
Results
Incidence of Different Forms of Adult Acne
To analyze the onset of acne, patients were categorized into 1 of 2 groups: those with persistent adult acne (81.98%) and those with late-onset adult acne (ie, acne that developed after 25 years of age) (18.02%).
Age at Initiation of Dermatologic Treatment
Of the patients with persistent adult acne, 31.87% first visited a dermatologist the same year that the first acne lesions appeared, 36.26% postponed the first visit by at least 5 years (Figure 1), and 23.08% started treatment at least 10 years after acne first appeared. Among patients with persistent adult acne, 76.92% began dermatologic treatment before 25 years of age, and 23.08% began treatment after 25 years of age. Of the latter, 28.57% did not start therapy until they were older than 35 years.
Severity of Adolescent Acne
In the persistent adult acne group, the severity of adolescent acne was assessed during the medical interview as well as from detailed histories in medical records. The activity of acne was evaluated at 2-year intervals with the use of a 10-point scale: 1 to 3 points indicated mild acne (7.69% of patients), 4 to 6 points indicated moderate acne (24.18%), and 7 to 10 points indicated severe acne (68.13%).
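As a purely illustrative aid (not part of the study's materials), the short sketch below shows the mapping from the 10-point activity score to the three severity categories described above; the function name and error handling are hypothetical.

```python
# Minimal sketch of the severity categorization described above:
# a 10-point acne activity score is binned into mild (1-3), moderate (4-6),
# and severe (7-10). The function itself is hypothetical, for illustration only.
def categorize_acne_severity(score: int) -> str:
    if not 1 <= score <= 10:
        raise ValueError("Score must be on the 10-point scale (1-10).")
    if score <= 3:
        return "mild"
    if score <= 6:
        return "moderate"
    return "severe"

# Example: a score recorded at one of the 2-year assessment intervals
print(categorize_acne_severity(8))  # -> severe
```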
Treatment of Persistent Acne in Adolescence
Treatment comprised oral therapy with antibiotics or isotretinoin and/or application of topical retinoids (sometimes supported with OCPs). Monotherapy was the standard of treatment more than 25 years ago, when patients with persistent adult acne were treated as adolescents or young adults. As many as 43.96% of patients with persistent adult acne did not receive any of these therapies before 25 years of age; rather, they used antiacne cosmetics or beauty procedures. Furthermore, 50.55% of patients were treated with oral antibiotics (Figure 2). Topical retinoids were used in 19.78% of patients, and isotretinoin was used in 16.48%. Incidentally, OCPs were given to 26.5% of patients. In the course of adolescent acne, 31.87% of patients received 2 to 4 courses of treatment with either antibiotics or retinoids (oral or topical), and 5.49% were treated with 5 or more courses (Figure 3). The analysis of each treatment revealed that only 1 patient received 4 courses of isotretinoin. Five courses of oral antibiotics were given to 1 patient, and 3 courses of topical retinoids were given to the same patient.
Topical Retinoids
In an analysis of the number of treatments with topical retinoids completed by patients with persistent adult acne, it was established that 80.22% of patients never used topical retinoids for acne during adolescence. Additionally, 12.08% of these patients completed 1 course of treatment, and 7.69% completed 2 to 4 treatments. However, after 25 years of age, only 25.27% of the patients with persistent adult acne were not treated with topical retinoids, and 35.16% completed more than 2 courses of treatment.
Duration of Treatment
Because adult acne is a chronic disease, the mean number of years that patients received treatment over the disease course was analyzed. In the case of persistent adult acne, the mean duration of treatment, including therapy received during adolescence, was more than 13 years. At the time of the study, more than 30% of patients had been undergoing treatment of adult acne for more than 20 years.
Scars
The proportion of patients with persistent adult acne who experienced scarring was evaluated. In the persistent adult acne group, scars were identified in 53.85% of patients. Scars appeared only during adolescence in 26.37% of patients with persistent adult acne, scars appeared only after 25 years of age in 21.97% of patients, and scars appeared in adolescence as well as adulthood in 30.77% of patients.
In an analysis of patients with persistent adult acne who experienced scarring after 25 years of age, the proportion of patients with untreated adolescent acne and those who were treated with antibiotics only was not significantly different (60% vs 64%; P = .478) (Table). The inclusion of retinoids in adolescent treatment was associated with a lower proportion of scars (isotretinoin: 20%, P = .009; topical retinoids: 38.89%, P = .114).
Comment
Persistent Adult Acne
Patients with symptoms of persistent adult acne represented 81.98% of the study population, which was similar to a 1999 study by Goulden et al,1 a 2001 study by Shaw and White,13 and a 2009 report by Schmidt et al.14 Of these patients with persistent adult acne, 23.08% initiated therapy after 25 years of age, and 23.08% started treatment at least 10 years after acne lesions first appeared. However, it is noteworthy that 68.13% of all patients with persistent adult acne assessed their disease as severe.
Treatment Modalities for Adult Acne
Over the last 5 years, some researchers have attempted to make recommendations for the treatment of adult acne based on standards adopted for the treatment of adolescent acne.2,9,15 First-line treatment of adult comedonal acne is a topical retinoid.9 The recommended treatment of mild to moderate adult inflammatory acne involves topical drugs, including retinoids, azelaic acid, or benzoyl peroxide, or oral medications, including antibiotics, OCPs, or antiandrogens. In severe inflammatory acne, the recommended treatment involves oral isotretinoin or combined therapies; the latter seems to be the most effective.16 Furthermore, therapy is adjusted to the patient’s current clinical condition; the individual’s general skin sensitivity to irritation and the risk for irritant activity of topical medications; and life situation, such as planned pregnancies and intended use of OCPs, due to the risk for teratogenic effects of drugs.17
To assess available treatment modalities, oral therapy with antibiotics or isotretinoin as well as topical retinoids were selected for our analysis. It is difficult to determine an exclusive impact of OCPs as acne treatment; according to our study, many female patients use hormone therapy for other medical conditions or contraception, and only a small proportion of these patients are prescribed hormone treatment for acne. We found that 43.96% of patients with persistent adult acne underwent no treatment with antibiotics, isotretinoin, or topical retinoids in adolescence. Patients who did not receive any of these treatments came only for single visits to a dermatologist, did not comply with recommended therapy, or used only cosmetics or beauty procedures. We found that 80.22% of patients with persistent adult acne never used topical retinoids during adolescence and did not receive maintenance therapy, which may be attributed to the fact that there were no strict recommendations regarding retinoid treatment when these patients were adolescents or young adults. Published data indicate that retinoid use for acne treatment is not common.18 Conversely, among patients older than 25 years with late-onset adult acne, there was only 1 patient (ie, < 1%) who had never received any oral antibiotic or isotretinoin treatment or therapy with topical retinoids. The reason for the lack of medical treatment is unknown. Only 25.27% of patients were not treated with topical retinoids, and 35.16% completed at least 2 courses of treatment.
Acne Scarring
The worst complication of acne is scarring. Scars develop for the duration of the disease, during both adolescent and adult acne. In the group with persistent adult acne, scarring was found in 53.85% of patients. Scar formation has been previously reported as a common complication of acne.19 The effects of skin lesions that remain after acne are not only limited to impaired cosmetic appearance; they also negatively affect mental health and impair quality of life.20 The aim of our study was to analyze types of treatment for adolescent acne in patients who later had persistent adult acne. Postacne scars observed later are objective evidence of the severity of disease. We found that using oral antibiotics did not diminish the number of scars among persistent adult acne patients in adulthood. In contrast, isotretinoin or topical retinoid treatment during adolescence decreased the risk for scars occurring during adulthood. In our opinion, these findings emphasize the role of this type of treatment among adolescents or young adults. The decrease of scar formation in adult acne due to retinoid treatment in adolescence indirectly justifies the role of maintenance therapy with topical retinoids.21,22
- Goulden V, Stables GI, Cunliffe WJ. Prevalence of facial acne in adults. J Am Acad Dermatol. 1999;41:577-580.
- Dreno B, Layton A, Zouboulis CC, et al. Adult female acne: a new paradigm. J Eur Acad Dermatol Venereol. 2013;27:1063-1070.
- Preneau S, Dreno B. Female acne--a different subtype of teenager acne? J Eur Acad Dermatol Venereol. 2012;26:277-282.
- Goulden V, Clark SM, Cunliffe WJ. Post-adolescent acne: a review of clinical features. Br J Dermatol. 1997;136:66-70.
- Kamangar F, Shinkai K. Acne in the adult female patient: a practical approach. Int J Dermatol. 2012;51:1162-1174.
- Choi CW, Lee DH, Kim HS, et al. The clinical features of late onset acne compared with early onset acne in women. J Eur Acad Dermatol Venereol. 2011;25:454-461.
- Kligman AM, Fulton JE Jr, Plewig G. Topical vitamin A acid in acne vulgaris. Arch Dermatol. 1969;99:469-476.
- Zaenglein AL, Pathy AL, Schlosser BJ, et al. Guidelines of care for the management of acne vulgaris. J Am Acad Dermatol. 2016;74:945.e33-973.e33.
- Nast A, Dreno B, Bettoli V, et al. European evidence-based guidelines for the treatment of acne. J Eur Acad Dermatol Venereol. 2012;26(suppl 1):1-29.
- Levin J. The relationship of proper skin cleansing to pathophysiology, clinical benefits, and the concomitant use of prescription topical therapies in patients with acne vulgaris. Dermatol Clin. 2016;34:133-145.
- Savage LJ, Layton AM. Treating acne vulgaris: systemic, local and combination therapy. Expert Rev Clin Pharmacol. 2010;3:563-580.
- Jacob CL, Dover JS, Kaminer MS. Acne scarring: a classification system and review of treatment options. J Am Acad Dermatol. 2001;45:109-117.
- Shaw JC, White LE. Persistent acne in adult women. Arch Dermatol. 2001;137:1252-1253.
- Schmidt JV, Masuda PY, Miot HA. Acne in women: clinical patterns in different age groups. An Bras Dermatol. 2009;84:349-354.
- Thiboutot D, Gollnick H, Bettoli V, et al. New insights into the management of acne: an update from the Global Alliance to Improve Outcomes in Acne group. J Am Acad Dermatol. 2009;60(5 suppl):1-50.
- Williams C, Layton AM. Persistent acne in women: implications for the patient and for therapy. Am J Clin Dermatol. 2006;7:281-290.
- Holzmann R, Shakery K. Postadolescent acne in females. Skin Pharmacol Physiol. 2014;27(suppl 1):3-8.
- Pena S, Hill D, Feldman SR. Use of topical retinoids by dermatologist and non-dermatologist in the management of acne vulgaris. J Am Acad Dermatol. 2016;74:1252-1254.
- Layton AM, Henderson CA, Cunliffe WJ. A clinical evaluation of acne scarring and its incidence. Clin Exp Dermatol. 1994;19:303-308.
- Halvorsen JA, Stern RS, Dalgard F, et al. Suicidal ideation, mental health problems, and social impairment are increased in adolescents with acne: a population-based study. J Invest Dermatol. 2011;131:363-370.
- Thielitz A, Sidou F, Gollnick H. Control of microcomedone formation throughout a maintenance treatment with adapalene gel, 0.1%. J Eur Acad Dermatol Venereol. 2007;21:747-753.
- Leyden J, Thiboutot DM, Shalita R, et al. Comparison of tazarotene and minocycline maintenance therapies in acne vulgaris: a multicenter, double-blind, randomized, parallel-group study. Arch Dermatol. 2006;142:605-612.
In the last 20 years, the incidence of acne lesions in adults has markedly increased. 1 Acne affects adults (individuals older than 25 years) and is no longer a condition limited to adolescents and young adults (individuals younger than 25 years). According to Dreno et al, 2 the accepted age threshold for the onset of adult acne is 25 years. 1-3 In 2013, the term adult acne was defined. 2 Among patients with adult acne, there are 2 subtypes: (1) persistent adult acne, which is a continuation or recurrence of adolescent acne, affecting approximately 80% of patients, and (2) late-onset acne, affecting approximately 20% of patients. 4
Clinical symptoms of adult acne and available treatment modalities have been explored in the literature. Daily clinical experience shows that additional difficulties involved in the management of adult acne patients are related mainly to a high therapeutic failure rate in acne patients older than 25 years. 5 Persistent adult acne seems to be noteworthy because it causes long-term symptoms, and patients experience uncontrollable recurrences.
It is believed that adult acne often is resistant to treatment. 2 Adult skin is more sensitive to topical agents, leading to more irritation by medications intended for external use and cosmetics. 6 Scars in these patients are a frequent and undesirable consequence. 3
Effective treatment of acne encompasses oral antibiotics, topical and systemic retinoids, and oral contraceptive pills (OCPs). For years, oral subantimicrobial doses of cyclines have been recommended for acne treatment. Topical and oral retinoids have been successfully used for more than 30 years as important therapeutic options. 7 More recent evidence-based guidelines for acne issued by the American Academy of Dermatology 8 and the European Dermatology Forum 9 also show that retinoids play an important role in acne therapy. Their anti-inflammatory activity acts against comedones and their precursors (microcomedones). Successful antiacne therapy not only achieves a smooth face without comedones but also minimizes scar formation, postinflammatory discoloration, and long-lasting postinflammatory erythema. 10 Oral contraceptives have a mainly antiseborrheic effect. 11
Our study sought to analyze the potential influence of therapy during adolescent acne on patients who later developed adult acne. Particular attention was given to the use of oral antibiotics, isotretinoin, and topical retinoids for adolescent acne and their potential role in diminishing scar formation in adult acne.
Materials and Methods
Patient Demographics and Selection
A population-based study of Polish patients with adult acne was conducted. Patients were included in the study group on a consecutive basis from among those who visited our outpatient dermatology center from May 2015 to January 2016. A total of 111 patients (101 women [90.99%] and 10 men [9.01%]) were examined. The study group comprised patients aged 25 years and older who were treated for adult acne (20 patients [18.02%] were aged 25–29 years, 61 [54.95%] were aged 30–39 years, and 30 [27.02%] were 40 years or older).
The following inclusion criteria were used: observation period of at least 6 months in our dermatologic center for patients diagnosed with adult acne, at least 2 dermatologic visits for adult acne prior to the study, written informed consent for study participation and data processing (the aim of the study was explained to each participant by a dermatologist), and age 25 years or older. Exclusion criteria included those who were younger than 25 years, those who had only 1 dermatologic visit at our dermatology center, and those who were unwilling to participate or did not provide written informed consent. Our study was conducted according to Good Clinical Practice.
Data Collection
To obtain data with the highest degree of reliability, 3 sources of information were used: (1) a detailed medical interview conducted by one experienced dermatologist (E.C.) at our dermatology center at the first visit in all study participants, (2) a clinical examination that yielded results necessary for the assessment of scars using a method outlined by Jacob et al, 12 and (3) information included in available medical records. These data were then statistically analyzed.
Statistical Analysis
The results were presented as frequency plots, and a Fisher exact test was conducted to obtain a statistical comparison of the distributions of analyzed data. Unless otherwise indicated, 5% was adopted as the significance level. The statistical analysis was performed using Stata 14 software (StataCorp LLC, College Station, Texas).
Results
Incidence of Different Forms of Adult Acne
To analyze the onset of acne, patients were categorized into 1 of 2 groups: those with persistent adult acne (81.98%) and those with late-onset adult acne (ie, developed after 25 years of age)(18.02%).
Age at Initiation of Dermatologic Treatment
Of the patients with persistent adult acne, 31.87% first visited a dermatologist the same year that the first acne lesions appeared, 36.26% postponed the first visit by at least 5 years (Figure 1), and 23.08% started treatment at least 10 years after acne first appeared. Among patients with persistent adult acne, 76.92% began dermatologic treatment before 25 years of age, and 23.08% began treatment after 25 years of age. Of the latter, 28.57% did not start therapy until they were older than 35 years.
Severity of Adolescent Acne
In the persistent adult acne group, the severity of adolescent acne was assessed during the medical interview as well as detailed histories in medical records. The activity of acne was evaluated at 2-year intervals with the use of a 10-point scale: 1 to 3 points indicated mild acne (7.69% of patients), 4 to 6 points indicated moderate acne (24.18%), and 7 to 10 points indicated severe acne (68.13%).
Treatment of Persistent Acne in Adolescence
Treatment was comprised of oral therapy with antibiotics, isotretinoin, and/or application of topical retinoids (sometimes supported with OCPs). Monotherapy was the standard of treatment more than 25 years ago when patients with persistent adult acne were treated as adolescents or young adults. As many as 43.96% of patients with persistent adult acne did not receive any of these therapies before 25 years of age; rather, they used antiacne cosmetics or beauty procedures. Furthermore, 50.55% of patients were treated with oral antibiotics (Figure 2). Topical retinoids were used in 19.78% of patients and isotretinoin was used in 16.48%. Incidentally, OCPs were given to 26.5%. In the course of adolescent acne, 31.87% of patients received 2 to 4 courses of treatment with either antibiotics or retinoids (oral or topical), and 5.49% were treated with 5 or more courses of treatment (Figure 3). The analysis of each treatment revealed that only 1 patient received 4 courses of isotretinoin. Five courses of oral antibiotics were given in 1 patient, and 3 courses of topical retinoids were given in the same patient.
Topical Retinoids
In an analysis of the number of treatments with topical retinoids completed by patients with persistent adult acne, it was established that 80.22% of patients never used topical retinoids for acne during adolescence. Additionally, 12.08% of these patients completed 1 course of treatment, and 7.69% completed 2 to 4 treatments. However, after 25 years of age, only 25.27% of the patients with persistent adult acne were not treated with topical retinoids, and 35.16% completed more than 2 courses of treatment.
Duration of Treatment
In the last 20 years, the incidence of acne lesions in adults has markedly increased.1 Acne affects adults (individuals older than 25 years) and is no longer a condition limited to adolescents and young adults (individuals younger than 25 years). According to Dreno et al,2 the accepted age threshold for the onset of adult acne is 25 years.1-3 In 2013, the term adult acne was defined.2 Among patients with adult acne, there are 2 subtypes: (1) persistent adult acne, which is a continuation or recurrence of adolescent acne, affecting approximately 80% of patients, and (2) late-onset acne, affecting approximately 20% of patients.4
Clinical symptoms of adult acne and available treatment modalities have been explored in the literature. Daily clinical experience shows that the management of adult acne is complicated mainly by a high rate of therapeutic failure in patients older than 25 years.5 Persistent adult acne is particularly noteworthy because it causes long-term symptoms and patients experience uncontrollable recurrences.
Adult acne often is believed to be resistant to treatment.2 Adult skin is more sensitive to topical agents, leading to greater irritation from medications intended for external use and from cosmetics.6 Scars are a frequent and undesirable consequence in these patients.3
Effective treatment of acne encompasses oral antibiotics, topical and systemic retinoids, and oral contraceptive pills (OCPs). For years, oral subantimicrobial doses of cyclines have been recommended for acne treatment. Topical and oral retinoids have been successfully used for more than 30 years as important therapeutic options.7 More recent evidence-based guidelines for acne issued by the American Academy of Dermatology8 and the European Dermatology Forum9 also show that retinoids play an important role in acne therapy. Their anti-inflammatory activity acts against comedones and their precursors (microcomedones). Successful antiacne therapy not only achieves a smooth face without comedones but also minimizes scar formation, postinflammatory discoloration, and long-lasting postinflammatory erythema.10 Oral contraceptives have a mainly antiseborrheic effect.11
Our study sought to analyze the potential influence of therapy during adolescent acne on patients who later developed adult acne. Particular attention was given to the use of oral antibiotics, isotretinoin, and topical retinoids for adolescent acne and their potential role in diminishing scar formation in adult acne.
Materials and Methods
Patient Demographics and Selection
A population-based study of Polish patients with adult acne was conducted. Patients were included in the study group on a consecutive basis from among those who visited our outpatient dermatology center from May 2015 to January 2016. A total of 111 patients (101 women [90.99%] and 10 men [9.01%]) were examined. The study group comprised patients aged 25 years and older who were treated for adult acne (20 patients [18.02%] were aged 25–29 years, 61 [54.95%] were aged 30–39 years, and 30 [27.02%] were 40 years or older).
The following inclusion criteria were used: an observation period of at least 6 months in our dermatologic center for patients diagnosed with adult acne, at least 2 dermatologic visits for adult acne prior to the study, written informed consent for study participation and data processing (the aim of the study was explained to each participant by a dermatologist), and age 25 years or older. Patients were excluded if they were younger than 25 years, had only 1 dermatologic visit at our dermatology center, were unwilling to participate, or did not provide written informed consent. Our study was conducted according to Good Clinical Practice.
Data Collection
To obtain data with the highest degree of reliability, 3 sources of information were used: (1) a detailed medical interview conducted with all study participants at the first visit by one experienced dermatologist (E.C.) at our dermatology center, (2) a clinical examination that yielded the results necessary for the assessment of scars using a method outlined by Jacob et al,12 and (3) information included in available medical records. These data were then statistically analyzed.
Statistical Analysis
The results were presented as frequency plots, and a Fisher exact test was conducted to obtain a statistical comparison of the distributions of analyzed data. Unless otherwise indicated, 5% was adopted as the significance level. The statistical analysis was performed using Stata 14 software (StataCorp LLC, College Station, Texas).
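As an editorial illustration only, the following minimal sketch shows how this kind of 2-group comparison can be performed with a Fisher exact test in Python (the study itself used Stata 14, and the counts below are placeholders, not study data).
from scipy.stats import fisher_exact

# Hypothetical 2x2 contingency table: rows = 2 patient groups,
# columns = [patients with the outcome, patients without it].
# These counts are illustrative placeholders, not data from this study.
table = [[12, 8],
         [9, 16]]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"Odds ratio = {odds_ratio:.2f}, P = {p_value:.3f}")

# 5% significance level, as adopted in the analysis described above
if p_value < 0.05:
    print("The distributions differ significantly.")
else:
    print("No statistically significant difference detected.")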
Results
Incidence of Different Forms of Adult Acne
To analyze the onset of acne, patients were categorized into 1 of 2 groups: those with persistent adult acne (81.98%) and those with late-onset adult acne (ie, acne that developed after 25 years of age; 18.02%).
Age at Initiation of Dermatologic Treatment
Of the patients with persistent adult acne, 31.87% first visited a dermatologist the same year that the first acne lesions appeared, 36.26% postponed the first visit by at least 5 years (Figure 1), and 23.08% started treatment at least 10 years after acne first appeared. Among patients with persistent adult acne, 76.92% began dermatologic treatment before 25 years of age, and 23.08% began treatment after 25 years of age. Of the latter, 28.57% did not start therapy until they were older than 35 years.
Severity of Adolescent Acne
In the persistent adult acne group, the severity of adolescent acne was assessed during the medical interview as well as from detailed histories in the medical records. The activity of acne was evaluated at 2-year intervals using a 10-point scale: 1 to 3 points indicated mild acne (7.69% of patients), 4 to 6 points indicated moderate acne (24.18%), and 7 to 10 points indicated severe acne (68.13%).
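As an illustration only (not part of the study methods), the 10-point activity scale maps to the severity categories described above as in this minimal Python sketch:
def classify_acne_severity(score: int) -> str:
    # 10-point activity scale: 1-3 = mild, 4-6 = moderate, 7-10 = severe.
    if not 1 <= score <= 10:
        raise ValueError("score must be between 1 and 10")
    if score <= 3:
        return "mild"
    if score <= 6:
        return "moderate"
    return "severe"

print(classify_acne_severity(8))  # "severe", the most common rating (68.13% of patients)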
Treatment of Persistent Acne in Adolescence
Treatment comprised oral therapy with antibiotics, isotretinoin, and/or application of topical retinoids (sometimes supported with OCPs). Monotherapy was the standard of treatment more than 25 years ago, when patients with persistent adult acne were treated as adolescents or young adults. As many as 43.96% of patients with persistent adult acne did not receive any of these therapies before 25 years of age; rather, they used antiacne cosmetics or beauty procedures. Furthermore, 50.55% of patients were treated with oral antibiotics (Figure 2). Topical retinoids were used in 19.78% of patients, and isotretinoin was used in 16.48%. Additionally, OCPs were given to 26.5% of patients. In the course of adolescent acne, 31.87% of patients received 2 to 4 courses of treatment with either antibiotics or retinoids (oral or topical), and 5.49% received 5 or more courses (Figure 3). The analysis of each treatment revealed that only 1 patient received 4 courses of isotretinoin. Five courses of oral antibiotics were given to 1 patient, and 3 courses of topical retinoids were given to the same patient.
Topical Retinoids
In an analysis of the number of treatments with topical retinoids completed by patients with persistent adult acne, it was established that 80.22% of patients never used topical retinoids for acne during adolescence. Additionally, 12.08% of these patients completed 1 course of treatment, and 7.69% completed 2 to 4 treatments. However, after 25 years of age, only 25.27% of the patients with persistent adult acne were not treated with topical retinoids, and 35.16% completed more than 2 courses of treatment.
Duration of Treatment
Because adult acne is a chronic disease, the mean number of years that patients received treatment over the disease course was analyzed. In the case of persistent adult acne, the mean duration of treatment, including therapy received during adolescence, was more than 13 years. At the time of the study, more than 30% of patients had been undergoing treatment of adult acne for more than 20 years.
Scars
The proportion of patients with persistent adult acne who experienced scarring was evaluated. In the persistent adult acne group, scars were identified in 53.85% of patients. Scars appeared only during adolescence in 26.37% of patients, only after 25 years of age in 21.97%, and during both adolescence and adulthood in 30.77%.
In an analysis of patients with persistent adult acne who experienced scarring after 25 years of age, the proportion of patients with untreated adolescent acne and the proportion of those treated with antibiotics only did not differ significantly (60% vs 64%; P = .478) (Table). The inclusion of retinoids in treatment decreased the proportion of patients with scars (isotretinoin: 20%, P = .009; topical retinoids: 38.89%, P = .114).
Comment
Persistent Adult Acne
Patients with symptoms of persistent adult acne represented 81.98% of the study population, which was similar to a 1999 study by Goulden et al,1 a 2001 study by Shaw and White,13 and a 2009 report by Schmidt et al.14 Of these patients with persistent adult acne, 23.08% initiated therapy after 25 years of age, and 23.08% started treatment at least 10 years after acne lesions first appeared. However, it is noteworthy that 68.13% of all patients with persistent adult acne assessed their disease as severe.
Treatment Modalities for Adult Acne
Over the last 5 years, some researchers have attempted to make recommendations for the treatment of adult acne based on standards adopted for the treatment of adolescent acne.2,9,15 First-line treatment of adult comedonal acne is topical retinoids.9 The recommended treatment of mild to moderate adult inflammatory acne involves topical drugs, including retinoids, azelaic acid, or benzoyl peroxide, or oral medications, including antibiotics, OCPs, or antiandrogens. In severe inflammatory acne, the recommended treatment involves oral isotretinoin or combined therapies; the latter seems to be the most effective.16 Furthermore, therapy should be adjusted to the patient's current clinical condition; to the individual sensitivity of the skin to irritation and the irritant potential of topical medications; and to the patient's life situation, such as planned pregnancies and intended use of OCPs, given the risk for teratogenic effects of some drugs.17
To assess available treatment modalities, oral therapy with antibiotics or isotretinoin as well as topical retinoids were selected for our analysis. It is difficult to determine the exclusive impact of OCPs as acne treatment; according to our study, many female patients use hormone therapy for other medical conditions or for contraception, and only a small proportion are prescribed hormone treatment for acne. We found that 43.96% of patients with persistent adult acne received no treatment with antibiotics, isotretinoin, or topical retinoids in adolescence. Patients who did not receive any of these treatments came only for single visits to a dermatologist, did not comply with recommended therapy, or used only cosmetics or beauty procedures. We found that 80.22% of patients with persistent adult acne never used topical retinoids during adolescence and did not receive maintenance therapy, which may be attributed to the fact that there were no strict recommendations regarding retinoid treatment when these patients were adolescents or young adults. Published data indicate that retinoid use for acne treatment is not common.18 Conversely, among patients older than 25 years with late-onset adult acne, only 1 patient (ie, <1%) had never received oral antibiotic or isotretinoin treatment or therapy with topical retinoids; the reason for this lack of medical treatment is unknown. After 25 years of age, only 25.27% of patients were not treated with topical retinoids, and 35.16% completed at least 2 courses of treatment.
Acne Scarring
The worst complication of acne is scarring. Scars can develop throughout the course of the disease, during both adolescent and adult acne. In the group with persistent adult acne, scarring was found in 53.85% of patients. Scar formation has been previously reported as a common complication of acne.19 The effects of skin lesions that remain after acne are not limited to impaired cosmetic appearance; they also negatively affect mental health and impair quality of life.20 The aim of our study was to analyze the types of treatment for adolescent acne received by patients who later had persistent adult acne. Postacne scars observed later are objective evidence of the severity of disease. We found that use of oral antibiotics did not diminish the number of scars among patients with persistent adult acne in adulthood. In contrast, isotretinoin or topical retinoid treatment during adolescence decreased the risk for scars occurring during adulthood. In our opinion, these findings emphasize the role of this type of treatment among adolescents and young adults. The decrease in scar formation in adult acne attributable to retinoid treatment in adolescence indirectly justifies the role of maintenance therapy with topical retinoids.21,22
- Goulden V, Stables GI, Cunliffe WJ. Prevalence of facial acne in adults. J Am Acad Dermatol. 1999;41:577-580.
- Dreno B, Layton A, Zouboulis CC, et al. Adult female acne: a new paradigm. J Eur Acad Dermatol Venereol. 2013;27:1063-1070.
- Preneau S, Dreno B. Female acne--a different subtype of teenager acne? J Eur Acad Dermatol Venereol. 2012;26:277-282.
- Goulden V, Clark SM, Cunliffe WJ. Post-adolescent acne: a review of clinical features. Br J Dermatol. 1997;136:66-70.
- Kamangar F, Shinkai K. Acne in the adult female patient: a practical approach. Int J Dermatol. 2012;51:1162-1174.
- Choi CW, Lee DH, Kim HS, et al. The clinical features of late onset acne compared with early onset acne in women. J Eur Acad Dermatol Venereol. 2011;25:454-461.
- Kligman AM, Fulton JE Jr, Plewig G. Topical vitamin A acid in acne vulgaris. Arch Dermatol. 1969;99:469-476.
- Zaenglein AL, Pathy AL, Schlosser BJ, et al. Guidelines of care for the management of acne vulgaris. J Am Acad Dermatol. 2016;74:945.e33-973.e33.
- Nast A, Dreno B, Bettoli V, et al. European evidence-based guidelines for the treatment of acne. J Eur Acad Dermatol Venereol. 2012;26(suppl 1):1-29.
- Levin J. The relationship of proper skin cleansing to pathophysiology, clinical benefits, and the concomitant use of prescription topical therapies in patients with acne vulgaris. Dermatol Clin. 2016;34:133-145.
- Savage LJ, Layton AM. Treating acne vulgaris: systemic, local and combination therapy. Expert Rev Clin Pharmacol. 2010;3:563-580.
- Jacob CL, Dover JS, Kaminer MS. Acne scarring: a classification system and review of treatment options. J Am Acad Dermatol. 2001;45:109-117.
- Shaw JC, White LE. Persistent acne in adult women. Arch Dermatol. 2001;137:1252-1253.
- Schmidt JV, Masuda PY, Miot HA. Acne in women: clinical patterns in different age groups. An Bras Dermatol. 2009;84:349-354.
- Thiboutot D, Gollnick H, Bettoli V, et al. New insights into the management of acne: an update from the Global Alliance to Improve Outcomes in Acne group. J Am Acad Dermatol. 2009;60(5 suppl):1-50.
- Williams C, Layton AM. Persistent acne in women: implications for the patient and for therapy. Am J Clin Dermatol. 2006;7:281-290.
- Holzmann R, Shakery K. Postadolescent acne in females. Skin Pharmacol Physiol. 2014;27(suppl 1):3-8.
- Pena S, Hill D, Feldman SR. Use of topical retinoids by dermatologist and non-dermatologist in the management of acne vulgaris. J Am Acad Dermatol. 2016;74:1252-1254.
- Layton AM, Henderson CA, Cunliffe WJ. A clinical evaluation of acne scarring and its incidence. Clin Exp Dermatol. 1994;19:303-308.
- Halvorsen JA, Stern RS, Dalgard F, et al. Suicidal ideation, mental health problems, and social impairment are increased in adolescents with acne: a population-based study. J Invest Dermatol. 2011;131:363-370.
- Thielitz A, Sidou F, Gollnick H. Control of microcomedone formation throughout a maintenance treatment with adapalene gel, 0.1%. J Eur Acad Dermatol Venereol. 2007;21:747-753.
- Leyden J, Thiboutot DM, Shalita R, et al. Comparison of tazarotene and minocycline maintenance therapies in acne vulgaris: a multicenter, double-blind, randomized, parallel-group study. Arch Dermatol. 2006;142:605-612.
Practice Points
- Postacne scarring is the most severe complication of acne.
- Isotretinoin or topical retinoid treatment in adolescence decreases the risk for scars during adult acne, justifying the role of maintenance therapy with topical retinoids.
Usage of and Attitudes Toward Health Information Exchange Before and After System Implementation in a VA Medical Center
More than 9 million veterans are enrolled in the Veterans Health Administration (VHA). A high percentage of veterans who use VHA services have multiple chronic conditions and complex medical needs.1 In addition to receiving health care from the VHA, many of these patients receive additional services from non-VHA providers in the community. Furthermore, recently enacted legislation, such as the 2018 VA MISSION Act and the 2014 law establishing the Veterans Choice Program, has increased veterans’ use of community health care services.
VHA staff face considerable barriers when seeking documentation about non-VHA services delivered in the community, which can be fragmented across multiple health care systems. In many VHA medical centers, staff must telephone non-VHA sites of care and/or use time-consuming fax services to request community-based patient records. VA health care providers (HCPs) often report that community records are not available when timely clinical decisions must be made, or that they must make those decisions without knowledge of past or co-occurring assessments and treatment plans. Without access to comprehensive health records, patients are at risk for duplicated treatment, medication errors, and death.2,3
Background
To improve the continuity and safety of health care, US governmental and health information experts stimulated formal communication among HCPs via the 2009 Health Information Technology for Economic and Clinical Health (HITECH) Act.4,5 One of the primary aims of the HITECH Act was to promote reliable and interoperable electronic sharing of clinical information through health information exchange (HIE) for both patients and HCPs. Monetary incentives encouraged regional, state, or state-funded organizations to create and promote HIE capabilities.
Presently, empirical data are not available that describe the effect of external HIE systems in VHA settings. However, data examining non-VHA settings suggest that HIE may improve quality of care, although findings are mixed. For example, some research has found that HIE reduces hospital admissions, duplicated test ordering, and health care costs and improves decision making, whereas other research has found no change.3,6-13 Barriers to HIE use noted in community settings include poorly designed interfaces, inefficient workflow, and incomplete record availability.3,6-10,14
A few US Department of Veterans Affairs (VA) medical centers have recently initiated contracts with HIE organizations. Because much of the present research evaluates internally developed HIE systems, scholars in the field have identified a pressing need for useful statistics before and after implementation of externally developed HIE systems.13,15 Additionally, scholars call for data examining nonacademic settings (eg, VHA medical centers) and for diverse patient populations (eg, individuals with chronic disorders, veterans).13
This quality improvement project had 2 goals. The first goal was to assess baseline descriptive statistics related to requesting/obtaining community health records in a VHA setting. The second goal was to evaluate VHA staff access to needed community health records (eg, records stemming from community consults) before and after implementation of an externally developed HIE system.
Methods
This project was a single-center quality improvement evaluation examining the effect of implementing an HIE system developed by an external nonprofit organization. The project protocol was approved by the VA Pacific Islands Healthcare System (VAPIHCS) Evidence-Based Practices Council. Clinicians’ responses were anonymous, and data were reported only in aggregate. Assessment was conducted by an evaluator who was not associated with the HIE system’s developers or its implementation, reducing the chance of bias.15
Coinciding with the HIE system implementation and prior to having access to it, VAPIHCS medical and managed care staff were invited to complete an online needs assessment tool. Voluntary trainings on the system were offered at various times on multiple days and lasted approximately 1 hour. Six months after the HIE system was implemented, a postassessment tool reevaluated HIE-related access.
VHA Setting and HIE System
VAPIHCS serves about 55,000 unique patients across a 2.6 million square-mile catchment area (Hawaii and Pacific Island territories). Facilities include a medium-sized, urban VA medical center and 7 suburban or rural/remote primary care outpatient clinics.
VAPIHCS contracted with Hawaii Health Information Exchange (HHIE), a nonprofit organization designated by the state of Hawaii to develop a seamless, secure HIE system. According to HHIE, 83% of the 23 hospitals in the state and 55% of Hawaii’s 2,927 active practicing physicians have adopted the HIE system (F. Chan, personal communication, December 12, 2018). HHIE’s data sources provide real-time access to a database of 20 million health records. These records include data such as patients’ reasons for referral, encounter diagnoses, medications, immunizations, and discharge instructions from many (but not all) HCPs in Hawaii.
HHIE reports that it has the capacity to interface with all electronic health records systems currently in use in the community (F. Chan, personal communication, December 12, 2018). Although the HIE system can provide directed exchange (ie, sending and receiving secure information electronically between HCPs), the HIE system implemented at VAPIHCS was limited to query-retrieve (ie, practitioner-initiated requests for information from other community HCPs). Specifically, to access patient records, practitioners log in to the HIE portal and enter a patient’s name in a search window. The system then generates a consolidated virtual chart with data collected from all HIE data-sharing participants. To share records, community HCPs either build or enable an integrated health care enterprise communication profile that connects their data to the exchange. However, VHA records were not made available to community HCPs at this initial stage.
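The query-retrieve workflow can be pictured with the following minimal sketch. The endpoint URL, authentication scheme, and response fields are hypothetical placeholders introduced for illustration; they do not describe the HHIE system’s actual interface.
import requests

HIE_BASE_URL = "https://hie.example.org/api"  # hypothetical endpoint, not HHIE's

def query_patient_chart(session_token: str, patient_name: str) -> dict:
    # Practitioner-initiated query: search by patient name and retrieve a
    # consolidated virtual chart aggregated from data-sharing participants.
    response = requests.get(
        f"{HIE_BASE_URL}/patients/search",
        headers={"Authorization": f"Bearer {session_token}"},
        params={"name": patient_name},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

# Hypothetical usage:
# chart = query_patient_chart(token, "DOE, JANE")
# for encounter in chart.get("encounters", []):
#     print(encounter.get("diagnosis"), encounter.get("discharge_instructions"))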
Measures and Statistical Analysis
A template of quality improvement-related questions was adapted for this project with input from subject matter experts. Questions were then modified further based on interviews with 5 clinical and managed care staff members. The final needs assessment tool, delivered online, consisted of up to 20 multiple-choice items and 2 open-ended questions. A 22-item evaluation tool was administered 6 months after system implementation. Frequencies were obtained for descriptive items, and group responses were compared across time.
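As an editorial illustration only (the project’s analysis scripts are not described in this report), tabulating answer frequencies for one multiple-choice item and comparing them across the 2 time points could look like the following Python sketch; the response lists are placeholders, not survey data.
from collections import Counter

# Placeholder responses to a single multiple-choice item; not the survey data.
pre_responses = ["very dissatisfied", "very dissatisfied", "neutral",
                 "somewhat satisfied", "very dissatisfied"]
post_responses = ["somewhat satisfied", "very satisfied", "neutral",
                  "very satisfied", "somewhat satisfied"]

def frequencies(responses):
    # Count each answer and express it as a percentage of all responses.
    counts = Counter(responses)
    total = len(responses)
    return {answer: (n, round(100 * n / total, 1)) for answer, n in counts.items()}

print("Preimplementation: ", frequencies(pre_responses))
print("Postimplementation:", frequencies(post_responses))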
Results
Thirty-nine staff (32 medical and 7 managed care staff) completed the needs assessment, and 20 staff (16 medical and 4 managed care staff) completed the postimplementation evaluation.
Before implementation of the HIE system, most staff (54%) indicated that they spent > 1 hour a week on tasks related to seeking and/or obtaining health records from the community. The largest percentage of staff (27%) requested > 10 community records during a typical week. Most respondents indicated that they would use an easy-to-use tool to instantly retrieve community health records at least 20 times per week (Table 1).
Preimplementation, 32.4% of respondents indicated that they could access community-based health records sometimes. Postimplementation, most respondents indicated they could access the records most of the time (Figure 1).
Preimplementation, staff most frequently indicated that they were very dissatisfied with the current level of access to community records. Postimplementation, more staff were somewhat satisfied or very satisfied (Figure 2). Postimplementation, 48% of staff reported using the HIE system either several times a month or 2 to 4 times a week, 19% used the system daily, 19% used it only 1 to 2 times, and 14% never used it. Most staff (67%) reported that the system somewhat improved access to records and supported continuing the contract with the HIE system. Conversely, 18% of respondents said that their access did not improve enough for the system to be of use to them.
Preimplementation, staff most frequently indicated that they did not have time (28.6%) or sufficient staff (25.7%) to request records (Table 2). Postimplementation, staff most frequently (33.3%) indicated that they had no problems accessing the HIE system, but 6.7% reported having time or interface/software difficulties.
Discussion
This report assessed a quality improvement project designed to increase VHA access to community health records via an external HIE system. Prior to this work, no data were available on use, barriers, and staff satisfaction related to implementing an externally developed HIE system within a VA medical center.13,15
Before the medical center implemented the HIE system, logistical barriers prevented most HCPs and managed care staff from obtaining needed community records. Staff faced challenges such as a lack of time, as well as rudimentary barriers, including community clinics not responding to requests and nonfunctioning fax machines. Time remained a challenge after implementation, but this work demonstrated that the HIE system helped staff overcome many logistical barriers.
After implementation of the HIE system, staff reported an improvement in access and satisfaction related to retrieving community health records. These findings are consistent with most but not all evaluations of HIE systems.3,6,7,12,13 In the present work, staff used the system several times a month or several times a week, and most staff believed that access to the HIE system should be continued. Still, improvement was incomplete. The HIE system increased access to specific types of records (eg, reports) and health care systems (eg, large hospitals), but not others. As a result, the system was more useful for some staff than for others.
Research examining HIE systems in community and academic settings has identified factors that deter their use, such as poorly designed interfaces, inefficient workflow, and incomplete record availability.3,6,7,14,16 In the present project, incomplete record availability was a noted barrier. Additionally, a few staff reported system interface issues. However, most staff found the system easy to use as part of their daily workflow.
Because the HIE system had a meaningful, positive impact on VHA providers and staff, it will be sustained at VAPIHCS. Specifically, the contract with the HHIE has been renewed, and the number of user licenses has increased. Staff users now self-refer for the service or can be referred by their service chiefs.
Limitations
This work was designed to evaluate the effect of an HIE system on staff in 1 VHA setting; thus, findings may not be generalizable to other settings or HIE systems. Limitations of the present work include the small sample of respondents, the limited time frame for responses, and the limited response rate. A logical next step would be research comparing access to the HIE system with no access on factors such as workload productivity, cost savings, and patient safety.
Conclusion
The vision of the HITECH Act was to improve the continuity and safety of health care via reliable and interoperable electronic sharing of clinical information across health care entities.6 This VHA quality improvement project demonstrated a meaningful improvement in staff’s level of satisfaction with access to community health records when staff used an externally developed HIE system. Not all types of records (eg, progress notes) were accessible, which resulted in the system being useful for most but not all staff.
In the future, the federal government’s internally developed Veterans Health Information Exchange (formerly known as the Virtual Lifetime Electronic Record [VLER]) is expected to enable VHA, the Department of Defense, and participating community care providers to access shared electronic health records nationally. However, until we can achieve that envisioned interoperability, VHA staff can use HIE and other clinical support applications to access health records.
1. Yu W, Ravelo A, Wagner TH, et al. Prevalence and costs of chronic conditions in the VA health care system. Med Care Res Rev. 2003;60(3)(suppl):146S-167S.
2. Bourgeois FC, Olson KL, Mandl KD. Patients treated at multiple acute health care facilities: quantifying information fragmentation. Arch Intern Med. 2010;170(22):1989-1995.
3. Rudin RS, Motala A, Goldzweig CL, Shekelle PG. Usage and effect of health information exchange: a systematic review. Ann Intern Med. 2014;161(11):803-811.
4. Blumenthal D. Implementation of the federal health information technology initiative. N Engl J Med. 2011;365(25):2426-2431.
5. The Office of the National Coordinator for Health Information Technology. Connecting health and care for the nation: a shared nationwide interoperability roadmap. Final version 1.0. https://www.healthit.gov/sites/default/files/hie-interoperability/nationwide-interoperability-roadmap-final-version-1.0.pdf. Accessed May 22, 2019.
6. Detmer D, Bloomrosen M, Raymond B, Tang P. Integrated personal health records: transformative tools for consumer-centric care. BMC Med Inform Decis Mak. 2008;8:45.
7. Hersh WR, Totten AM, Eden KB, et al. Outcomes from health information exchange: systematic review and future research needs. JMIR Med Inform. 2015;3(4):e39.
8. Vest JR, Kern LM, Campion TR Jr, Silver MD, Kaushal R. Association between use of a health information exchange system and hospital admissions. Appl Clin Inform. 2014;5(1):219-231.
9. Vest JR, Jung HY, Ostrovsky A, Das LT, McGinty GB. Image sharing technologies and reduction of imaging utilization: a systematic review and meta-analysis. J Am Coll Radiol. 2015;12(12 pt B):1371-1379.e3.
10. Walker DM. Does participation in health information exchange improve hospital efficiency? Health Care Manag Sci. 2018;21(3):426-438.
11. Gordon BD, Bernard K, Salzman J, Whitebird RR. Impact of health information exchange on emergency medicine clinical decision making. West J Emerg Med. 2015;16(7):1047-1051.
12. Hincapie A, Warholak T. The impact of health information exchange on health outcomes. Appl Clin Inform. 2011;2(4):499-507.
13. Rahurkar S, Vest JR, Menachemi N. Despite the spread of health information exchange, there is little evidence of its impact on cost, use, and quality of care. Health Aff (Millwood). 2015;34(3):477-483.
14. Eden KB, Totten AM, Kassakian SZ, et al. Barriers and facilitators to exchanging health information: a systematic review. Int J Med Inform. 2016;88:44-51.
15. Hersh WR, Totten AM, Eden K, et al. The evidence base for health information exchange. In: Dixon BE, ed. Health Information Exchange: Navigating and Managing a Network of Health Information Systems. Cambridge, MA: Academic Press; 2016:213-229.
16. Blavin F, Ramos C, Cafarella Lallemand N, Fass J, Ozanich G, Adler-Milstein J. Analyzing the public benefit attributable to interoperable health information exchange. https://aspe.hhs.gov/system/files/pdf/258851/AnalyzingthePublicBenefitAttributabletoInteroperableHealth.pdf. Published July 2017. Accessed May 22, 2019.
More than 9 million veterans are enrolled in the Veterans Health Administration (VHA). A high percentage of veterans who use VHA services have multiple chronic conditions and complex medical needs.1 In addition to receiving health care from the VHA, many of these patients receive additional services from non-VHA providers in the community. Furthermore, recent laws enacted, such as the 2018 VA MISSION Act and the 2014 VA Choice Program, have increased veterans’ use of community health care services.
VHA staff face considerable barriers when seeking documentation about non-VHA services delivered in the community, which can be fragmented across multiple health care systems. In many VHA medical centers, staff must telephone non-VHA sites of care and/or use time-consuming fax services to request community-based patient records. VA health care providers (HCPs) often complain that community records are not available to make timely clinical decisions or that they must do so without knowing past or co-occurring assessments or treatment plans. Without access to comprehensive health records, patients are at risk for duplicated treatment, medication errors, and death.2,3
Background
To improve the continuity and safety of health care, US governmental and health information experts stimulated formal communication among HCPs via the 2009 Health Information Technology for Economic and Clinical Health (HITECH) Act.4,5 One of the primary aims of the HITECH Act was to promote reliable and interoperable electronic sharing of clinical information through health information exchange (HIE) for both patients and HCPs. Monetary incentives encouraged regional, state, or state-funded organizations to create and promote HIE capabilities.
Presently, empirical data are not available that describe the effect of external HIE systems in VHA settings. However, data examining non-VHA settings suggest that HIE may improve quality of care, although findings are mixed. For example, some research has found that HIE reduces hospital admissions, duplicated test ordering, and health care costs and improves decision making, whereas other research has found no change.3,6-13 Barriers to HIE use noted in community settings include poorly designed interfaces, inefficient workflow, and incomplete record availability.3,6-10,14
A few US Department of Veterans Affairs (VA) medical centers have recently initiated contracts with HIE organizations. Because much of the present research evaluates internally developed HIE systems, scholars in the field have identified a pressing need for useful statistics before and after implementation of externally developed HIE systems.13,15 Additionally, scholars call for data examining nonacademic settings (eg, VHA medical centers) and for diverse patient populations (eg, individuals with chronic disorders, veterans).13This quality improvement project had 2 goals. The first goal was to assess baseline descriptive statistics related to requesting/obtaining community health records in a VHA setting. The second goal was to evaluate VHA staff access to needed community health records (eg, records stemming from community consults) before and after implementation of an externally developed HIE system.
Methods
This project was a single-center, quality improvement evaluation examining the effect of implementing an HIE system, developed by an external nonprofit organization. The project protocol was approved by the VA Pacific Islands Healthcare System (VAPIHCS) Evidence-Based Practices Council. Clinicians’ responses were anonymous, and data were reported only in aggregate. Assessment was conducted by an evaluator who was not associated with the HIE system developers and its implementation, reducing the chance of bias.15
Coinciding with the HIE system implementation and prior to having access to it, VAPIHCS medical and managed care staff were invited to complete an online needs assessment tool. Voluntary trainings on the system were offered at various times on multiple days and lasted approximately 1 hour. Six months after the HIE system was implemented, a postassessment tool reevaluated HIE-related access.
VHA Setting and HIE System
VAPIHCS serves about 55,000 unique patients across a 2.6 million square-mile catchment area (Hawaii and Pacific Island territories). Facilities include a medium-sized, urban VA medical center and 7 suburban or rural/remote primary care outpatient clinics.
VAPIHCS contracted with Hawaii Health Information Exchange (HHIE), a nonprofit organization that was designated by the state of Hawaii to develop a seamless, secure HIE system. According to HHIE, 83% of the 23 hospitals in the state and 55% of Hawaii’s 2,927 active practicing physicians have adopted the HIE system (F. Chan, personal communication, December 12, 2018). HHIE’s data sources provide real-time access to a database of 20 million health records. Records include, among other records, data such as patients’ reasons for referral, encounter diagnoses, medications, immunizations, and discharge instructions from many (but not all) HCPs in Hawaii.
HHIE reports that it has the capacity to interface with all electronic health records systems currently in use in the community (F. Chan, personal communication, December 12, 2018). Although the HIE system can provide directed exchange (ie, sending and receiving secure information electronically between HCPs), the HIE system implemented in the VAPIHCS was limited to query-retrieve (ie, practitioner-initiated requests for information from other community HCPs). Specifically, to access patient records, practitioners log in to the HIE portal and enter a patient’s name in a search window. The system then generates a consolidated virtual chart with data collected from all HIE data-sharing participants. To share records, community HCPs either build or enable a profile in an integrated health care enterprise electronic communication interface into their data. However, VHA records were not made available to community HCPs at this initial stage.
Measures and Statistical Analysis
A template of quality improvement-related questions was adapted for this project with input from subject matter experts. Questions were then modified further based on interviews with 5 clinical and managed care staff members. The final online tool consisted of up to 20 multiple choice items and 2 open-ended questions delivered online. A 22-item evaluation tool was administered 6 months after system implementation. Frequencies were obtained for descriptive items, and group responses were compared across time.
Results
Thirty-nine staff (32 medical and 7 managed care staff) completed the needs assessment, and 20 staff (16 medical and 4 managed care staff) completed the postimplementation evaluation.
Before implementation of the HIE system, most staff (54%) indicated that they spent > 1 hour a week conducting tasks related to seeking and/or obtaining health records from the community. The largest percentage of staff (27%) requested > 10 community records during a typical week. Most respondents indicated that they would use an easy tool to instantly retrieve community health records at least 20 times per week (Table 1).
Preimplementation, 32.4% of respondents indicated that they could access community-based health records sometimes. Postimplementation, most respondents indicated they could access the records most of the time (Figure 1).
Preimplementation, staff most frequently indicated they were very dissatisfied with the current level of access to community records. Postimplementation, more staff were somewhat satisfied or very satisfied (Figure 2). Postimplementation, 48% of staff most often reported using the HIE system either several times a month or 2 to 4 times a week, 19% used the system daily, 19% used 1 to 2 times, and 14% never used the system. Most staff (67%) reported that the system improved access to records somewhat and supported continuing the contract with the HIE system. Conversely, 18% of respondents said that their access did not improve enough for the system to be of use to them.
Preimplementation, staff most frequently indicated that they did not have time (28.6%) or sufficient staff (25.7%) to request records (Table 2). Postimplementation, staff most frequently (33.3%) indicated that they had no problems accessing the HIE system, but 6.7% reported having time or interface/software difficulties.
Discussion
This report assessed a quality improvement project designed to increase VHA access to community health records via an external HIE system. Prior to this work, no data were available on use, barriers, and staff satisfaction related to implementing an externally developed HIE system within a VA medical center.13,15
Before the medical center implemented the HIE system, logistical barriers prevented most HCPs and managed care staff from obtaining needed community records. Staff faced challenges such as lacking time as well as rudimentary barriers, such as community clinics not responding to requests or the fax machine not working. Time remained a challenge after implementation, but this work demonstrated that the HIE system helped staff overcome many logistical barriers.
After implementation of the HIE system, staff reported an improvement in access and satisfaction related to retrieving community health records. These findings are consistent with most but not all evaluations of HIE systems.3,6,7,12,13 In the present work, staff used the system several times a month or several times a week, and most staff believed that access to the HIE system should be continued. Still, improvement was incomplete. The HIE system increased access to specific types of records (eg, reports) and health care systems (eg, large hospitals), but not others. As a result, the system was more useful for some staff than for others.
Research examining HIE systems in community and academic settings have identified factors that deter their use, such as poorly designed interfaces, inefficient workflow, and incomplete record availability.3,6,7,14,16 In the present project, incomplete record availability was a noted barrier. Additionally, a few staff reported system interface issues. However, most staff found the system easy to use as part of their daily workflow.
Because the HIE system had a meaningful, positive impact on VHA providers and staff, it will be sustained at VAPIHCS. Specifically, the contract with the HHIE has been renewed, and the number of user licenses has increased. Staff users now self-refer for the service or can be referred by their service chiefs.
Limitations
This work was designed to evaluate the effect of an HIE system on staff in 1 VHA setting; thus, findings may not be generalizable to other settings or HIE systems. Limitations of the present work include small sample size of respondents; limited time frame for responses; and limited response rate. The logical next step would be research efforts to compare access to the HIE system with no access on factors such as workload productivity, cost savings, and patient safety.
Conclusion
The vision of the HITECH Act was to improve the continuity and safety of health care via reliable and interoperable electronic sharing of clinical information across health care entities.6 This VHA quality improvement project demonstrated a meaningful improvement in staff’s level of satisfaction with access to community health records when staff used an externally developed HIE system. Not all types of records (eg, progress notes) were accessible, which resulted in the system being useful for most but not all staff.
In the future, the federal government’s internally developed Veterans Health Information Exchange (formerly known as the Virtual Lifetime Electronic Record [VLER]) is expected to enable VHA, the Department of Defense, and participating community care providers to access shared electronic health records nationally. However, until we can achieve that envisioned interoperability, VHA staff can use HIE and other clinical support applications to access health records.
More than 9 million veterans are enrolled in the Veterans Health Administration (VHA). A high percentage of veterans who use VHA services have multiple chronic conditions and complex medical needs.1 In addition to receiving health care from the VHA, many of these patients receive additional services from non-VHA providers in the community. Furthermore, recent laws enacted, such as the 2018 VA MISSION Act and the 2014 VA Choice Program, have increased veterans’ use of community health care services.
VHA staff face considerable barriers when seeking documentation about non-VHA services delivered in the community, which can be fragmented across multiple health care systems. In many VHA medical centers, staff must telephone non-VHA sites of care and/or use time-consuming fax services to request community-based patient records. VA health care providers (HCPs) often complain that community records are not available to make timely clinical decisions or that they must do so without knowing past or co-occurring assessments or treatment plans. Without access to comprehensive health records, patients are at risk for duplicated treatment, medication errors, and death.2,3
Background
To improve the continuity and safety of health care, US governmental and health information experts stimulated formal communication among HCPs via the 2009 Health Information Technology for Economic and Clinical Health (HITECH) Act.4,5 One of the primary aims of the HITECH Act was to promote reliable and interoperable electronic sharing of clinical information through health information exchange (HIE) for both patients and HCPs. Monetary incentives encouraged regional, state, or state-funded organizations to create and promote HIE capabilities.
Presently, empirical data are not available that describe the effect of external HIE systems in VHA settings. However, data examining non-VHA settings suggest that HIE may improve quality of care, although findings are mixed. For example, some research has found that HIE reduces hospital admissions, duplicated test ordering, and health care costs and improves decision making, whereas other research has found no change.3,6-13 Barriers to HIE use noted in community settings include poorly designed interfaces, inefficient workflow, and incomplete record availability.3,6-10,14
A few US Department of Veterans Affairs (VA) medical centers have recently initiated contracts with HIE organizations. Because much of the present research evaluates internally developed HIE systems, scholars in the field have identified a pressing need for useful statistics before and after implementation of externally developed HIE systems.13,15 Additionally, scholars call for data examining nonacademic settings (eg, VHA medical centers) and for diverse patient populations (eg, individuals with chronic disorders, veterans).13This quality improvement project had 2 goals. The first goal was to assess baseline descriptive statistics related to requesting/obtaining community health records in a VHA setting. The second goal was to evaluate VHA staff access to needed community health records (eg, records stemming from community consults) before and after implementation of an externally developed HIE system.
Methods
This project was a single-center, quality improvement evaluation examining the effect of implementing an HIE system, developed by an external nonprofit organization. The project protocol was approved by the VA Pacific Islands Healthcare System (VAPIHCS) Evidence-Based Practices Council. Clinicians’ responses were anonymous, and data were reported only in aggregate. Assessment was conducted by an evaluator who was not associated with the HIE system developers and its implementation, reducing the chance of bias.15
Coinciding with the HIE system implementation and prior to having access to it, VAPIHCS medical and managed care staff were invited to complete an online needs assessment tool. Voluntary trainings on the system were offered at various times on multiple days and lasted approximately 1 hour. Six months after the HIE system was implemented, a postassessment tool reevaluated HIE-related access.
VHA Setting and HIE System
VAPIHCS serves about 55,000 unique patients across a 2.6 million square-mile catchment area (Hawaii and Pacific Island territories). Facilities include a medium-sized, urban VA medical center and 7 suburban or rural/remote primary care outpatient clinics.
VAPIHCS contracted with Hawaii Health Information Exchange (HHIE), a nonprofit organization that was designated by the state of Hawaii to develop a seamless, secure HIE system. According to HHIE, 83% of the 23 hospitals in the state and 55% of Hawaii’s 2,927 active practicing physicians have adopted the HIE system (F. Chan, personal communication, December 12, 2018). HHIE’s data sources provide real-time access to a database of 20 million health records. Records include, among other records, data such as patients’ reasons for referral, encounter diagnoses, medications, immunizations, and discharge instructions from many (but not all) HCPs in Hawaii.
HHIE reports that it has the capacity to interface with all electronic health records systems currently in use in the community (F. Chan, personal communication, December 12, 2018). Although the HIE system can provide directed exchange (ie, sending and receiving secure information electronically between HCPs), the HIE system implemented in the VAPIHCS was limited to query-retrieve (ie, practitioner-initiated requests for information from other community HCPs). Specifically, to access patient records, practitioners log in to the HIE portal and enter a patient’s name in a search window. The system then generates a consolidated virtual chart with data collected from all HIE data-sharing participants. To share records, community HCPs either build or enable a profile in an integrated health care enterprise electronic communication interface into their data. However, VHA records were not made available to community HCPs at this initial stage.
Measures and Statistical Analysis
A template of quality improvement-related questions was adapted for this project with input from subject matter experts. Questions were then modified further based on interviews with 5 clinical and managed care staff members. The final online tool consisted of up to 20 multiple choice items and 2 open-ended questions delivered online. A 22-item evaluation tool was administered 6 months after system implementation. Frequencies were obtained for descriptive items, and group responses were compared across time.
Results
Thirty-nine staff (32 medical and 7 managed care staff) completed the needs assessment, and 20 staff (16 medical and 4 managed care staff) completed the postimplementation evaluation.
Before implementation of the HIE system, most staff (54%) indicated that they spent > 1 hour a week conducting tasks related to seeking and/or obtaining health records from the community. The largest percentage of staff (27%) requested > 10 community records during a typical week. Most respondents indicated that they would use an easy tool to instantly retrieve community health records at least 20 times per week (Table 1).
Preimplementation, 32.4% of respondents indicated that they could access community-based health records sometimes. Postimplementation, most respondents indicated they could access the records most of the time (Figure 1).
Preimplementation, staff most frequently indicated they were very dissatisfied with the current level of access to community records. Postimplementation, more staff were somewhat satisfied or very satisfied (Figure 2). Postimplementation, 48% of staff most often reported using the HIE system either several times a month or 2 to 4 times a week, 19% used the system daily, 19% used 1 to 2 times, and 14% never used the system. Most staff (67%) reported that the system improved access to records somewhat and supported continuing the contract with the HIE system. Conversely, 18% of respondents said that their access did not improve enough for the system to be of use to them.
Preimplementation, staff most frequently indicated that they did not have time (28.6%) or sufficient staff (25.7%) to request records (Table 2). Postimplementation, staff most frequently (33.3%) indicated that they had no problems accessing the HIE system, but 6.7% reported having time or interface/software difficulties.
Discussion
This report assessed a quality improvement project designed to increase VHA access to community health records via an external HIE system. Prior to this work, no data were available on use, barriers, and staff satisfaction related to implementing an externally developed HIE system within a VA medical center.13,15
Before the medical center implemented the HIE system, logistical barriers prevented most HCPs and managed care staff from obtaining needed community records. Staff faced challenges such as lack of time as well as rudimentary barriers, including community clinics not responding to requests or malfunctioning fax machines. Time remained a challenge after implementation, but this work demonstrated that the HIE system helped staff overcome many logistical barriers.
After implementation of the HIE system, staff reported an improvement in access and satisfaction related to retrieving community health records. These findings are consistent with most but not all evaluations of HIE systems.3,6,7,12,13 In the present work, staff used the system several times a month or several times a week, and most staff believed that access to the HIE system should be continued. Still, improvement was incomplete. The HIE system increased access to specific types of records (eg, reports) and health care systems (eg, large hospitals), but not others. As a result, the system was more useful for some staff than for others.
Research examining HIE systems in community and academic settings has identified factors that deter their use, such as poorly designed interfaces, inefficient workflow, and incomplete record availability.3,6,7,14,16 In the present project, incomplete record availability was a noted barrier. Additionally, a few staff reported system interface issues. However, most staff found the system easy to use as part of their daily workflow.
Because the HIE system had a meaningful, positive impact on VHA providers and staff, it will be sustained at VAPIHCS. Specifically, the contract with the HHIE has been renewed, and the number of user licenses has increased. Staff users now self-refer for the service or can be referred by their service chiefs.
Limitations
This work was designed to evaluate the effect of an HIE system on staff in 1 VHA setting; thus, findings may not be generalizable to other settings or HIE systems. Limitations of the present work include the small sample size, the limited time frame for responses, and the limited response rate. A logical next step would be research comparing HIE system access with no access on factors such as workload productivity, cost savings, and patient safety.
Conclusion
The vision of the HITECH Act was to improve the continuity and safety of health care via reliable and interoperable electronic sharing of clinical information across health care entities.6 This VHA quality improvement project demonstrated a meaningful improvement in staff’s level of satisfaction with access to community health records when staff used an externally developed HIE system. Not all types of records (eg, progress notes) were accessible, which resulted in the system being useful for most but not all staff.
In the future, the federal government’s internally developed Veterans Health Information Exchange (formerly known as the Virtual Lifetime Electronic Record [VLER]) is expected to enable VHA, the Department of Defense, and participating community care providers to access shared electronic health records nationally. However, until we can achieve that envisioned interoperability, VHA staff can use HIE and other clinical support applications to access health records.
1. Yu W, Ravelo A, Wagner TH, et al. Prevalence and costs of chronic conditions in the VA health care system. Med Care Res Rev. 2003;60(3)(suppl):146S-167S.
2. Bourgeois FC, Olson KL, Mandl KD. Patients treated at multiple acute health care facilities: quantifying information fragmentation. Arch Intern Med. 2010;170(22):1989-1995.
3. Rudin RS, Motala A, Goldzweig CL, Shekelle PG. Usage and effect of health information exchange: a systematic review. Ann Intern Med. 2014;161(11):803-811.
4. Blumenthal D. Implementation of the federal health information technology initiative. N Engl J Med. 2011;365(25):2426-2431.
5. The Office of the National Coordinator for Health Information Technology. Connecting health and care for the nation: a shared nationwide interoperability roadmap. Final version 1.0. https://www.healthit.gov/sites/default/files/hie-interoperability/nationwide-interoperability-roadmap-final-version-1.0.pdf. Accessed May 22, 2019.
6. Detmer D, Bloomrosen M, Raymond B, Tang P. Integrated personal health records: transformative tools for consumer-centric care. BMC Med Inform Decis Mak. 2008;8:45.
7. Hersh WR, Totten AM, Eden KB, et al. Outcomes from health information exchange: systematic review and future research needs. JMIR Med Inform. 2015;3(4):e39.
8. Vest JR, Kern LM, Campion TR Jr, Silver MD, Kaushal R. Association between use of a health information exchange system and hospital admissions. Appl Clin Inform. 2014;5(1):219-231.
9. Vest JR, Jung HY, Ostrovsky A, Das LT, McGinty GB. Image sharing technologies and reduction of imaging utilization: a systematic review and meta-analysis. J Am Coll Radiol. 2015;12(12 pt B):1371-1379.e3.
10. Walker DM. Does participation in health information exchange improve hospital efficiency? Health Care Manag Sci. 2018;21(3):426-438.
11. Gordon BD, Bernard K, Salzman J, Whitebird RR. Impact of health information exchange on emergency medicine clinical decision making. West J Emerg Med. 2015;16(7):1047-1051.
12. Hincapie A, Warholak T. The impact of health information exchange on health outcomes. Appl Clin Inform. 2011;2(4):499-507.
13. Rahurkar S, Vest JR, Menachemi N. Despite the spread of health information exchange, there is little evidence of its impact on cost, use, and quality of care. Health Aff (Millwood). 2015;34(3):477-483.
14. Eden KB, Totten AM, Kassakian SZ, et al. Barriers and facilitators to exchanging health information: a systematic review. Int J Med Inform. 2016;88:44-51.
15. Hersh WR, Totten AM, Eden K, et al. The evidence base for health information exchange. In: Dixon BE, ed. Health Information Exchange: Navigating and Managing a Network of Health Information Systems. Cambridge, MA: Academic Press; 2016:213-229.
16. Blavin F, Ramos C, Cafarella Lallemand N, Fass J, Ozanich G, Adler-Milstein J. Analyzing the public benefit attributable to interoperable health information exchange. https://aspe.hhs.gov/system/files/pdf/258851/AnalyzingthePublicBenefitAttributabletoInteroperableHealth.pdf. Published July 2017. Accessed May 22, 2019.
Beyond the Polygraph: Deception Detection and the Autonomic Nervous System
The US Department of Defense (DoD) and law enforcement agencies around the country use the polygraph as an aid in security screenings and interrogation. It is assumed that a person being interviewed will have a visceral response when attempting to deceive the interviewer, and that this response can be detected by measuring the change in vital signs between questions. Because it uses vital signs as an indirect measurement of deception-induced stress, the polygraph may provide a false-positive or false-negative result if a patient has an inherited or acquired condition that affects the autonomic nervous system (ANS).
A variety of diseases, from alcohol use disorder to rheumatoid arthritis, can affect the ANS, as can a multitude of commonly prescribed drugs. Although still in their infancy, functional magnetic resonance imaging (fMRI) and electroencephalography (EEG) deception detection techniques circumvent these issues. Dysautonomias may be an underappreciated cause of error in polygraph interpretation, and polygraph examiners and DoD agencies should be aware of the potential for these disorders to interfere with interpretation of results. In the near future, other modalities that do not measure autonomic variables may be used to avoid these pitfalls.
Polygraphy
Throughout history, humans have been interested in techniques and devices that can discern lies from the truth. Even in the ancient era, it was known that the act of lying had physiologic effects. In ancient Israel, if a woman accused of adultery developed a swollen abdomen after drinking “waters of bitterness,” she was considered guilty of the crime, as described in Numbers 5:11-31. In ancient China, those accused of fraud were forced to hold dry rice in their mouths; if the expectorated rice was dry, the suspect was found guilty.1 We now know that catecholamines, particularly epinephrine, secreted during times of stress cause relaxation of smooth muscle, leading to reduced bowel motility and dry mouth.2-4 However, most methods before the modern era were based more on superstition and chance than on any sound physiologic premise.
When asked to discern truth from falsehood based on their own perceptions, people correctly identify lies as false merely 47% of the time and truth as nondeceptive about 61% of the time.5 In short, unaided, we are very poor lie detectors, which has fueled a great deal of interest in technology that can aid in lie detection. With enhanced technology and a better understanding of human physiology came renewed interest in such tools. Because vital signs such as blood pressure (BP), heart rate, and breathing were known to be affected by the stressful situation brought on by deception, quantifying and measuring those responses in an effort to detect lying became a goal. In 1881, the Italian criminologist Cesare Lombroso invented a glove that, when worn by a suspect, measured their BP.6-8 Changes in BP were also the target variable of the systolic BP deception test invented by William M. Marston, PhD, in 1915.8 Marston also experimented with measurements of other variables, such as muscle tension.9 In 1921, John Larson invented the first modern polygraph machine.7
Procedures
Today’s polygraph builds on these techniques. A standard polygraph measures respiration, heart rate, BP, and sudomotor function (sweating). Respiration is measured via strain gauges strapped around the chest and abdomen that respond to chest expansion during inhalation. BP and pulse can be measured through a variety of means, including finger pulse measurement or sphygmomanometer.8
Perspiration is measured by skin electrical conductance. Human sweat contains a variety of cations and anions, mostly sodium and chloride but also potassium, bicarbonate, and lactate. The presence of these electrolytes alters electrical conduction at the skin surface when sweat is released.10
The exact questioning procedure used to perform a polygraph examination can vary. The Comparison Question Test is most commonly used. In this format, the interview consists of questions that are relevant to the investigation at hand, interspersed with control questions. The examiner compares the changes in vital signs and skin conduction to the baseline measurements generated during the pretest interview and during control questions.8 Using these standardized techniques, some studies have shown accuracy rates between 83% and 95% in controlled settings.8 However, studies performed outside of the polygraph community have found very high false positive rates, up to 50% or greater.11
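A back-of-the-envelope calculation helps reconcile these figures: even a test with 90% sensitivity and specificity (within the 83% to 95% range cited above) generates mostly false-positive calls when deception is rare, as in broad security screening. The short Python sketch below assumes a 1% base rate of deception purely for illustration.

```python
# Illustration of positive predictive value (PPV) under a low base rate.
# Sensitivity and specificity of 0.90 sit within the 83%-95% accuracy
# range cited above; the 1% base rate of deception is an assumption
# chosen only to illustrate the screening scenario.
def positive_predictive_value(sensitivity, specificity, base_rate):
    true_positives = sensitivity * base_rate
    false_positives = (1 - specificity) * (1 - base_rate)
    return true_positives / (true_positives + false_positives)

ppv = positive_predictive_value(sensitivity=0.90, specificity=0.90, base_rate=0.01)
print(f"PPV at a 1% base rate: {ppv:.1%}")  # roughly 8%: most flagged examinees are truthful
```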
The US Supreme Court has ruled that individual jurisdictions can decide whether or not to admit polygraph evidence in court, and the US Court of Appeals for the Eleventh Circuit has ruled that polygraph results are admissible only if both parties agree to their admission and are given sufficient notice.12,13 Currently, New Mexico is the only state that allows polygraph results to be used as evidence without a pretrial agreement; all other states either require such an agreement or forbid the results to be used as evidence.14
Although rarely used in federal and state courts as evidence, polygraphy is commonly used during investigations and in the hiring process of government agencies. DoD Directive 5210.48 and Instruction 5210.91 enable DoD investigative organizations (eg, Naval Criminal Investigative Service, National Security Agency, US Army Investigational Command) to use polygraph as an aid during investigations into suspected involvement with foreign intelligence, terrorism against the US, mishandling of classified documents, and other serious violations.15
The Role of the Physician in Polygraph Assessment
The physician may only rarely be called upon to provide information regarding an individual’s medical condition or related medication use and the effect of these on polygraph results. In such cases, however, the physician must remember the primary fiduciary duty to the patient. Disclosure of medical conditions cannot be made without the patient’s consent, save in very specific situations (eg, a Commanding Officer Inquiry or the Tarasoff duty to protect). It is the polygraph examiner’s responsibility to be aware of potential confounders in a particular examination.10
Physicians in administrative or supervisory positions can have a responsibility to advise security and other officials regarding the fitness for certain duties of candidates with whom there is no physician-patient relationship. This may include an individual’s ability to undergo polygraph examination and the validity of such results. When a physician-patient relationship does exist, however, care must be taken to ensure that the patient understands that the relationship is protected both by professional standards and by law and that no information will be shared without the patient’s authorization (aside from those rare exceptions provided by law). Often, a straightforward explanation to the patient of the medical condition and any medication’s potential effects on polygraph results will be sufficient, allowing the patient to report as much as is deemed necessary to the polygraph examiner.
Polygraphy Pitfalls
Polygraphy presupposes that the subject will have a consistent and measurable physiologic response when he or she attempts to deceive the interviewer. The changes in BP, heart rate, respirations, and perspiration that are detected by polygraphy and interpreted by the examiner are controlled by the ANS (Table 1). There are a variety of diseases that are known to cause autonomic dysfunction (dysautonomia). Small fiber autonomic neuropathies often result in loss of sweating and altered heart rate and BP variation and can arise from many underlying conditions. Synucleinopathies, such as Parkinson disease, alter cardiovascular reflexes.14,16
Even diseases not commonly recognized as having a predominant clinical impact on ANS function can have measurable physiologic effects. For example, approximately 60% of patients with rheumatoid arthritis have blunted cardiovagal baroreceptor responses and reduced heart rate variability.17 ANS dysfunction is also a common sequela of alcoholism.18 Patients with diabetes mellitus often have an elevated resting heart rate and low heart rate variability due to dysregulated β-adrenergic activity.19 Reduced baroreceptor response and reduced heart rate variability could impair the polygraph interpreter’s ability to discern responses using heart rate. Individuals with ANS dysfunction that blunts physiologic responses could have inconclusive or, potentially worse, false-negative polygraph results due to lack of variation between control and target questions.
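For readers unfamiliar with how heart rate variability is quantified, the sketch below computes two standard time-domain metrics, SDNN and RMSSD, from successive RR intervals. The interval values are invented to contrast a normal pattern with a blunted one; this is an illustration, not any polygraph scoring method.

```python
# Two standard time-domain heart rate variability metrics computed from
# successive RR intervals (milliseconds). Values are invented to contrast
# a normal pattern with a blunted one; this is not a polygraph algorithm.
import statistics

def sdnn(rr_ms):
    """Standard deviation of the RR (NN) intervals."""
    return statistics.stdev(rr_ms)

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

normal_rr = [812, 790, 845, 770, 830, 795, 860, 780]    # wider beat-to-beat swings
blunted_rr = [800, 805, 798, 802, 799, 801, 803, 800]   # compressed variation

for label, rr in (("normal", normal_rr), ("blunted", blunted_rr)):
    print(f"{label}: SDNN = {sdnn(rr):.1f} ms, RMSSD = {rmssd(rr):.1f} ms")
```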
To our knowledge, no study has been performed on the validity of polygraphy in patients with any form of dysautonomia. Additionally, a 2011 process and compliance study of the DoD polygraph program specifically recommended that “adjudicators would benefit from training in polygraph capabilities and limitations.”20 Although specific requirements vary from program to program, all programs accredited by the American Polygraph Association provide training in physiology, psychology, and standardization of test results.
Many commonly prescribed medications have effects on the ANS that could affect the results of a polygraph exam (Table 2). For example, β blockers reduce β adrenergic receptor activation in cardiac muscle and blood vessels, reducing heart rate, heart rate variability, cardiac contractility, and BP.21 This class of medication is prescribed for a variety of conditions, including congestive heart failure, hypertension, panic disorder, and posttraumatic stress disorder. Thus, a patient taking β blockers will have a blunted physiologic response to stress and an increased likelihood of an inconclusive or false-negative polygraph exam.
Some over-the-counter medications also have effects on autonomic function. Sympathomimetics such as pseudoephedrine or antihistamines with anticholinergic activity like diphenhydramine can both increase heart rate and BP.22,23 Of the 10 most prescribed medications of 2016, 5 have direct effects on the ANS or the variables measured by the polygraph machine.24 An exhaustive list of medication effects on autonomic function is beyond the scope of this article.
A medication of special interest to the DoD and military that may affect the results of a polygraph study is mefloquine, an antimalarial drug that has been used by military personnel deployed to malaria-endemic regions.25 In murine models, mefloquine has been shown to disrupt autonomic and respiratory control in the central nervous system.26 The neuropsychiatric adverse effects of mefloquine are well documented and can last for years after exposure to the drug.27 Therefore, mefloquine could affect the results of a polygraph test both through direct toxic effects on the ANS and by causing anxiety and depression that may alter the subject’s response to questioning.
Alternative Modalities
Given the pitfalls inherent in external physiologic measures for lie detection, additional modalities that bypass measurement of ANS-governed responses have been sought. Indeed, the integration and combination of more comprehensive modalities has come to be known as forensic credibility assessment.
Functional MRI
In 1991, researchers began using fMRI to see real-time perfusion changes in areas of the cerebral cortex between times of rest and mental stimulation.26 This modality provides a noninvasive technique for viewing which specific parts of the brain are stimulated during activity. When someone is engaged in active deception, the dorsolateral prefrontal cortex has greater perfusion than when the person is telling the truth.28 Because fMRI images the central nervous system directly, it avoids the potential inaccuracies that can be seen in subjects with autonomic irregularities. In fact, fMRI may have superior sensitivity and specificity for lie detection compared with conventional polygraphy.29
Significant limitations to the use of fMRI include the necessity of expensive specialized equipment and trained personnel to operate the MRI. Agencies that use polygraph examinations may be unwilling to make such an investment. Further, subjects with metallic foreign bodies or noncompatible medical implants cannot undergo the MRI procedure. Finally, there have been bioethical and legal concerns raised that measuring brain activity during interrogation may endanger “cognitive freedom” and may even be considered unreasonable search and seizure under the Fourth Amendment to the US Constitution.30 However, fMRI—like polygraphy—can only measure the difference between brain perfusion in 2 states. The idea of fMRI as “mind reading” is largely a misconception.31
Electroencephalography
Various EEG modalities have received increased interest for lie detection. In EEG, electrodes measure the summation of a multitude of postsynaptic potentials and the local voltage gradient they produce when cortical pyramidal neurons fire in synchrony.32 These voltage gradients are detectable at the scalp surface. Shortly after the invention of EEG, it was observed that specific stimuli generated unique and predictable changes in EEG morphology. These event-related potentials (ERPs) are detectable by scalp EEG shortly after the stimulus is given.33
ERPs can be elicited by a multitude of sensory stimuli, have a predictable and reproducible morphology, and are believed to be a psychophysiologic correlate of mental processing of stimuli.34 The P300 is an ERP characterized by a positive change in voltage occurring approximately 300 milliseconds after a stimulus; it is associated with stimulus processing and categorization.35 Because deception is a complex cognitive process involving recognizing pertinent stimuli and inventing false responses to them, it was theorized that detection of a P300 ERP during an interview would indicate that the subject truly recognizes the stimulus while denying such knowledge. Early studies of P300 had variable accuracy for lie detection, roughly 40% to 80%, depending on the study. Furthermore, the rate of false negatives increased if subjects were coached on countermeasures, such as increasing the significance of distractor data or counting backward by 7s.36,37 Later studies have found ways of minimizing these issues, such as detection of a P900 ERP (a cortical potential at 900 milliseconds) that can be seen when subjects are attempting countermeasures.38
Another technique for increasing accuracy in EEG-mediated lie detection is measurement of the multifaceted electroencephalographic response (MER), which involves a more detailed analysis of multiple EEG electrode sites and how their signals change over time, using both visual comparison of multiple trials and bootstrap analysis.37 In particular, the memory- and encoding-related multifaceted electroencephalographic response (MERMER), which couples the P300 with an electrically negative impulse recorded at the frontal lobe and phasic changes in the global EEG, had superior accuracy compared with the P300 alone.37
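A simplified sketch of the bootstrapped amplitude comparison mentioned above follows: trials are resampled, and the mean probe-stimulus amplitude is compared with the mean irrelevant-stimulus amplitude, with a high proportion of probe-greater resamples taken as evidence of recognition. The synthetic amplitudes and the assumed 300-500 ms analysis window are for illustration only; this is not the published MER/MERMER protocol.

```python
# Simplified illustration of a bootstrapped P300 amplitude comparison.
# Synthetic mean amplitudes (microvolts) in an assumed 300-500 ms
# post-stimulus window; not the published MER/MERMER method.
import random
import statistics

def bootstrap_p300(probe_amps, irrelevant_amps, n_boot=2000, seed=0):
    """Return the proportion of bootstrap resamples in which the mean
    probe amplitude exceeds the mean irrelevant amplitude."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_boot):
        probe_mean = statistics.mean(rng.choices(probe_amps, k=len(probe_amps)))
        irrel_mean = statistics.mean(rng.choices(irrelevant_amps, k=len(irrelevant_amps)))
        if probe_mean > irrel_mean:
            hits += 1
    return hits / n_boot

probes = [6.1, 5.4, 7.2, 4.8, 6.6, 5.9, 7.0, 5.1]       # crime-relevant (probe) trials
irrelevants = [2.3, 3.1, 1.8, 2.9, 2.5, 3.4, 2.0, 2.7]  # irrelevant-stimulus trials
print(f"Proportion of resamples with probe > irrelevant: {bootstrap_p300(probes, irrelevants):.2f}")
```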
The benefits of EEG compared with fMRI include large reductions in cost and space requirements and fewer restrictions on use (EEG is safe for virtually all patients, including those with metallic foreign bodies). However, like fMRI, EEG still requires trained personnel to operate and interpret, and it has yet to be tested outside of the laboratory.
Conclusion
The ability to detect deception is an important factor in determining security risk and in the adjudication of legal proceedings, but untrained persons are surprisingly poor at discerning truth from lies. The polygraph has been used by law enforcement and government agencies for decades to aid in interrogation and the screening of employees for security clearances and other types of access. However, results are vulnerable to inaccuracies in subjects with autonomic disorders and may be confounded by multiple medications. While emerging technologies such as fMRI and EEG may allow superior accuracy by bypassing ANS-based physiologic outputs, the polygraph examiner and the physician must be aware of the effects of autonomic dysfunction and of the medications that affect the ANS. This is particularly true in military medicine, as many patients in this population are subject to polygraph examination.
1. Ford EB. Lie detection: historical, neuropsychiatric and legal dimensions. Int J Law Psychiatry. 2006;29(3):159-177.
2. Ohrn PG. Catecholamine infusion and gastrointestinal propulsion in the rat. Acta Chir Scand Suppl. 1979;(461):43-52.
3. Sakamoto H. The study of catecholamine, acetylcholine and bradykinin in buccal circulation in dogs. Kurume Med J. 1979;26(2):153-162.
4. Bond CF Jr, Depaulo BM. Accuracy of deception judgments. Pers Soc Psychol Rev. 2006;10(3):214-234.
5. Vicianova M. Historical techniques of lie detection. Eur J Psychol. 2015;11(3):522-534.
6. Matté JA. Forensic Psychophysiology Using the Polygraph: Scientific Truth Verification, Lie Detection. Williamsville, NY: JAM Publications; 2012.
7. Segrave K. Lie Detectors: A Social History. Jefferson, NC: McFarland & Company; 2004.
8. Nelson R. Scientific basis for polygraph testing. Polygraph. 2015;44(1):28-61.
9. Boucsein W. Electrodermal Activity. New York, NY: Springer Publishing; 2012.
10. US Congress, Office of Technology Assessment. Scientific validity of polygraph testing: a research review and evaluation. https://ota.fas.org/reports/8320.pdf. Published 1983. Accessed June 12, 2019.
11. United States v Scheffer, 523 US 303 (1998).
12. United States v Piccinonna, 729 F Supp 1336 (SD Fl 1990).
13. Fridman DS, Janoe JS. The state of judicial gatekeeping in New Mexico. https://cyber.harvard.edu/daubert/nm.htm. Updated April 17, 1999. Accessed May 20, 2019.
14. Gibbons CH. Small fiber neuropathies. Continuum (Minneap Minn). 2014;20(5 Peripheral Nervous System Disorders):1398-1412.
15. US Department of Defense. Directive 5210.48: Credibility assessment (CA) program. https://fas.org/irp/doddir/dod/d5210_48.pdf. Updated February 12, 2018. Accessed May 30, 2019.
16. Postuma RB, Gagnon JF, Pelletier A, Montplaisir J. Prodromal autonomic symptoms and signs in Parkinson’s disease and dementia with Lewy bodies. Mov Disord. 2013;28(5):597-604.
17. Adlan AM, Lip GY, Paton JF, Kitas GD, Fisher JP. Autonomic function and rheumatoid arthritis: a systematic review. Semin Arthritis Rheum. 2014;44(3):283-304.
18. Di Ciaula A, Grattagliano I, Portincasa P. Chronic alcoholics retain dyspeptic symptoms, pan-enteric dysmotility, and autonomic neuropathy before and after abstinence. J Dig Dis. 2016;17(11):735-746.
19. Thaung HA, Baldi JC, Wang H, et al. Increased efferent cardiac sympathetic nerve activity and defective intrinsic heart rate regulation in type 2 diabetes. Diabetes. 2015;64(8):2944-2956.
20. US Department of Defense, Office of the Undersecretary of Defense for Intelligence. Department of Defense polygraph program process and compliance study: study report. https://fas.org/sgp/othergov/polygraph/dod-poly.pdf. Published December 19, 2011. Accessed May 20, 2019.
21. Ladage D, Schwinger RH, Brixius K. Cardio-selective beta-blocker: pharmacological evidence and their influence on exercise capacity. Cardiovasc Ther. 2013;31(2):76-83.
22. D’Souza RS, Mercogliano C, Ojukwu E, et al. Effects of prophylactic anticholinergic medications to decrease extrapyramidal side effects in patients taking acute antiemetic drugs: a systematic review and meta-analysis. Emerg Med J. 2018;35:325-331.
23. Gheorghiev MD, Hosseini F, Moran J, Cooper CE. Effects of pseudoephedrine on parameters affecting exercise performance: a meta-analysis. Sports Med Open. 2018;4(1):44.
24. Frellick M. Top-selling, top-prescribed drugs for 2016. https://www.medscape.com/viewarticle/886404. Published October 2, 2017. Accessed May 20, 2019.
25. Lall DM, Dutschmann M, Deuchars J, Deuchars S. The anti-malarial drug mefloquine disrupts central autonomic and respiratory control in the working heart brainstem preparation of the rat. J Biomed Sci. 2012;19:103.
26. Ritchie EC, Block J, Nevin RL. Psychiatric side effects of mefloquine: applications to forensic psychiatry. J Am Acad Psychiatry Law. 2013;41(2):224-235.
27. Belliveau JW, Kennedy DN Jr, McKinstry RC, et al. Functional mapping of the human visual cortex by magnetic resonance imaging. Science. 1991;254(5032):716-719.
28. Ito A, Abe N, Fujii T, et al. The contribution of the dorsolateral prefrontal cortex to the preparation for deception and truth-telling. Brain Res. 2012;1464:43-52.
29. Langleben DD, Hakun JG, Seelig D. Polygraphy and functional magnetic resonance imaging in lie detection: a controlled blind comparison using the concealed information test. J Clin Psychiatry. 2016;77(10):1372-1380.
30. Boire RG. Searching the brain: the Fourth Amendment implications of brain-based deception detection devices. Am J Bioeth. 2005;5(2):62-63; discussion W5.
31. Langleben DD. Detection of deception with fMRI: Are we there yet? Legal Criminological Psychol. 2008;13(1):1-9.
32. Marcuse LV, Fields MC, Yoo J. Rowan’s Primer of EEG. 2nd ed. Edinburgh, Scotland, United Kingdom: Elsevier; 2016.
33. Farwell LA, Donchin E. The truth will out: interrogative polygraphy (“lie detection”) with event-related brain potentials. Psychophysiology. 1991;28(5):531-547.
34. Sur S, Sinha VK. Event-related potential: an overview. Ind Psychiatry J. 2009;18(1):70-73.
35. Polich J. Updating P300: an integrative theory of P3a and P3b. Clin Neurophysiol. 2007;118(10):2128-2148.
36. Mertens R, Allen JJB. The role of psychophysiology in forensic assessments: deception detection, ERPs, and virtual reality mock crime scenarios. Psychophysiology. 2008;45(2):286-298.
37. Rosenfeld JP, Labkovsky E. New P300-based protocol to detect concealed information: resistance to mental countermeasures against only half the irrelevant stimuli and a possible ERP indicator of countermeasures. Psychophysiology. 2010;47(6):1002-1010.
38. Farwell LA, Smith SS. Using brain MERMER testing to detect knowledge despite efforts to conceal. J Forensic Sci. 2001;46(1):135-143.
The US Department of Defense (DoD) and law enforcement agencies around the country utilize polygraph as an aid in security screenings and interrogation. It is assumed that a person being interviewed will have a visceral response when attempting to deceive the interviewer, and that this response can be detected by measuring the change in vital signs between questions. By using vital signs as an indirect measurement of deception-induced stress, the polygraph machine may provide a false positive or negative result if a patient has an inherited or acquired condition that affects the autonomic nervous system (ANS).
A variety of diseases from alcohol use disorder to rheumatoid arthritis can affect the ANS. In addition, a multitude of commonly prescribed drugs can affect the ANS. Although in their infancy, functional magnetic resonance imaging (fMRI) and EEG (electroencephalogram) deception detection techniques circumvent these issues. Dysautonomias may be an underappreciated cause of error in polygraph interpretation. Polygraph examiners and DoD agencies should be aware of the potential for these disorders to interfere with interpretation of results. In the near future, other modalities that do not measure autonomic variables may be utilized to avoid these pitfalls.
Polygraphy
Throughout history, humans have been interested in techniques and devices that can discern lies from the truth. Even in the ancient era, it was known that the act of lying had physiologic effects. In ancient Israel, if a woman accused of adultery should develop a swollen abdomen after drinking “waters of bitterness,” she was considered guilty of the crime, as described in Numbers 5:11-31. In Ancient China, those accused of fraud would be forced to hold dry rice in their mouths; if the expectorated rice was dry, the suspect was found guilty.1 We now know that catecholamines, particularly epinephrine, secreted during times of stress, cause relaxation of smooth muscle, leading to reduced bowel motility and dry mouth.2-4 However, most methods before the modern era were based more on superstition and chance rather than any sound physiologic premise.
When asked to discern the truth from falsehood based on their own perceptions, people correctly discern lies as false merely 47% of the time and truth as nondeceptive about 61% of the time.5 In short, unaided, we are very poor lie detectors. Therefore, a great deal of interest in technology that can aid in lie detection has ensued. With enhanced technology and understanding of human physiology came a renewed interest in lie detection. Since it was known that vital signs such as blood pressure (BP), heart rate, and breathing could be affected by the stressful situation brought on by deception, quantifying and measuring those responses in an effort to detect lying became a goal. In 1881, the Italian criminologist Cesare Lombroso invented a glove that when worn by a suspect, measured their BP.6-8 Changes in BP also were the target variable of the systolic BP deception test invented by William M. Marston, PhD, in 1915.8 Marston also experimented with measurements of other variables, such as muscle tension.9 In 1921, John Larson invented the first modern polygraph machine.7
Procedures
Today’s polygraph builds on these techniques. A standard polygraph measures respiration, heart rate, BP, and sudomotor function (sweating). Respiration is measured via strain gauges strapped around the chest and abdomen that respond to chest expansion during inhalation. BP and pulse can be measured through a variety of means, including finger pulse measurement or sphygmomanometer.8
Perspiration is measured by skin electrical conductance. Human sweat contains a variety of cations and anions—mostly sodium and chloride, but also potassium, bicarbonate, and lactate. The presence of these electrolytes alter electrical conduction at the skin surface when sweat is released.10
The exact questioning procedure used to perform a polygraph examination can vary. The Comparison Question Test is most commonly used. In this format, the interview consists of questions that are relevant to the investigation at hand, interspersed with control questions. The examiner compares the changes in vital signs and skin conduction to the baseline measurements generated during the pretest interview and during control questions.8 Using these standardized techniques, some studies have shown accuracy rates between 83% and 95% in controlled settings.8 However, studies performed outside of the polygraph community have found very high false positive rates, up to 50% or greater.11
The US Supreme Court has ruled that individual jurisdictions can decide whether or not to admit polygraph evidence in court, and the US Court of Appeals for the Eleventh Circuit has ruled that polygraph results are only admissible if both parties agree to it and are given sufficient notice.12,13 Currently, New Mexico is the only state that allows polygraph results to be used as evidence without a pretrial agreement; all other states either require such an agreement or forbid the results to be used as evidence.14
Although rarely used in federal and state courts as evidence, polygraphy is commonly used during investigations and in the hiring process of government agencies. DoD Directive 5210.48 and Instruction 5210.91 enable DoD investigative organizations (eg, Naval Criminal Investigative Service, National Security Agency, US Army Investigational Command) to use polygraph as an aid during investigations into suspected involvement with foreign intelligence, terrorism against the US, mishandling of classified documents, and other serious violations.15
The Role of the Physician in Polygraph Assessment
It may be rare that the physician is called upon to provide information regarding an individual’s medical condition or related medication use and the effect of these on polygraph results. In such cases, however, the physician must remember the primary fiduciary duty to the patient. Disclosure of medical conditions cannot be made without the patient’s consent, save in very specific situations (eg, Commanding Officer Inquiry, Tarasoff Duty to Protect, etc). It is the polygraph examiner’s responsibility to be aware of potential confounders in a particular examination.10
Physicians can have a responsibility when in administrative or supervisory positions, to advise security and other officials regarding the fitness for certain duties of candidates with whom there is no physician-patient relationship. This may include an individual’s ability to undergo polygraph examination and the validity of such results. However, when a physician-patient relationship is involved, care must be given to ensure that the patient understands that the relationship is protected both by professional standards and by law and that no information will be shared without the patient’s authorization (aside from those rare exceptions provided by law). Often, a straightforward explanation to the patient of the medical condition and any medication’s potential effects on polygraph results will be sufficient, allowing the patient to report as much as is deemed necessary to the polygraph examiner.
Polygraphy Pitfalls
Polygraphy presupposes that the subject will have a consistent and measurable physiologic response when he or she attempts to deceive the interviewer. The changes in BP, heart rate, respirations, and perspiration that are detected by polygraphy and interpreted by the examiner are controlled by the ANS (Table 1). There are a variety of diseases that are known to cause autonomic dysfunction (dysautonomia). Small fiber autonomic neuropathies often result in loss of sweating and altered heart rate and BP variation and can arise from many underlying conditions. Synucleinopathies, such as Parkinson disease, alter cardiovascular reflexes.14,16
Even diseases not commonly recognized as having a predominant clinical impact on ANS function can demonstrate measurable physiologic effect. For example, approximately 60% of patients with rheumatoid arthritis will have blunted cardiovagal baroreceptor responses and heart rate variability.17 ANS dysfunction is also a common sequela of alcoholism.18 Patients with diabetes mellitus often have an elevated resting heart rate and low heart rate variability due to dysregulated β-adrenergic activity.19 The impact of reduced baroreceptor response and reduced heart rate variability could impact the polygraph interpreter’s ability to discern responses using heart rate. Individuals with ANS dysfunction that causes blunted physiologic responses could have inconclusive or potentially worse false-negative polygraph results due to lack of variation between control and target questions.
To our knowledge, no study has been performed on the validity of polygraphy in patients with any form of dysautonomia. Additionally, a 2011 process and compliance study of the DoD polygraph program specifically recommended that “adjudicators would benefit from training in polygraph capabilities and limitations.”20 Although specific requirements vary from program to program, all programs accredited by the American Polygraph Association provide training in physiology, psychology, and standardization of test results.
Many commonly prescribed medications have effects on the ANS that could influence the results of a polygraph exam (Table 2). For example, β blockers reduce β adrenergic receptor activation in cardiac muscle and blood vessels, reducing heart rate, heart rate variability, cardiac contractility, and BP.21 This class of medication is prescribed for a variety of conditions, including congestive heart failure, hypertension, panic disorder, and posttraumatic stress disorder. Thus, a patient taking β blockers may have a blunted physiologic response to stress and an increased likelihood of an inconclusive or false-negative polygraph exam.
Some over-the-counter medications also affect autonomic function. Sympathomimetics such as pseudoephedrine and antihistamines with anticholinergic activity, such as diphenhydramine, can increase both heart rate and BP.22,23 Of the 10 most prescribed medications of 2016, 5 have direct effects on the ANS or on the variables measured by the polygraph machine.24 An exhaustive list of medication effects on autonomic function is beyond the scope of this article.
Mefloquine, an antimalarial drug used by military personnel deployed to malaria-endemic regions, is of special interest to the DoD and military because it may affect polygraph results.25 In murine models, mefloquine has been shown to disrupt autonomic and respiratory control in the central nervous system.26 The neuropsychiatric adverse effects of mefloquine are well documented and can last for years after exposure to the drug.27 Therefore, mefloquine could affect the results of a polygraph test both through direct toxic effects on the ANS and by causing anxiety and depression, potentially affecting the subject’s response to questioning.
Alternative Modalities
Given the pitfalls inherent in external physiologic measures for lie detection, additional modalities that bypass measurement of ANS-governed responses have been sought. Indeed, the integration of these more comprehensive modalities has come to be known as forensic credibility assessment.
Functional MRI
Beginning in 1991, researchers began using fMRI to observe real-time perfusion changes in areas of the cerebral cortex between rest and mental stimulation.26 This modality provides a noninvasive technique for identifying which parts of the brain are activated during a task. When a subject is engaged in active deception, the dorsolateral prefrontal cortex has greater perfusion than when the subject is telling the truth.28 Because fMRI evaluates the central nervous system directly, it avoids the potential inaccuracies seen in some subjects with autonomic irregularities. In fact, fMRI may have superior sensitivity and specificity for lie detection compared with conventional polygraphy.29
Significant limitations to the use of fMRI include the necessity of expensive specialized equipment and trained personnel to operate the MRI. Agencies that use polygraph examinations may be unwilling to make such an investment. Further, subjects with metallic foreign bodies or noncompatible medical implants cannot undergo the MRI procedure. Finally, there have been bioethical and legal concerns raised that measuring brain activity during interrogation may endanger “cognitive freedom” and may even be considered unreasonable search and seizure under the Fourth Amendment to the US Constitution.30 However, fMRI—like polygraphy—can only measure the difference between brain perfusion in 2 states. The idea of fMRI as “mind reading” is largely a misconception.31
Electroencephalography
Various EEG modalities have received increased interest for lie detection. In EEG, electrodes measure the summation of a multitude of postsynaptic potentials and the local voltage gradient they produce when cortical pyramidal neurons fire in synchrony.32 These voltage gradients are detectable at the scalp surface. Shortly after the invention of EEG, it was observed that specific stimuli generated unique and predictable changes in EEG morphology. These event-related potentials (ERP) are detectable by scalp EEG shortly after the stimulus is given.33
ERPs can be elicited by a multitude of sensory stimuli, have a predictable and reproducible morphology, and are believed to be a psychophysiologic correlate of the mental processing of stimuli.34 The P300 is an ERP characterized by a positive change in voltage occurring 300 milliseconds after a stimulus; it is associated with stimulus processing and categorization.35 Because deception is a complex cognitive process that involves recognizing pertinent stimuli and inventing false responses to them, it was theorized that detection of a P300 ERP during an interview would indicate that the subject truly recognizes the stimulus and is denying such knowledge. Early studies of P300 had variable accuracy for lie detection, roughly 40% to 80%, depending on the study. Moreover, the rate of false negatives increased if subjects were coached on countermeasures, such as increasing the significance of distractor data or counting backward by 7s.36,37 Later studies have found ways of minimizing these issues, such as detection of a P900 ERP (a cortical potential at 900 milliseconds) that can be seen when subjects are attempting countermeasures.38
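To make the P300 concept concrete, the following is a minimal, hypothetical sketch (synthetic data only, not drawn from any cited study) of how a P300-style effect might be quantified: stimulus-locked EEG epochs are averaged, and the mean amplitude in a window around 300 milliseconds is compared between probe and irrelevant items. The sampling rate, measurement window, and amplitudes are illustrative assumptions.

```python
# Illustrative only: synthetic epochs, not real EEG or a validated protocol.
import numpy as np

fs = 250                                   # assumed sampling rate, Hz
t = np.arange(-0.2, 0.8, 1 / fs)           # epoch: -200 ms to +800 ms around stimulus
rng = np.random.default_rng(0)

def synthetic_epochs(n_trials, p300_uv):
    """Gaussian background noise plus an optional positive deflection peaking near 300 ms."""
    noise = rng.normal(0, 5, size=(n_trials, t.size))             # background EEG, microvolts
    erp = p300_uv * np.exp(-((t - 0.30) ** 2) / (2 * 0.05 ** 2))  # bump centered at 300 ms
    return noise + erp

probe = synthetic_epochs(40, p300_uv=8.0)        # items the subject recognizes
irrelevant = synthetic_epochs(40, p300_uv=0.0)   # control items

window = (t >= 0.25) & (t <= 0.50)               # assumed P300 measurement window
probe_amp = probe.mean(axis=0)[window].mean()
irrel_amp = irrelevant.mean(axis=0)[window].mean()
print(f"Probe {probe_amp:.1f} uV vs irrelevant {irrel_amp:.1f} uV")
```

A larger averaged amplitude for probe items than for irrelevant items is the kind of difference the concealed-information approach looks for; real protocols add statistical testing (eg, bootstrap comparisons) rather than a simple amplitude contrast.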
Another technique for increasing accuracy in EEG-mediated lie detection is measurement of the multifaceted electroencephalographic response (MER), which involves a more detailed analysis of multiple EEG electrode sites and how their signals change over time, using both visual comparison of multiple trials and bootstrap analysis.37 In particular, memory- and encoding-related multifaceted electroencephalographic response (MERMER) testing, which couples the P300 with an electrically negative impulse recorded at the frontal lobe and phasic changes in the global EEG, was more accurate than P300 alone.37
The benefits of EEG compared with fMRI include large reductions in cost and space requirements and fewer restrictions on use (EEG is safe for virtually all patients, including those with metallic foreign bodies). However, like fMRI, EEG still requires trained personnel to operate and interpret, and it has yet to be tested outside of the laboratory.
Conclusion
The ability to detect deception is an important factor in determining security risk and adjudication of legal proceedings, but untrained persons are surprisingly poor at discerning truth from lies. The polygraph has been used by law enforcement and government agencies for decades to aid in interrogation and the screening of employees for security clearances and other types of access. However, results are vulnerable to inaccuracies in subjects with autonomic disorders and may be confounded by multiple medications. While emerging technologies such as fMRI and EEG may allow superior accuracy by bypassing ANS-based physiologic outputs, the polygraph examiner and the physician must be aware of the effect of autonomic dysfunction and of the medications that affect the ANS. This is particularly true within military medicine, as many patients within this population are subject to polygraph examination.
1. Ford EB. Lie detection: historical, neuropsychiatric and legal dimensions. Int J Law Psychiatry. 2006;29(3):159-177.
2. Ohrn PG. Catecholamine infusion and gastrointestinal propulsion in the rat. Acta Chir Scand Suppl. 1979(461):43-52.
3. Sakamoto H. The study of catecholamine, acetylcholine and bradykinin in buccal circulation in dogs. Kurume Med J. 1979;26(2):153-162.
4. Bond CF Jr, Depaulo BM. Accuracy of deception judgments. Pers Soc Psychol Rev. 2006;10(3):214-234.
5. Vicianova M. Historical techniques of lie detection. Eur J Psychol. 2015;11(3):522-534.
6. Matté JA. Forensic Psychophysiology Using the Polygraph: Scientific Truth Verification, Lie Detection. Williamsville, NY: JAM Publications; 2012.
7. Segrave K. Lie Detectors: A Social History. Jefferson, NC: McFarland & Company; 2004.
8. Nelson R. Scientific basis for polygraph testing. Polygraph. 2015;44(1):28-61.
9. Boucsein W. Electrodermal Activity. New York, NY: Springer Publishing; 2012.
10. US Congress, Office of Technology Assessment. Scientific validity of polygraph testing: a research review and evaluation. https://ota.fas.org/reports/8320.pdf. Published 1983. Accessed June 12, 2019.
11. United States v Scheffer, 523 US 303 (1998).
12. United States v Piccinonna, 729 F Supp 1336 (SD Fl 1990).
13. Fridman DS, Janoe JS. The state of judicial gatekeeping in New Mexico. https://cyber.harvard.edu/daubert/nm.htm. Updated April 17, 1999. Accessed May 20, 2019.
14. Gibbons CH. Small fiber neuropathies. Continuum (Minneap Minn). 2014;20(5 Peripheral Nervous System Disorders):1398-1412.
15. US Department of Defense. Directive 5210.48: Credibility assessment (CA) program. https://fas.org/irp/doddir/dod/d5210_48.pdf. Updated February 12, 2018. Accessed May 30, 2019.
16. Postuma RB, Gagnon JF, Pelletier A, Montplaisir J. Prodromal autonomic symptoms and signs in Parkinson’s disease and dementia with Lewy bodies. Mov Disord. 2013;28(5):597-604.
17. Adlan AM, Lip GY, Paton JF, Kitas GD, Fisher JP. Autonomic function and rheumatoid arthritis: a systematic review. Semin Arthritis Rheum. 2014;44(3):283-304.
18. Di Ciaula A, Grattagliano I, Portincasa P. Chronic alcoholics retain dyspeptic symptoms, pan-enteric dysmotility, and autonomic neuropathy before and after abstinence. J Dig Dis. 2016;17(11):735-746.
19. Thaung HA, Baldi JC, Wang H, et al. Increased efferent cardiac sympathetic nerve activity and defective intrinsic heart rate regulation in type 2 diabetes. Diabetes. 2015;64(8):2944-2956.
20. US Department of Defense, Office of the Undersecretary of Defense for Intelligence. Department of Defense polygraph program process and compliance study: study report. https://fas.org/sgp/othergov/polygraph/dod-poly.pdf. Published December 19, 2011. Accessed May 20, 2019.
21. Ladage D, Schwinger RH, Brixius K. Cardio-selective beta-blocker: pharmacological evidence and their influence on exercise capacity. Cardiovasc Ther. 2013;31(2):76-83.
22. D’Souza RS, Mercogliano C, Ojukwu E, et al. Effects of prophylactic anticholinergic medications to decrease extrapyramidal side effects in patients taking acute antiemetic drugs: a systematic review and meta-analysis. Emerg Med J. 2018;35:325-331.
23. Gheorghiev MD, Hosseini F, Moran J, Cooper CE. Effects of pseudoephedrine on parameters affecting exercise performance: a meta-analysis. Sports Med Open. 2018;4(1):44.
24. Frellick M. Top-selling, top-prescribed drugs for 2016. https://www.medscape.com/viewarticle/886404. Published October 2, 2017. Accessed May 20, 2019.
25. Lall DM, Dutschmann M, Deuchars J, Deuchars S. The anti-malarial drug mefloquine disrupts central autonomic and respiratory control in the working heart brainstem preparation of the rat. J Biomed Sci. 2012;19:103.
26. Ritchie EC, Block J, Nevin RL. Psychiatric side effects of mefloquine: applications to forensic psychiatry. J Am Acad Psychiatry Law. 2013;41(2):224-235.
27. Belliveau JW, Kennedy DN Jr, McKinstry RC, et al. Functional mapping of the human visual cortex by magnetic resonance imaging. Science. 1991;254(5032):716-719.
28. Ito A, Abe N, Fujii T, et al. The contribution of the dorsolateral prefrontal cortex to the preparation for deception and truth-telling. Brain Res. 2012;1464:43-52.
29. Langleben DD, Hakun JG, Seelig D. Polygraphy and functional magnetic resonance imaging in lie detection: a controlled blind comparison using the concealed information test. J Clin Psychiatry. 2016;77(10):1372-1380.
30. Boire RG. Searching the brain: the Fourth Amendment implications of brain-based deception detection devices. Am J Bioeth. 2005;5(2):62-63; discussion W5.
31. Langleben DD. Detection of deception with fMRI: Are we there yet? Legal Criminological Psychol. 2008;13(1):1-9.
32. Marcuse LV, Fields MC, Yoo J. Rowan's Primer of EEG. 2nd ed. Edinburgh, Scotland, United Kingdom: Elsevier; 2016.
33. Farwell LA, Donchin E. The truth will out: interrogative polygraphy (“lie detection”) with event-related brain potentials. Psychophysiology. 1991;28(5):531-547.
34. Sur S, Sinha VK. Event-related potential: an overview. Ind Psychiatry J. 2009;18(1):70-73.
35. Polich J. Updating P300: an integrative theory of P3a and P3b. Clin Neurophysiol. 2007;118(10):2128-2148.
36. Mertens R, Allen JJB. The role of psychophysiology in forensic assessments: deception detection, ERPs, and virtual reality mock crime scenarios. Psychophysiology. 2008;45(2):286-298.
37. Rosenfeld JP, Labkovsky E. New P300-based protocol to detect concealed information: resistance to mental countermeasures against only half the irrelevant stimuli and a possible ERP indicator of countermeasures. Psychophysiology. 2010;47(6):1002-1010.
38. Farwell LA, Smith SS. Using brain MERMER testing to detect knowledge despite efforts to conceal. J Forensic Sci. 2001;46(1):135-143.
Enoxaparin vs Continuous Heparin for Periprocedural Bridging in Patients With Atrial Fibrillation and Advanced Chronic Kidney Disease
There has been long-standing controversy over the use of parenteral anticoagulation for perioperative bridging in patients with atrial fibrillation (AF) undergoing elective surgery.1 The decision to bridge depends on the patient’s risk of thromboembolic complications and susceptibility to bleeding.1 The BRIDGE trial showed that forgoing perioperative bridging was noninferior to bridging with low molecular weight heparin (LMWH) with respect to stroke and embolic events.2 However, according to the American College of Chest Physicians (CHEST) 2012 guidelines, patients in the BRIDGE trial would be deemed low risk for thromboembolic events, as reflected by a mean CHADS2 (congestive heart failure [CHF], hypertension, age, diabetes mellitus, and stroke/transient ischemic attack) score of 2.3. Also, the BRIDGE study and many others excluded patients with advanced forms of chronic kidney disease (CKD).2,3
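For reference, the CHADS2 score mentioned above is a simple additive index: 1 point each for CHF, hypertension, age ≥ 75 years, and diabetes mellitus, and 2 points for prior stroke or transient ischemic attack. The sketch below is a generic, hypothetical illustration of that scoring and is not code from the study.

```python
# Hypothetical illustration of CHADS2 scoring; not study code.
def chads2(chf: bool, htn: bool, age: int, dm: bool, prior_stroke_tia: bool) -> int:
    """1 point each for CHF, hypertension, age >= 75 y, and diabetes; 2 points for stroke/TIA."""
    return (int(chf) + int(htn) + int(age >= 75) + int(dm)
            + 2 * int(prior_stroke_tia))

# Example: CHF + hypertension + age 78 + prior TIA -> score of 5
print(chads2(chf=True, htn=True, age=78, dm=False, prior_stroke_tia=True))
```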
Similar to patients with AF, patients with advanced CKD (ACKD, stage 4 and 5 CKD) have an increased risk of stroke and venous thromboembolism (VTE).4,5 Patients with AF and ACKD have not been adequately studied for perioperative anticoagulation bridging outcomes. Although unfractionated heparin (UFH) is preferred over LMWH in patients with ACKD, enoxaparin can be used in this population.1,6 Enoxaparin 1 mg/kg once daily is approved by the US Food and Drug Administration (FDA) for use in patients with severe renal insufficiency, defined as creatinine clearance (CrCl) < 30 mL/min. This dosage adjustment followed studies of enoxaparin 1 mg/kg twice daily that showed a significant increase in major and minor bleeding in patients with severe renal insufficiency (CrCl < 30 mL/min) vs patients with CrCl > 30 mL/min.7 Among patients with severe renal insufficiency and myocardial infarction (MI) in the ExTRACT-TIMI 25 trial, enoxaparin 1 mg/kg once daily showed no significant difference in nonfatal major bleeding vs UFH.8 In patients without renal impairment (no documentation of kidney disease), bridging therapy was completed within 24 hours of hospital stay more often with LMWH than with UFH, with similar rates of VTE and major bleeding.9 In addition to being suitable for outpatient administration, enoxaparin has a more predictable pharmacokinetic profile, allowing for less monitoring, and a lower incidence of heparin-induced thrombocytopenia (HIT) than UFH.6
The Michael E. DeBakey Veteran Affairs Medical Center (MEDVAMC) in Houston, Texas, is one of the largest US Department of Veterans Affairs (VA) hospitals in the US, managing > 150,000 veterans in Southeast Texas and other southern states. As a referral center for traveling patients, it is crucial that MEDVAMC decrease hospital length of stay (LOS) to increase space for incoming patients. Reducing LOS also reduces costs and may have a correlation with decreasing the incidence of nosocomial infections. Because of its significance to this facility, hospital LOS is an appropriate primary outcome for this study.
To our knowledge, bridging outcomes between LMWH and UFH in patients with AF and ACKD have never been studied. We hypothesized that using enoxaparin instead of heparin for periprocedural management would result in decreased hospital LOS, leading to a lower economic burden and lower incidence of nosocomial infections with no significant differences in major and minor bleeding and thromboembolic complications.10
Methods
This study was a single-center, retrospective chart review of adult patients from January 2008 to September 2017. The review was conducted at MEDVAMC and was approved by the research and development committee and by the Baylor College of Medicine Institutional Review Board. Formal consent was not required.
Included patients were aged ≥ 18 years with diagnoses of AF or atrial flutter and ACKD, defined as an estimated glomerular filtration rate (eGFR) < 30 mL/min/1.73 m2 calculated with the Modification of Diet in Renal Disease Study (MDRD) equation.11 Patients must have previously been on warfarin and required temporary interruption of warfarin for an elective procedure. During the interruption of warfarin therapy, patients were required to receive periprocedural anticoagulation with subcutaneous (SC) enoxaparin 1 mg/kg daily or continuous IV heparin per the MEDVAMC heparin protocol. Patients were excluded if they had experienced major bleeding in the 6 weeks prior to the elective procedure, had current thrombocytopenia (platelet count < 100 × 109/L), or had a history of HIT or a heparin allergy.
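As an illustration of the eGFR inclusion criterion, the sketch below applies the 4-variable (IDMS-traceable) MDRD study equation and flags values below 30 mL/min/1.73 m2. The function name and example values are hypothetical and are not taken from the study data.

```python
# Hypothetical sketch of the ACKD screening criterion using the 4-variable MDRD study equation.
def mdrd_egfr(scr_mg_dl: float, age_years: int, female: bool, black: bool) -> float:
    """IDMS-traceable MDRD study equation; result in mL/min/1.73 m2."""
    egfr = 175 * (scr_mg_dl ** -1.154) * (age_years ** -0.203)
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

egfr = mdrd_egfr(scr_mg_dl=2.8, age_years=71, female=False, black=False)  # example values
print(f"eGFR = {egfr:.1f} mL/min/1.73 m2; meets ACKD criterion: {egfr < 30}")
```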
This patient population was identified using TheraDoc Clinical Surveillance Software System (Charlotte, NC), which has prebuilt alert reviews for anticoagulation medications, including enoxaparin and heparin. An alert for patients on enoxaparin with serum creatinine (SCr) > 1.5 mg/dL was used to screen patients who met the inclusion criteria. A second alert identified patients on heparin. The VA Computerized Patient Record System (CPRS) was used to collect patient data.
Economic Analysis
An economic analysis was conducted using data from the VA Managerial Cost Accounting Reports. Data on the national average cost per bed day were used so that this information could be extrapolated to multiple VA institutions.12 National average cost per day was determined by dividing the total cost by the number of bed days for the identified treating specialty during the fiscal period of 2018. Average cost per day data included costs for bed day, surgery, radiology services, laboratory tests, pharmacy services, treatment location (ie, intensive care units [ICUs]), and all other costs associated with an inpatient stay. A cost analysis was performed using this average cost per bed day and the mean LOS between enoxaparin and UFH for each treating specialty. The major outcome of the cost analysis was the total cost per average inpatient stay. The national average cost per bed day for each treating specialty was multiplied by the average LOS found for each treating specialty in this study; the sum of all the average costs per inpatient stay for the treating specialties resulted in the total cost per average inpatient stay. Permission to use these data was granted by the Pharmacy and Critical Care Services at MEDVAMC.
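In other words, the total cost per average inpatient stay is a weighted sum: each treating specialty’s national average cost per bed day multiplied by that specialty’s mean LOS, summed across specialties. A minimal sketch follows; the specialty names, per-day costs, and LOS values are hypothetical placeholders rather than the FY2018 figures used in the study.

```python
# Hypothetical placeholders; the study used FY2018 national VA averages by treating specialty.
avg_cost_per_bed_day = {"thoracic": 5200.0, "vascular": 4800.0, "general medicine": 3100.0}  # $
mean_los_days = {"thoracic": 4.5, "vascular": 2.0, "general medicine": 3.5}                  # days

total_cost_per_avg_stay = sum(
    avg_cost_per_bed_day[specialty] * mean_los_days[specialty] for specialty in mean_los_days
)
print(f"Total cost per average inpatient stay: ${total_cost_per_avg_stay:,.0f}")
```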
Patient Demographics and Characteristics
Data were collected on patient demographics (Table 1). Nosocomial infections, stroke/transient ischemic attack, MI, VTE, major and minor bleeding, and death are defined in Table 2.
The primary outcome of the study was hospital LOS. The study was powered at 90% for α = .05, which gives a required study population of 114 (1:1 enrollment ratio) patients to determine a statistically significant difference in hospital stay. This sample size was calculated using the mean hospital LOS (the primary objective) in the REGIMEN registry for LMWH (4.6 days) and UFH (10.3 days).9 To our knowledge, the incidence of nosocomial infections (a secondary outcome) has not been studied in this patient population; therefore, there was no basis to assess an appropriate sample size to find a difference in this outcome. Furthermore, the goal was to collect as many patients as possible to best assess this variable. Because of an expected high exclusion rate, 504 patients were reviewed to target a sample size of 120 patients. Due to the single-center nature of this review, the secondary outcomes of thromboembolic complications and major and minor bleeding were expected to be underpowered.
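The required sample size quoted above can be reproduced in outline with a standard two-sample power calculation; a sketch using statsmodels follows. The REGIMEN mean LOS values come from the text, but the pooled standard deviation is an assumption made here for illustration (the article does not report the value used), so the resulting n is only approximate.

```python
# Illustrative power calculation; the pooled SD is an assumption, not a reported value.
from statsmodels.stats.power import TTestIndPower

mean_lmwh, mean_ufh = 4.6, 10.3   # REGIMEN registry mean hospital LOS, days
assumed_pooled_sd = 9.5           # hypothetical assumption for illustration

effect_size = abs(mean_ufh - mean_lmwh) / assumed_pooled_sd
n_per_arm = TTestIndPower().solve_power(
    effect_size=effect_size, alpha=0.05, power=0.90, ratio=1.0, alternative="two-sided"
)
print(f"Approximately {round(n_per_arm)} patients per arm at 1:1 enrollment")
```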
The final analysis compared the enoxaparin arm with the UFH arm. Univariate differences between the treatment groups were compared using the Fisher exact test for categorical variables. Demographic data and other continuous variables were analyzed with an unpaired t test to compare means between the 2 arms. Outcomes and characteristics were deemed statistically significant when the P value was < .05 (α = .05). All P values reported were 2-tailed, and 95% CIs were used. No statistical analysis was performed for the cost differences (based on LOS per treating specialty) between the 2 treatment arms. Statistical analyses were completed using GraphPad software (San Diego, CA).
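For transparency about how continuous outcomes such as LOS are compared, here is a minimal SciPy sketch of an unpaired, 2-tailed t test. The arrays are hypothetical placeholders, not the study data.

```python
# Hypothetical data; illustrates the unpaired, 2-tailed t test used for continuous outcomes.
from scipy.stats import ttest_ind

los_enoxaparin_days = [8, 12, 9, 11, 10, 13, 7]    # placeholder values, days
los_ufh_days = [15, 20, 14, 18, 22, 16, 17, 19]    # placeholder values, days

t_stat, p_value = ttest_ind(los_enoxaparin_days, los_ufh_days)  # 2-tailed by default
print(f"t = {t_stat:.2f}, P = {p_value:.3f}")
```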
Results
In total, 50 patients were analyzed in the study. Thirty-six patients were bridged with IV UFH at a concentration of 25,000 U/250 mL with an initial infusion rate of 12 U/kg/h. In the other arm, 14 patients were anticoagulated with renally dosed enoxaparin 1 mg/kg/d, with an average daily dose of 89.3 mg; the mean actual body weight in this group was 90.9 kg, consistent with the enoxaparin daily dose. Physicians on the primary team decided which parenteral anticoagulant to use. The difference in mean duration of inpatient parenteral anticoagulation between the groups was not statistically significant: 7.1 days with enoxaparin and 9.6 days with UFH (P = .19). Patients in the enoxaparin arm were off warfarin therapy for an average of 6.0 days vs 7.5 days in the UFH group (P = .29). The duration of outpatient anticoagulation with enoxaparin was not analyzed in this study.
Patient and Procedure Characteristics
All patients had AF or atrial flutter, with 86% of patients (n = 43) having a CHADS2 score > 2 and 48% (n = 29) having a CHA2DS2VASc score > 4. Overall, the mean age was 71.3 years, with similar ethnicity distributions. Patients had multiple comorbidities, as shown by a mean Charlson Comorbidity Index (CCI) of 7.7, and an increased risk of bleeding, as evidenced by 98% (n = 48) of patients having a HAS-BLED score ≥ 3. A greater percentage of patients bridged with enoxaparin had diabetes mellitus (DM), a history of stroke and MI, and a heart valve, whereas UFH patients were more likely to be in stage 5 CKD (eGFR < 15 mL/min/1.73 m2), with a significantly lower mean eGFR (16.76 vs 22.64 mL/min/1.73 m2, P = .03). Furthermore, there were more patients on hemodialysis in the UFH arm (50%) than in the enoxaparin arm (21%) and a lower mean CrCl with UFH (20.1 mL/min) compared with enoxaparin (24.9 mL/min); however, the differences in hemodialysis use and mean CrCl were not statistically significant. There were no patients on peritoneal dialysis in this review.
Procedure Characteristics
The average Revised Cardiac Risk Index (RCRI) score was about 3, placing these patients in class IV, which corresponds to an 11% risk of a perioperative cardiac event (Table 3). Nineteen patients (38%) elected for a major surgery, and all but 1 of the surgeries (major or minor) were invasive. The average length of surgery was 1.2 hours, and cardiothoracic procedures were the most common (38%). Two of 14 (14%) patients on enoxaparin were able to have surgery as an outpatient, whereas this did not occur in patients on UFH. The procedures completed for these 2 patients were a colostomy (minor surgery) and an arteriovenous graft repair (major surgery). There were no statistically significant differences in types of procedures between the 2 arms.
Outcomes
The primary outcome of this study, hospital LOS, differed significantly in the enoxaparin arm vs UFH: 10.2 days vs 17.5 days, P = .04 (Table 4). The time-to-discharge from initiation of parenteral anticoagulation was significantly reduced with enoxaparin (7.1 days) compared with UFH (11.9 days); P = .04. Although also reduced in the enoxaparin arm, ICU LOS did not show statistical significance (1.1 days vs 4.0 days, P = .09).
About 36% (n = 18) of patients in this study acquired an infection during hospitalization for elective surgery. The most common microorganism and site of infection were Enterococcus species and the urinary tract, respectively (Table 5). Nearly half (44%, n = 16) of the patients in the UFH group had a nosocomial infection vs 14% (n = 2) of enoxaparin-bridged patients, a difference approaching significance (P = .056). Both patients in the enoxaparin group had the urinary tract as the primary source of infection; 1 of these patients had a urologic procedure.
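As a cross-check, the reported counts imply a 2 × 2 table (UFH: 16 of 36 with infection; enoxaparin: 2 of 14), and under that assumption the 2-tailed Fisher exact test gives a P value of approximately .056, matching the reported figure.

```python
# 2x2 table assumed from the reported counts: rows are arms, columns are infected / not infected.
from scipy.stats import fisher_exact

table = [[16, 36 - 16],   # UFH
         [2, 14 - 2]]     # enoxaparin
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.1f}, P = {p_value:.3f}")  # P is approximately .056
```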
Major bleeding occurred in 7% (n = 1) of enoxaparin patients vs 22% (n = 8) in the UFH arm, but this was not found to be statistically significant (P = .41). Minor bleeding was similar between enoxaparin and UFH arms (14% vs 19%, P = .99). Regarding thromboembolic complications, the enoxaparin group (0%) had a numerical reduction compared to UFH (11%) with VTE (n = 4) being the only occurrence of the composite outcome (P = .57). There were 4 deaths within 30 days posthospitalization—all were from the UFH group (P = .57). Due to the small sample size of this study, these outcomes (bleeding and thrombotic events) were not powered to detect a statistically significant difference.
Economic Analysis
The average cost differences (Table 6) of hospitalization between enoxaparin and UFH were calculated by multiplying the average LOS per treating specialty by the national average VA cost for an inpatient bed day in 2018.12 The treating specialty with the longest average LOS in the enoxaparin arm was thoracic (4.7 days). The UFH arm also had a long average LOS for the thoracic specialty (6.4 days); however, the vascular specialty (6.7 days) had the longest average LOS in that group. With a mean LOS of 10.2 days in the enoxaparin arm, further stratified by treating specialty, the total cost per average inpatient stay was calculated as $51,710. Patients in the UFH arm had a total cost per average inpatient stay of $92,848.
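A quick arithmetic check of the difference between these two totals (cited again in the Discussion) follows; this is simple verification of the reported figures, not new data.

```python
# Verify the absolute and relative cost difference implied by the reported totals.
enoxaparin_total, ufh_total = 51_710, 92_848  # $ per average inpatient stay
difference = ufh_total - enoxaparin_total
print(f"${difference:,} lower with enoxaparin ({difference / ufh_total:.0%} reduction)")
# Prints: $41,138 lower with enoxaparin (44% reduction)
```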
Monitoring
Anti-factor Xa levels for LMWH monitoring were not analyzed in this study because of a lack of collected values; only 1 patient had an anti-factor Xa level checked during the study time frame. Infusion rates of UFH were adjusted based on activated partial thromboplastin time (aPTT) levels collected per the MEDVAMC inpatient anticoagulation protocol. The average percentage of aPTT values in the therapeutic range was 46.3%, and the mean (SD) time-to-therapeutic range was about 2.4 (1.3) days. Because of this study’s retrospective nature, there were inconsistencies in the availability of documentation of UFH infusion rates. For this reason, these values were not analyzed further.
Discussion
In 2017, the American College of Cardiology published the Periprocedural Anticoagulation Expert Consensus Pathway, which recommends that patients with AF at low risk (CHA2DS2VASc 1-4) of thromboembolism not be bridged (unless the patient had a prior VTE or stroke/TIA).13 Nearly half the patients in this study were classified as moderate-to-high thrombotic risk, as evidenced by a CHA2DS2VASc > 4, with a mean score of 4.8. Given this study’s retrospective design spanning 2008 to 2017, many clinicians may have referenced the 2008 CHEST antithrombotic guidelines when deciding to bridge patients; those guidelines and the previous MEDVAMC anticoagulation protocol recommend bridging patients with AF and a CHADS2 score > 2 (moderate-to-high thrombotic risk), criteria met by all but 1 patient in this study.1,14 In contrast to the landmark BRIDGE trial, the mean CHADS2 score in this study was 3.6, indicating that our patient population was at increased risk of stroke and embolism.
In addition to thromboembolic complications, patients in the current study were also at increased risk of clinically relevant bleeding, with a mean HAS-BLED score of 4.1 and nearly all patients having a score ≥ 3. The complexity of the veteran population also was reflected in this study’s mean CCI (7.7) and RCRI (3.0), indicating a 0% estimated 10-year survival and an 11% risk of a perioperative cardiac event, respectively. A mean CCI of 7.7 is associated with a 13.3 relative risk of death within 6 years postoperation.15 All patients had a diagnosis of hypertension, and > 75% had this diagnosis complicated by DM. In addition, these patients had extensive cardiovascular disease or risk factors, making the population clinically relevant to the question of which patients require periprocedural bridging.
Another positive aspect of this study is that all baseline characteristics, apart from renal function, were similar between arms, strengthening the ability to compare the 2 bridging modalities. We assume that more stage 5 CKD and dialysis patients were anticoagulated with UFH rather than enoxaparin because of concern for an increased risk of bleeding with a medication whose renal clearance is reduced by about 30% when CrCl is < 30 mL/min.16 Although enoxaparin 1 mg/kg/d is FDA approved as a therapeutic anticoagulant option, clinicians at MEDVAMC likely had reservations about its use in end-stage CKD patients. Unlike many studies, including the BRIDGE trial, this review did not exclude patients with ACKD, and the outcomes with enoxaparin are available for interpretation.
Unsurprisingly, for patients included in this study, enoxaparin use led to a shorter hospital LOS, a reduced ICU LOS, and a quicker time-to-discharge from initiation of bridging. This is credited to the 100% bioavailability of SC enoxaparin along with its suitability as an outpatient therapeutic option.16 Unlike with IV UFH, patients requiring bridging can be discharged on SC injections of enoxaparin until a therapeutic INR is maintained with warfarin. Hospital LOS in both arms was longer in this study than in other studies,9 which may reflect clinicians being more cautious with renally insufficient patients as well as the multiple comorbidities of the patients included in this study. According to an economic analysis performed by Amorosi and colleagues in 2004, bridging with enoxaparin instead of UFH can save up to $3,733 per patient and reduce bridging costs by 63% to 85%, driven primarily by decreased hospital LOS.10
Economic Outcome
In our study, we conducted a cost analysis using national VA data that indicated a $41,138 or 44% reduction in total cost per average inpatient stay when bridging 1 patient with enoxaparin vs UFH. The benefit of this cost analysis is that it reflects direct costs at VA institutions nationally; this will allow these data to be useful for practitioners at MEDVAMC and other VA hospitals. Stratifying the costs by treating specialty instead of treatment location minimized skewing of the data as there were some patients with long LOS in the ICU. No patients in the enoxaparin arm were treated in otolaryngology, which may have skewed the data. The data included direct costs for beds as well as costs for multiple services, such as procedures, pharmacy, nursing, laboratory tests, and imaging. Unlike the Amorosi study, our review did not include acquisition costs for enoxaparin syringes and bags of UFH or laboratory costs for aPTT and anti-factor Xa levels in part because of the data source and the difficulty calculating costs over a 10-year span.
Patients in the enoxaparin arm had a trend toward fewer hospital-acquired infections than those in the UFH arm, which we believe is due to a decreased LOS (in both total hospital and ICU days) and fewer blood draws needed for monitoring. It also may be attributable to a longer mean duration of surgery in the UFH arm (1.3 hours) vs the enoxaparin arm (0.9 hours). The percentage of patients with procedures ≥ 45 minutes and the types of procedures were similar between arms. However, these differences were not statistically significant. In addition, elderly males who are hospitalized may require a urinary catheter (due to urinary retention), and catheter-associated urinary tract infection (CAUTI) is one of the most frequently reported infections in acute care hospitals in the US. This is in line with our patient population and may be a supplementary reason for the increased incidence of infection with UFH. However, whether urinary catheters were used in these patients was not evaluated in this study.
Despite being at increased risk of a major adverse cardiovascular event (MACE), no patients in either arm had a stroke/TIA or MI within 30 days postprocedure. The only documented events were VTEs, which occurred in 4 patients, all on UFH. Four patients died in this study, all in the UFH arm. No meaningful conclusions can be drawn about the incidence of thromboembolic complications, death, or major and minor bleeding because the study was underpowered for these outcomes. Although anti-factor Xa monitoring is recommended for ACKD patients on enoxaparin, this monitoring was not routinely performed in this study. Another limitation was the inability to adequately assess the appropriateness of nurse-adjusted UFH infusion rates, largely due to the retrospective nature of this study. The variability in the percentage of aPTT values in therapeutic range and in time-to-therapeutic range was indicative of the difficulties of monitoring the safety and efficacy of UFH.
In 1991, Cruickshank and colleagues conducted a study in which a standard nomogram (similar to the MEDVAMC nomogram) for the adjustment of IV heparin was implemented at a single hospital.17 The success rate (aPTT percentage in therapeutic range) was 59.4% and average time-to-therapeutic range was about 1 day. The success rate (46.3%) and time-to-therapeutic range (2.4 days) in our study were lower and longer, respectively, than was expected. One potential reason for this discrepancy could be the differences in indication as the patients in Cruickshank and colleagues were being treated for VTE, whereas patients in our study had AF or atrial flutter. Also, there were inconsistencies in the availability of documentation of monitoring parameters for heparin due to the study time frame and retrospective design. Patients on UFH who are not within the therapeutic range in a timely manner are at greater risk of MACE and major/minor bleeding. Our study was not powered to detect these findings.
Strengths and Limitations
A significant limitation of this study was its small sample size; the study did not meet power for the primary outcome, and it is unknown whether it met power for nosocomial infections. The study also was not powered to assess other adverse events, such as thromboembolic complications, bleeding, and death. The study had an uneven number of patients in each arm, which made it more difficult to appropriately compare the 2 patient populations, and it did not report medians for patient characteristics and outcomes.
During this study’s time frame, the clinical pharmacy services at MEDVAMC were not as robust as they are now, which is why decisions about which anticoagulant to use were primarily physician based. The use of TheraDoc to identify patients posed the risk of missing patients who may not have had the appropriate laboratory tests performed (ie, SCr). Patients on UFH had a reduced eGFR compared with those on enoxaparin, which may limit extrapolation of enoxaparin’s use to end-stage renal disease. The reduced eGFR and higher number of dialysis patients in the UFH arm may have contributed to more labile INRs and bleeding outcomes. Patients on hemodialysis typically have more comorbidities and an increased risk of infection due to the frequent use of catheters and needles to access the bloodstream. In addition, potential differences in catheter use and duration between groups were not identified. Had these parameters been studied, the data may have helped explain the increased incidence of infection in the UFH arm.
Strengths of this study include a complex patient population with similar characteristics, distribution of ethnicities representative of the US population, patients at moderate-to-high thrombotic risk, the analysis of nosocomial infections, and the exclusion of patients with normal renal function or moderate CKD.
Conclusion
To our knowledge, this is the first study to compare periprocedural bridging outcomes and the incidence of nosocomial infections in patients with AF and ACKD. This review provides new evidence that, in this patient population, enoxaparin is a potential anticoagulant for reducing hospital LOS and hospital-acquired infections. Compared with UFH, bridging with enoxaparin reduced hospital LOS and anticoagulation time-to-discharge by 7 and 5 days, respectively, and decreased the incidence of nosocomial infections by 30 percentage points. Using the mean LOS per treating specialty for both arms, bridging 1 patient with AF with enoxaparin vs UFH can potentially lead to an estimated $40,000 (44%) reduction in the total cost of hospitalization. Enoxaparin also showed no statistically significant differences in mortality or adverse events (stroke/TIA, MI, VTE) vs UFH, but it is important to note that this study was not powered to find a significant difference in these outcomes. Because the mean eGFR of patients on enoxaparin was 22.6 mL/min/1.73 m2 and only 1 in 5 had stage 5 CKD, at this time we do not recommend enoxaparin for periprocedural use in stage 5 CKD or in patients on hemodialysis. Larger studies, including randomized trials, are needed in this patient population to further evaluate these outcomes and assess the use of enoxaparin in patients with ACKD.
1. Douketis JD, Spyropoulos AC, Spencer FA, et al. Perioperative management of antithrombotic therapy: Antithrombotic Therapy and Prevention of Thrombosis, 9th ed: American College of Chest Physicians Evidence-Based Clinical Practice Guidelines. Chest. 2012;141(2)(suppl):e326S-350S.
2. Douketis JD, Spyropoulos AC, Kaatz S, et al; BRIDGE Investigators. Perioperative bridging anticoagulation in patients with atrial fibrillation. N Engl J Med. 2015;373(9):823-833.
3. Hammerstingl C, Schmitz A, Fimmers R, Omran H. Bridging of chronic oral anticoagulation with enoxaparin in patients with atrial fibrillation: results from the prospective BRAVE registry. Cardiovasc Ther. 2009;27(4):230-238.
4. Dad T, Weiner DE. Stroke and chronic kidney disease: epidemiology, pathogenesis, and management across kidney disease stages. Semin Nephrol. 2015;35(4):311-322.
5. Wattanakit K, Cushman M. Chronic kidney disease and venous thromboembolism: epidemiology and mechanisms. Curr Opin Pulm Med. 2009;15(5):408-412.
6. Saltiel M. Dosing low molecular weight heparins in kidney disease. J Pharm Pract. 2010;23(3):205-209.
7. Spinler SA, Inverso SM, Cohen M, Goodman SG, Stringer KA, Antman EM; ESSENCE and TIMI 11B Investigators. Safety and efficacy of unfractionated heparin versus enoxaparin in patients who are obese and patients with severe renal impairment: analysis from the ESSENCE and TIMI 11B studies. Am Heart J. 2003;146(1):33-41.
8. Fox KA, Antman EM, Montalescot G, et al. The impact of renal dysfunction on outcomes in the ExTRACT-TIMI 25 trial. J Am Coll Cardiol. 2007;49(23):2249-2255.
9. Spyropoulos AC, Turpie AG, Dunn AS, et al; REGIMEN Investigators. Clinical outcomes with unfractionated heparin or low-molecular-weight heparin as bridging therapy in patients on long-term oral anticoagulants: the REGIMEN registry. J Thromb Haemost. 2006;4(6):1246-1252.
10. Amorosi SL, Tsilimingras K, Thompson D, Fanikos J, Weinstein MC, Goldhaber SZ. Cost analysis of “bridging therapy” with low-molecular-weight heparin versus unfractionated heparin during temporary interruption of chronic anticoagulation. Am J Cardiol. 2004;93(4):509-511.
11. Inker LA, Astor BC, Fox CH, et al. KDOQI US commentary on the 2012 KDIGO clinical practice guideline for the evaluation and management of CKD. Am J Kidney Dis. 2014;63(5):713-735.
12. US Department of Veterans Affairs. Managerial Cost Accounting Financial User Support Reports: fiscal year 2018. https://www.herc.research.va.gov/include/page.asp?id=managerial-cost-accounting. [Source not verified.]
13. Doherty JU, Gluckman TJ, Hucker WJ, et al. 2017 ACC Expert Consensus Decision Pathway for Periprocedural Management of Anticoagulation in Patients With Nonvalvular Atrial Fibrillation: a report of the American College of Cardiology Clinical Expert Consensus Document Task Force. J Am Coll Cardiol. 2017;69(7):871-898.
14. Kearon C, Kahn SR, Agnelli G, et al. Antithrombotic therapy for venous thromboembolic disease: American College of Chest Physicians Evidence-Based Clinical Practice Guidelines (8th Edition). Chest. 2008;133(6 suppl):454S-545S.
15. Charlson M, Szatrowski TP, Peterson J, Gold J. Validation of a combined comorbidity index. J Clin Epidemiol. 1994;47(11):1245-1251.
16. Lovenox [package insert]. Bridgewater, NJ: Sanofi-Aventis; December 2017.
17. Cruickshank MK, Levine MN, Hirsh J, Roberts R, Siguenza M. A standard heparin nomogram for the management of heparin therapy. Arch Intern Med. 1991;151(2):333-337.
18. Steinberg BA, Peterson ED, Kim S, et al; Outcomes Registry for Better Informed Treatment of Atrial Fibrillation Investigators and Patients. Use and outcomes associated with bridging during anticoagulation interruptions in patients with atrial fibrillation: findings from the Outcomes Registry for Better Informed Treatment of Atrial Fibrillation (ORBIT-AF). Circulation. 2015;131(5):488-494.
19. Verheugt FW, Steinhubl SR, Hamon M, et al. Incidence, prognostic impact, and influence of antithrombotic therapy on access and nonaccess site bleeding in percutaneous coronary intervention. JACC Cardiovasc Interv. 2011;4(2):191-197.
20. Bijsterveld NR, Peters RJ, Murphy SA, Bernink PJ, Tijssen JG, Cohen M. Recurrent cardiac ischemic events early after discontinuation of short-term heparin treatment in acute coronary syndromes: results from the Thrombolysis in Myocardial Infarction (TIMI) 11B and Efficacy and Safety of Subcutaneous Enoxaparin in Non-Q-Wave Coronary Events (ESSENCE) studies. J Am Coll Cardiol. 2003;42(12):2083-2089.
Major bleeding occurred in 7% (n = 1) of enoxaparin patients vs 22% (n = 8) in the UFH arm, but this was not found to be statistically significant (P = .41). Minor bleeding was similar between enoxaparin and UFH arms (14% vs 19%, P = .99). Regarding thromboembolic complications, the enoxaparin group (0%) had a numerical reduction compared to UFH (11%) with VTE (n = 4) being the only occurrence of the composite outcome (P = .57). There were 4 deaths within 30 days posthospitalization—all were from the UFH group (P = .57). Due to the small sample size of this study, these outcomes (bleeding and thrombotic events) were not powered to detect a statistically significant difference.
Economic Analysis
The average cost differences (Table 6) of hospitalization between enoxaparin and UFH were calculated using the average LOS per treating specialty multiplied by the national average cost of the MCO for an inpatient bed day in 2018.12 The treating specialty with the longest average LOS in the enoxaparin arm was thoracic (4.7 days). The UFH arm also had a large LOS (average days) for the thoracic specialty (6.4 days); however, the vascular specialty (6.7 days) had the longest average LOS in this group. Due to a mean LOS of 10.2 days in the enoxaparin arm, which was further stratified by treating specialty, the total cost per average inpatient stay was calculated as $51,710. On the other hand, patients in the UFH arm had a total cost per average inpatient stay of $92,848.
Monitoring
Anti-factor Xa levels for LMWH monitoring were not analyzed in this study due to a lack of values collected; only 1 patient had an anti-factor Xa level checked during this time frame. Infusion rates of UFH were adjusted based on aPTT levels collected per MEDVAMC inpatient anticoagulation protocol. The average percentage of aPTT in therapeutic range was 46.3% and the mean time-to-therapeutic range (SD) was about 2.4 (1.3) days. Due to this study’s retrospective nature, there were inconsistencies with availability of documentation of UFH infusion rates. For this reason, these values were not analyzed further.
Discussion
In 2017, the American College of Cardiology published the Periprocedural Anticoagulation Expert Consensus Pathway, which recommends for patients with AF at low risk (CHA2DS2VASc 1-4) of thromboembolism to not be bridged (unless patient had a prior VTE or stroke/TIA).13 Nearly half the patients in this study, were classified as moderate-to-high thrombotic risk as evidenced by a CHA2DS2VASc > 4 with a mean score of 4.8. Due to this study’s retrospective design from 2008 to 2017, many of the clinicians may have referenced the 2008 CHEST antithrombotic guidelines when making the decision to bridge patients; these guidelines and the previous MEDVAMC anticoagulation protocol recommend bridging patients with AF with CHADS2 > 2 (moderate-to-high thrombotic risk) in which all but 1 of the patients in this study met criteria.1,14 In contrast to the landmark BRIDGE trial, the mean CHADS2 score in this study was 3.6; this is an indication that our patient population was of individuals at an increased risk of stroke and embolism.
In addition to thromboembolic complications, patients in the current study also were at increased risk of clinically relevant bleeding with a mean HAS-BLED score of 4.1 and nearly all patients having a score > 3. The complexity of the veteran population also was displayed by this study’s mean CCI (7.7) and RCRI (3.0) indicating a 0% estimated 10-year survival and a 11% increase in having a perioperative cardiac event, respectively. A mean CCI of 7.7 is associated with a 13.3 relative risk of death within 6 years postoperation.15 All patients had a diagnosis of hypertension, and > 75% had this diagnosis complicated by DM. In addition, this patient population was of those with extensive cardiovascular disease or increased risk, which makes for a clinically relevant application of patients who would require periprocedural bridging.
Another positive aspect of this study is that all the baseline characteristics, apart from renal function, were similar between arms, helping to strengthen the ability to adequately compare the 2 bridging modalities. Our assumption for the reasoning that more stage 5 CKD and dialysis patients were anticoagulated with UFH vs enoxaparin is a result of concern for an increased risk of bleeding with a medication that is renally cleared 30% less in CrCl < 30 mL/min.16 Although, enoxaparin 1 mg/kg/d is FDA approved as a therapeutic anticoagulant option, clinicians at MEDVAMC likely had reservations about its use in end-stage CKD patients. Unlike many studies, including the BRIDGE trial, patients with ACKD were not excluded from this trial, and the outcomes with enoxaparin are available for interpretation.
To no surprise, for patients included in this study, enoxaparin use led to shorter hospital LOS, reduced ICU LOS, and a quicker time-to-discharge from initiation. This is credited to the 100% bioavailability of SC enoxaparin in conjunction with its means to be a therapeutic option as an outpatient.16 Unlike IV UFH, patients requiring bridging can be discharged on SC injections of enoxaparin until a therapeutic INR is maintained with warfarin.The duration of hospital LOS in both arms were longer in this study compared with that of other studies.9 This may be due to clinicians being more cautious with renal insufficient patients, and the patients included in this study had multiple comorbidities. According to an economic analysis performed by Amorosi and colleagues in 2004, bridging with enoxaparin instead of UFH can save up to $3,733 per patient and reduce bridging costs by 63% to 85% driven primarily by decreased hospital LOS.10
Economic Outcome
In our study, we conducted a cost analysis using national VA data that indicated a $41,138 or 44% reduction in total cost per average inpatient stay when bridging 1 patient with enoxaparin vs UFH. The benefit of this cost analysis is that it reflects direct costs at VA institutions nationally; this will allow these data to be useful for practitioners at MEDVAMC and other VA hospitals. Stratifying the costs by treating specialty instead of treatment location minimized skewing of the data as there were some patients with long LOS in the ICU. No patients in the enoxaparin arm were treated in otolaryngology, which may have skewed the data. The data included direct costs for beds as well as costs for multiple services, such as procedures, pharmacy, nursing, laboratory tests, and imaging. Unlike the Amorosi study, our review did not include acquisition costs for enoxaparin syringes and bags of UFH or laboratory costs for aPTT and anti-factor Xa levels in part because of the data source and the difficulty calculating costs over a 10-year span.
Patients in the enoxaparin arm had a trend toward fewer occurrences of hospital-acquired infections than did those in the UFH arm, which we believe is due to a decreased LOS (in both total hospital and ICU days) and fewer blood draws needed for monitoring. This also may be attributed to a longer mean duration of surgery in the UFH arm (1.3 hours) vs enoxaparin (0.9 hours). The percentage of patients with procedures ≥ 45 minutes and the types of procedures between both arms were similar. However, these outcomes were not statistically significant. In addition, elderly males who are hospitalized may require a catheter (due to urinary retention), and catheter-associated urinary tract infection (CAUTI) is one of the highest reported infections in acute care hospitals in the US. This is in line with our patient population and may be a supplementary reason for the increase in infection incidence with UFH. Though, whether urinary catheters were used in these patients was not evaluated in this study.
Despite being at an increased risk of experiencing a major adverse cardiovascular event (MACE), no patients in either arm had a stroke/TIA or MI within 30 days postprocedure. The only occurrences documented were VTEs, which happened only in 4 patients on UFH. Four people died in this study, solely in the UFH arm. The incidence of thromboembolic complications and death along with major and minor bleeding cannot be deduced as meaningful as this study was underpowered for these outcomes. Despite anti-factor Xa monitoring being recommended in ACKD patients on enoxaparin, this monitoring was not routinely performed in this study. Another limitation was the inability to adequately assess the appropriateness of nurse-adjusted UFH infusion rates largely due to the retrospective nature of this study. The variability of aPTT percentage in therapeutic range and time-to-therapeutic range reported was indicative of the difficulties of monitoring for the safety and efficacy of UFH.
In 1991, Cruickshank and colleagues conducted a study in which a standard nomogram (similar to the MEDVAMC nomogram) for the adjustment of IV heparin was implemented at a single hospital.17 The success rate (aPTT percentage in therapeutic range) was 59.4% and average time-to-therapeutic range was about 1 day. The success rate (46.3%) and time-to-therapeutic range (2.4 days) in our study were lower and longer, respectively, than was expected. One potential reason for this discrepancy could be the differences in indication as the patients in Cruickshank and colleagues were being treated for VTE, whereas patients in our study had AF or atrial flutter. Also, there were inconsistencies in the availability of documentation of monitoring parameters for heparin due to the study time frame and retrospective design. Patients on UFH who are not within the therapeutic range in a timely manner are at greater risk of MACE and major/minor bleeding. Our study was not powered to detect these findings.
Strengths and Limitations
A significant limitation of this study was its small sample size; the study was not able to meet power for the primary outcome; it is unknown whether our study met power for nosocomial infections. The study also was not a powered review of other adverse events, such as thromboembolic complications, bleeding, and death. The study had an uneven number of patients, which made it more difficult to appropriately compare 2 patient populations; the study also did not include medians for patient characteristics and outcomes.
Due to this study’s time frame, the clinical pharmacy services at MEDVAMC were not as robust as they are now, which is the reason the decisions on which anticoagulant to use were primarily physician based. The use of TheraDoc to identify patients posed the risk of missing patients who may not have had the appropriate laboratory tests performed (ie, SCr). Patients on UFH had a reduced eGFR compared with that of enoxaparin, which may limit our extrapolation of enoxaparin’s use in end-stage renal disease. The reduced eGFR and higher number of dialysis patients in the UFH arm may have increased the occurrence of more labile INRs and bleeding outcomes. Patients on hemodialysis typically have more comorbidities and an increased risk of infection due to the frequent use of catheters and needles to access the bloodstream. In addition, the potential differences in catheter use and duration between groups were not identified. If these parameters were studied, the data collected may have helped better explain the reasoning for increased incidence of infection in the UFH arm.
Strengths of this study include a complex patient population with similar characteristics, distribution of ethnicities representative of the US population, patients at moderate-to-high thrombotic risk, the analysis of nosocomial infections, and the exclusion of patients with normal renal function or moderate CKD.
Conclusion
To our knowledge, this is the first study to compare periprocedural bridging outcomes and incidence of nosocomial infections in patients with AF and ACKD. This review provides new evidence that in this patient population, enoxaparin is a potential anticoagulant to reduce hospital LOS and hospital-acquired infections. Compared with UFH, bridging with enoxaparin reduced hospital LOS and anticoagulation time-to-discharge by 7 and 5 days, respectively, and decreased the incidence of nosocomial infections by 30%. Using the mean LOS per treating specialty for both arms, bridging 1 patient with AF with enoxaparin vs UFH can potentially lead to an estimated $40,000 (44%) reduction in total cost of hospitalization. Enoxaparin also had no numeric differences in mortality and adverse events (stroke/TIA, MI, VTE) vs that of UFH, but it is important to note that this study was not powered to find a significant difference in these outcomes. Due to the mean eGFR of patients on enoxaparin being 22.6 mL/min/1.73 m2 and only 1 in 5 having stage 5 CKD, at this time, we do not recommend enoxaparin for periprocedural use in stage 5 CKD or in patients on hemodialysis. Larger studies are needed, including randomized trials, in this patient population to further evaluate these outcomes and assess the use of enoxaparin in patients with ACKD.
The use of parenteral anticoagulation for perioperative bridging in patients with atrial fibrillation (AF) undergoing elective surgery has long been controversial.1 The decision to bridge depends on the patient's risk of thromboembolic complications and bleeding.1 The BRIDGE trial showed noninferiority in the rate of stroke and embolic events between low molecular weight heparin (LMWH) bridging and no perioperative bridging.2 However, according to the American College of Chest Physicians (CHEST) 2012 guidelines, patients in the BRIDGE trial would be deemed at low risk of thromboembolic events, as reflected by a mean CHADS2 (congestive heart failure [CHF], hypertension, age, diabetes mellitus [DM], and stroke/transient ischemic attack) score of 2.3. In addition, the BRIDGE study and many others excluded patients with advanced forms of chronic kidney disease (CKD).2,3
Like patients with AF, patients with advanced CKD (ACKD; stage 4 and 5 CKD) have an increased risk of stroke and venous thromboembolism (VTE).4,5 Perioperative anticoagulation bridging outcomes have not been adequately studied in patients with both AF and ACKD. Although unfractionated heparin (UFH) is preferred over LMWH in patients with ACKD, enoxaparin can be used in this population.1,6 Enoxaparin 1 mg/kg once daily is approved by the US Food and Drug Administration (FDA) for patients with severe renal insufficiency, defined as creatinine clearance (CrCl) < 30 mL/min. This dosage adjustment followed studies of enoxaparin 1 mg/kg twice daily that showed a significant increase in major and minor bleeding in patients with CrCl < 30 mL/min vs those with CrCl > 30 mL/min.7 Among patients with severe renal insufficiency in the ExTRACT-TIMI 25 trial of myocardial infarction (MI), enoxaparin 1 mg/kg once daily showed no significant difference in nonfatal major bleeding vs UFH.8 In patients without renal impairment (no documented kidney disease), bridging therapy with LMWH was more often completed within 24 hours of hospital stay than was bridging with UFH, with similar rates of VTE and major bleeding.9 In addition to allowing outpatient administration, enoxaparin has a more predictable pharmacokinetic profile, requiring less monitoring, and a lower incidence of heparin-induced thrombocytopenia (HIT) than UFH.6
The Michael E. DeBakey Veterans Affairs Medical Center (MEDVAMC) in Houston, Texas, is one of the largest US Department of Veterans Affairs (VA) hospitals, managing > 150,000 veterans in Southeast Texas and other southern states. Because MEDVAMC is a referral center for patients who travel for care, decreasing hospital length of stay (LOS) is crucial to free beds for incoming patients. Reducing LOS also reduces costs and may correlate with a lower incidence of nosocomial infections. Because of its significance to this facility, hospital LOS is an appropriate primary outcome for this study.
To our knowledge, bridging outcomes with LMWH vs UFH in patients with AF and ACKD have never been studied. We hypothesized that using enoxaparin instead of UFH for periprocedural management would decrease hospital LOS, thereby lowering economic burden and the incidence of nosocomial infections, with no significant differences in major and minor bleeding or thromboembolic complications.10
Methods
This study was a single-center, retrospective chart review of adult patients from January 2008 to September 2017. The review was conducted at MEDVAMC and was approved by the research and development committee and by the Baylor College of Medicine Institutional Review Board. Formal consent was not required.
Included patients were aged ≥ 18 years with diagnoses of AF or atrial flutter and ACKD, defined as an estimated glomerular filtration rate (eGFR) < 30 mL/min/1.73 m2 calculated with the Modification of Diet in Renal Disease Study (MDRD) equation.11 Patients must have previously been receiving warfarin and required temporary interruption of warfarin for an elective procedure. During the interruption of warfarin therapy, patients were required to receive periprocedural anticoagulation with subcutaneous (SC) enoxaparin 1 mg/kg daily or continuous IV UFH per the MEDVAMC heparin protocol. Patients were excluded if they had experienced major bleeding in the 6 weeks before the elective procedure, had current thrombocytopenia (platelet count < 100 × 109/L), or had a history of HIT or a heparin allergy.
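For illustration only, the screening criterion can be sketched as follows. This sketch assumes the 4-variable (IDMS-traceable) form of the MDRD equation; the article states only that the MDRD equation was used, and the function names and example values are ours, not drawn from the study data.

```python
def mdrd_egfr(scr_mg_dl: float, age_years: float, female: bool, black: bool) -> float:
    """Estimate GFR (mL/min/1.73 m2) with the 4-variable MDRD study equation.

    Assumes the IDMS-traceable coefficient (175); treat this as an
    illustrative sketch rather than the study's actual calculation.
    """
    egfr = 175.0 * (scr_mg_dl ** -1.154) * (age_years ** -0.203)
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr


def meets_ackd_inclusion(egfr: float) -> bool:
    """ACKD inclusion threshold used in this review: eGFR < 30 mL/min/1.73 m2."""
    return egfr < 30.0


# Hypothetical example: 71-year-old non-Black man with SCr 2.8 mg/dL
egfr = mdrd_egfr(scr_mg_dl=2.8, age_years=71, female=False, black=False)
print(round(egfr, 1), meets_ackd_inclusion(egfr))  # ~22.5 mL/min/1.73 m2, included
```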
This patient population was identified using TheraDoc Clinical Surveillance Software System (Charlotte, NC), which has prebuilt alert reviews for anticoagulation medications, including enoxaparin and heparin. An alert for patients on enoxaparin with serum creatinine (SCr) > 1.5 mg/dL was used to screen patients who met the inclusion criteria. A second alert identified patients on heparin. The VA Computerized Patient Record System (CPRS) was used to collect patient data.
Economic Analysis
An economic analysis was conducted using data from the VA Managerial Cost Accounting Reports. National average cost per bed day data were used so the findings could be extrapolated to other VA institutions.12 The national average cost per bed day was determined by dividing total cost by the number of bed days for the identified treating specialty during fiscal year 2018. Average cost per day data included costs for the bed day, surgery, radiology services, laboratory tests, pharmacy services, treatment location (eg, intensive care unit [ICU]), and all other costs associated with an inpatient stay. The cost analysis applied this average cost per bed day to the mean LOS of the enoxaparin and UFH arms within each treating specialty. The major outcome of the cost analysis was the total cost per average inpatient stay: the national average cost per bed day for each treating specialty was multiplied by that specialty's average LOS in this study, and these specialty-level costs were summed. Permission to use these data was granted by the Pharmacy and Critical Care Services at MEDVAMC.
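As a sketch of this cost roll-up, the calculation reduces to a weighted sum. The specialty names, per-bed-day costs, and all but one of the LOS values below are hypothetical placeholders, not figures from the VA Managerial Cost Accounting data.

```python
# Hypothetical national average cost per bed day by treating specialty (USD);
# the actual VA Managerial Cost Accounting values are not reproduced here.
cost_per_bed_day = {"thoracic": 5200.0, "vascular": 4800.0, "urology": 4100.0}

def total_cost_per_average_stay(mean_los_by_specialty):
    """Sum over specialties of (national average cost per bed day x mean LOS)."""
    return sum(cost_per_bed_day[spec] * los for spec, los in mean_los_by_specialty.items())

# Mean LOS by specialty for one arm; only the thoracic value (4.7 days, enoxaparin
# arm) comes from the article, the others are placeholders.
enoxaparin_mean_los = {"thoracic": 4.7, "vascular": 2.5, "urology": 3.0}
print(f"${total_cost_per_average_stay(enoxaparin_mean_los):,.0f}")
```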
Patient Demographics and Characteristics
Data were collected on patient demographics (Table 1). Nosocomial infections, stroke/transient ischemic attack (TIA), MI, VTE, major and minor bleeding, and death are defined in Table 2.
The primary outcome of the study was hospital LOS. With 90% power and α = .05, a study population of 114 patients (1:1 enrollment ratio) was required to detect a statistically significant difference in hospital LOS. This sample size was calculated using the mean hospital LOS (the primary objective) reported in the REGIMEN registry for LMWH (4.6 days) and UFH (10.3 days).9 To our knowledge, the incidence of nosocomial infections (a secondary outcome) has not been studied in this patient population; therefore, there was no basis for estimating the sample size needed to detect a difference in this outcome, and the goal was to include as many patients as possible to best assess this variable. Because a high exclusion rate was expected, 504 patients were reviewed to target a sample size of 120 patients. Given the single-center nature of this review, the secondary outcomes of thromboembolic complications and major and minor bleeding were expected to be underpowered.
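The sample-size arithmetic can be sketched as follows. Because the registry's standard deviation is not reported in this article, the pooled SD below is an assumption chosen only to illustrate the mechanics, not a value from the study.

```python
from statsmodels.stats.power import TTestIndPower

mean_lmwh, mean_ufh = 4.6, 10.3   # REGIMEN registry mean LOS, days
assumed_pooled_sd = 9.4            # assumption for illustration; not reported in the article
effect_size = abs(mean_ufh - mean_lmwh) / assumed_pooled_sd  # Cohen's d

# Required patients per arm for a 2-sided, 2-sample t test at 90% power, alpha = .05
n_per_arm = TTestIndPower().solve_power(
    effect_size=effect_size, alpha=0.05, power=0.90, ratio=1.0, alternative="two-sided"
)
print(round(n_per_arm), "per arm,", 2 * round(n_per_arm), "total")
```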
The final analysis compared the enoxaparin arm with the UFH arm. Univariate differences between the treatment groups were compared using the Fisher exact test for categorical variables. Demographic data and other continuous variables were analyzed with an unpaired t test to compare means between the 2 arms. Outcomes and characteristics were deemed statistically significant when the P value was < .05. All P values reported were 2-tailed with 95% CIs. No statistical analysis was performed for the cost differences (based on LOS per treating specialty) between the 2 treatment arms. Statistical analyses were completed using GraphPad Software (San Diego, CA).
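A minimal sketch of the two univariate comparisons named above, written in SciPy rather than GraphPad: the 2 × 2 counts reuse the nosocomial infection figures reported in the Results, while the LOS vectors are hypothetical placeholders.

```python
from scipy import stats

# Fisher exact test on a 2x2 table: infected vs not infected by arm
# (2 of 14 enoxaparin patients and 16 of 36 UFH patients, per the Results).
infection_table = [[2, 12],
                   [16, 20]]
odds_ratio, p_fisher = stats.fisher_exact(infection_table, alternative="two-sided")

# Unpaired (Student) t test on hypothetical per-patient LOS values.
los_enoxaparin = [8.0, 9.5, 11.0, 12.3, 10.1]
los_ufh = [15.2, 16.8, 18.1, 19.9, 17.4]
t_stat, p_ttest = stats.ttest_ind(los_enoxaparin, los_ufh)  # equal_var=True by default

print(f"Fisher P = {p_fisher:.3f}, t test P = {p_ttest:.4f}")
```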
Results
In total, 50 patients were analyzed in the study. Thirty-six patients were bridged with IV UFH at a concentration of 25,000 U/250 mL with an initial infusion rate of 12 U/kg/h. In the other arm, 14 patients were anticoagulated with renally dosed enoxaparin 1 mg/kg/d; the average daily dose was 89.3 mg, consistent with the group's mean actual body weight of 90.9 kg. Physicians on the primary team decided which parenteral anticoagulant to use. The difference in mean duration of inpatient parenteral anticoagulation between the groups was not statistically significant: 7.1 days with enoxaparin vs 9.6 days with UFH (P = .19). Patients in the enoxaparin arm were off warfarin therapy for an average of 6.0 days vs 7.5 days in the UFH group (P = .29). The duration of outpatient anticoagulation with enoxaparin was not analyzed in this study.
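For reference, the weight-based dosing arithmetic behind these figures is trivial; the 1 mg/kg/d rule is stated above, but the helper function below is ours, used only to show the correspondence.

```python
def renal_enoxaparin_daily_dose_mg(actual_body_weight_kg: float) -> float:
    """Renally adjusted therapeutic enoxaparin: 1 mg/kg once daily when CrCl < 30 mL/min."""
    return 1.0 * actual_body_weight_kg

# Mean actual body weight in the enoxaparin arm was 90.9 kg; the observed mean
# daily dose of 89.3 mg is close to the calculated weight-based dose.
print(renal_enoxaparin_daily_dose_mg(90.9))
```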
Patient and Procedure Characteristics
All patients had AF or atrial flutter, with 86% (n = 43) having a CHADS2 score > 2 and 48% (n = 29) having a CHA2DS2VASc score > 4. Overall, the mean age was 71.3 years, with a similar ethnicity distribution between arms. Patients had multiple comorbidities, as shown by a mean Charlson Comorbidity Index (CCI) of 7.7, and an increased risk of bleeding, as evidenced by 98% (n = 48) having a HAS-BLED score ≥ 3. A greater percentage of patients bridged with enoxaparin had DM, a history of stroke and MI, and a heart valve, whereas patients on UFH were more likely to be in stage 5 CKD (eGFR < 15 mL/min/1.73 m2), with a significantly lower mean eGFR (16.76 vs 22.64 mL/min/1.73 m2, P = .03). There were also more patients on hemodialysis in the UFH arm (50%) than in the enoxaparin arm (21%) and a lower mean CrCl with UFH (20.1 mL/min) than with enoxaparin (24.9 mL/min); however, the differences in hemodialysis use and mean CrCl were not statistically significant. No patients were on peritoneal dialysis.
Procedure Characteristics
The average Revised Cardiac Risk Index (RCRI) score was approximately 3, indicating class IV risk (11%) of a perioperative cardiac event (Table 3). Nineteen patients (38%) underwent major surgery, and all but 1 of the surgeries (major or minor) were invasive. The average length of surgery was 1.2 hours, and cardiothoracic procedures were the most common (38%). Two of 14 patients (14%) on enoxaparin were able to have surgery as an outpatient, whereas this did not occur in patients on UFH; the procedures completed for these 2 patients were a colostomy (minor surgery) and an arteriovenous graft repair (major surgery). There were no statistically significant differences in the types of procedures between the 2 arms.
Outcomes
The primary outcome, hospital LOS, differed significantly between the enoxaparin and UFH arms: 10.2 days vs 17.5 days (P = .04) (Table 4). Time-to-discharge from initiation of parenteral anticoagulation was also significantly shorter with enoxaparin (7.1 days) than with UFH (11.9 days) (P = .04). Although ICU LOS was also shorter in the enoxaparin arm, the difference was not statistically significant (1.1 days vs 4.0 days, P = .09).
About 36% (n = 18) of patients acquired an infection during hospitalization for elective surgery. The most common microorganism and site of infection were Enterococcus species and the urinary tract, respectively (Table 5). Nearly half (44%, n = 16) of patients in the UFH group had a nosocomial infection vs 14% (n = 2) of enoxaparin-bridged patients, a difference approaching significance (P = .056). Both patients in the enoxaparin group had the urinary tract as the primary source of infection; 1 of these patients had undergone a urologic procedure.
Major bleeding occurred in 7% (n = 1) of enoxaparin patients vs 22% (n = 8) in the UFH arm, a difference that was not statistically significant (P = .41). Minor bleeding was similar between the enoxaparin and UFH arms (14% vs 19%, P = .99). For thromboembolic complications, the enoxaparin group (0%) had a numerical reduction compared with UFH (11%), with VTE (n = 4) being the only component of the composite outcome that occurred (P = .57). There were 4 deaths within 30 days posthospitalization, all in the UFH group (P = .57). Because of the small sample size, the study was not powered to detect statistically significant differences in these bleeding and thrombotic outcomes.
Economic Analysis
The average cost differences in hospitalization between enoxaparin and UFH (Table 6) were calculated by multiplying the average LOS per treating specialty by the national average VA cost per inpatient bed day in 2018.12 The treating specialty with the longest average LOS in the enoxaparin arm was thoracic (4.7 days). The UFH arm also had a long average LOS in the thoracic specialty (6.4 days), although the vascular specialty (6.7 days) had the longest average LOS in that group. Based on the enoxaparin arm's mean LOS of 10.2 days, stratified by treating specialty, the total cost per average inpatient stay was $51,710; for the UFH arm, the total cost per average inpatient stay was $92,848.
Monitoring
Anti-factor Xa levels for LMWH monitoring were not analyzed because few values were collected; only 1 patient had an anti-factor Xa level checked during the study period. UFH infusion rates were adjusted based on activated partial thromboplastin time (aPTT) levels collected per the MEDVAMC inpatient anticoagulation protocol. The average percentage of aPTT values in the therapeutic range was 46.3%, and the mean (SD) time-to-therapeutic range was approximately 2.4 (1.3) days. Because of the study's retrospective nature, documentation of UFH infusion rates was inconsistent, so these values were not analyzed further.
Discussion
In 2017, the American College of Cardiology published the Periprocedural Anticoagulation Expert Consensus Pathway, which recommends that patients with AF at low risk of thromboembolism (CHA2DS2VASc score 1-4) not be bridged unless they have had a prior VTE or stroke/TIA.13 Nearly half the patients in this study were classified as moderate-to-high thrombotic risk, as evidenced by a CHA2DS2VASc score > 4, with a mean score of 4.8. Because this retrospective study spanned 2008 to 2017, many clinicians may have referenced the 2008 CHEST antithrombotic guidelines when deciding whether to bridge; these guidelines and the previous MEDVAMC anticoagulation protocol recommend bridging patients with AF and a CHADS2 score > 2 (moderate-to-high thrombotic risk), criteria that all but 1 patient in this study met.1,14 In contrast to the landmark BRIDGE trial, the mean CHADS2 score in this study was 3.6, indicating a patient population at increased risk of stroke and embolism.
In addition to thromboembolic risk, patients in the current study were at increased risk of clinically relevant bleeding, with a mean HAS-BLED score of 4.1 and nearly all patients having a score > 3. The complexity of the veteran population was also reflected by the study's mean CCI (7.7) and RCRI (3.0), indicating a 0% estimated 10-year survival and an approximately 11% risk of a perioperative cardiac event, respectively. A mean CCI of 7.7 is associated with a relative risk of death of 13.3 within 6 years postoperation.15 All patients had a diagnosis of hypertension, and in > 75% this was complicated by DM. This population had extensive cardiovascular disease or elevated cardiovascular risk, making it clinically representative of patients who would require periprocedural bridging.
Another strength of this study is that baseline characteristics, apart from renal function, were similar between arms, which strengthens the comparison of the 2 bridging modalities. We assume that more patients with stage 5 CKD and those on dialysis received UFH rather than enoxaparin because of concern for an increased risk of bleeding with a medication whose renal clearance is reduced by about 30% when CrCl is < 30 mL/min.16 Although enoxaparin 1 mg/kg/d is FDA approved as a therapeutic anticoagulation option, clinicians at MEDVAMC likely had reservations about its use in patients with end-stage CKD. Unlike many studies, including the BRIDGE trial, this review did not exclude patients with ACKD, so outcomes with enoxaparin in this population are available for interpretation.
Unsurprisingly, for patients in this study, enoxaparin use led to a shorter hospital LOS, a shorter ICU LOS, and a quicker time-to-discharge from initiation of bridging. This is attributable to the 100% bioavailability of SC enoxaparin and its suitability as an outpatient therapeutic option.16 Unlike patients on IV UFH, patients requiring bridging can be discharged on SC enoxaparin injections until a therapeutic international normalized ratio (INR) is maintained with warfarin. Hospital LOS in both arms was longer in this study than in other studies,9 which may reflect greater clinician caution with renally impaired patients and the multiple comorbidities of the included patients. According to an economic analysis by Amorosi and colleagues in 2004, bridging with enoxaparin instead of UFH can save up to $3,733 per patient and reduce bridging costs by 63% to 85%, driven primarily by decreased hospital LOS.10
Economic Outcome
We conducted a cost analysis using national VA data that indicated a $41,138 (44%) reduction in total cost per average inpatient stay when bridging 1 patient with enoxaparin rather than UFH. A benefit of this cost analysis is that it reflects direct costs at VA institutions nationally, making these data useful for practitioners at MEDVAMC and other VA hospitals. Stratifying costs by treating specialty rather than treatment location minimized skewing of the data, as some patients had long ICU stays. No patients in the enoxaparin arm were treated in otolaryngology, which may have skewed the data. The data included direct costs for beds as well as costs for multiple services, such as procedures, pharmacy, nursing, laboratory tests, and imaging. Unlike the Amorosi study, our review did not include acquisition costs for enoxaparin syringes and UFH bags or laboratory costs for aPTT and anti-factor Xa levels, in part because of the data source and the difficulty of calculating costs over a 10-year span.
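The headline savings figure follows directly from the two totals reported in the Results, as this short arithmetic check shows.

```python
enoxaparin_total, ufh_total = 51_710, 92_848  # total cost per average inpatient stay (Table 6)
absolute_saving = ufh_total - enoxaparin_total  # $41,138
relative_saving = absolute_saving / ufh_total   # ~0.44
print(f"${absolute_saving:,} ({relative_saving:.0%})")
```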
Patients in the enoxaparin arm had a trend toward fewer hospital-acquired infections than those in the UFH arm, which we believe reflects their shorter LOS (both total hospital and ICU days) and the fewer blood draws needed for monitoring. It may also be attributable to a longer mean duration of surgery in the UFH arm (1.3 hours) vs the enoxaparin arm (0.9 hours); the percentage of patients with procedures ≥ 45 minutes and the types of procedures were otherwise similar between arms. However, these differences were not statistically significant. In addition, hospitalized elderly males may require a urinary catheter (due to urinary retention), and catheter-associated urinary tract infection (CAUTI) is one of the most frequently reported infections in US acute care hospitals. This is consistent with our patient population and may be an additional reason for the higher incidence of infection with UFH, although urinary catheter use was not evaluated in this study.
Despite an increased risk of major adverse cardiovascular events (MACE), no patients in either arm had a stroke/TIA or MI within 30 days postprocedure. The only documented thromboembolic events were VTEs, which occurred in 4 patients on UFH. Four patients died, all in the UFH arm. The incidences of thromboembolic complications, death, and major and minor bleeding cannot be interpreted as meaningful because the study was underpowered for these outcomes. Although anti-factor Xa monitoring is recommended for patients with ACKD on enoxaparin, it was not routinely performed in this study. Another limitation was the inability to adequately assess the appropriateness of nurse-adjusted UFH infusion rates, largely due to the study's retrospective nature. The variability in the percentage of aPTT values in the therapeutic range and in time-to-therapeutic range reflects the difficulty of monitoring UFH for safety and efficacy.
In 1991, Cruickshank and colleagues conducted a study in which a standard nomogram (similar to the MEDVAMC nomogram) for adjusting IV heparin was implemented at a single hospital.17 The success rate (percentage of aPTT values in the therapeutic range) was 59.4%, and the average time-to-therapeutic range was about 1 day. The success rate (46.3%) and time-to-therapeutic range (2.4 days) in our study were lower and longer, respectively, than expected. One potential reason for this discrepancy is the difference in indication: patients in the Cruickshank study were treated for VTE, whereas patients in our study had AF or atrial flutter. In addition, documentation of heparin monitoring parameters was inconsistent because of the study time frame and retrospective design. Patients on UFH who do not reach the therapeutic range in a timely manner are at greater risk of MACE and major or minor bleeding, although our study was not powered to detect these outcomes.
Strengths and Limitations
A significant limitation of this study was its small sample size: the study did not meet the calculated power for the primary outcome, and it is unknown whether it was adequately powered for nosocomial infections. The study also was not powered for other adverse events, such as thromboembolic complications, bleeding, and death. The arms had unequal numbers of patients, which made comparing the 2 populations more difficult, and medians were not reported for patient characteristics or outcomes.
Because of the study's time frame, clinical pharmacy services at MEDVAMC were not as robust as they are now, which is why decisions about which anticoagulant to use were primarily physician based. Using TheraDoc to identify patients posed the risk of missing patients who did not have the appropriate laboratory tests performed (ie, SCr). Patients on UFH had a lower eGFR than those on enoxaparin, which may limit extrapolation of enoxaparin's use to end-stage renal disease. The lower eGFR and higher number of dialysis patients in the UFH arm may have contributed to more labile INRs and more bleeding events. Patients on hemodialysis typically have more comorbidities and an increased risk of infection because of the frequent use of catheters and needles to access the bloodstream. In addition, potential differences in catheter use and duration between groups were not identified; had these parameters been studied, the data might have better explained the increased incidence of infection in the UFH arm.
Strengths of this study include a complex patient population with similar baseline characteristics between arms, an ethnicity distribution representative of the US population, patients at moderate-to-high thrombotic risk, the analysis of nosocomial infections, and the exclusion of patients with normal renal function or only moderate CKD.
Conclusion
To our knowledge, this is the first study to compare periprocedural bridging outcomes and the incidence of nosocomial infections in patients with AF and ACKD. This review provides new evidence that, in this population, enoxaparin is a potential anticoagulant for reducing hospital LOS and hospital-acquired infections. Compared with UFH, bridging with enoxaparin reduced hospital LOS and anticoagulation time-to-discharge by about 7 and 5 days, respectively, and decreased the incidence of nosocomial infections by 30 percentage points. Using the mean LOS per treating specialty for both arms, bridging 1 patient with AF with enoxaparin rather than UFH could lead to an estimated $40,000 (44%) reduction in total cost of hospitalization. Enoxaparin also showed no numeric increase in mortality or adverse events (stroke/TIA, MI, VTE) vs UFH, although the study was not powered to detect significant differences in these outcomes. Because the mean eGFR of patients on enoxaparin was 22.6 mL/min/1.73 m2 and only about 1 in 5 had stage 5 CKD, we do not currently recommend enoxaparin for periprocedural use in stage 5 CKD or in patients on hemodialysis. Larger studies, including randomized trials, are needed in this patient population to further evaluate these outcomes and to assess the use of enoxaparin in patients with ACKD.
1. Douketis JD, Spyropoulos AC, Spencer FA, et al. Perioperative management of antithrombotic therapy: Antithrombotic Therapy and Prevention of Thrombosis, 9th ed: American College of Chest Physicians Evidence-Based Clinical Practice Guidelines. Chest. 2012;141(2)(suppl):e326S-350S.
2. Douketis JD, Spyropoulos AC, Kaatz S, et al; BRIDGE Investigators. Perioperative bridging anticoagulation in patients with atrial fibrillation. N Engl J Med. 2015;373(9):823-833.
3. Hammerstingl C, Schmitz A, Fimmers R, Omran H. Bridging of chronic oral anticoagulation with enoxaparin in patients with atrial fibrillation: results from the prospective BRAVE registry. Cardiovasc Ther. 2009;27(4):230-238.
4. Dad T, Weiner DE. Stroke and chronic kidney disease: epidemiology, pathogenesis, and management across kidney disease stages. Semin Nephrol. 2015;35(4):311-322.
5. Wattanakit K, Cushman M. Chronic kidney disease and venous thromboembolism: epidemiology and mechanisms. Curr Opin Pulm Med. 2009;15(5):408-412.
6. Saltiel M. Dosing low molecular weight heparins in kidney disease. J Pharm Pract. 2010;23(3):205-209.
7. Spinler SA, Inverso SM, Cohen M, Goodman SG, Stringer KA, Antman EM; ESSENCE and TIMI 11B Investigators. Safety and efficacy of unfractionated heparin versus enoxaparin in patients who are obese and patients with severe renal impairment: analysis from the ESSENCE and TIMI 11B studies. Am Heart J. 2003;146(1):33-41.
8. Fox KA, Antman EM, Montalescot G, et al. The impact of renal dysfunction on outcomes in the ExTRACT-TIMI 25 trial. J Am Coll Cardiol. 2007;49(23):2249-2255.
9. Spyropoulos AC, Turpie AG, Dunn AS, et al; REGIMEN Investigators. Clinical outcomes with unfractionated heparin or low-molecular-weight heparin as bridging therapy in patients on long-term oral anticoagulants: the REGIMEN registry. J Thromb Haemost. 2006;4(6):1246-1252.
10. Amorosi SL, Tsilimingras K, Thompson D, Fanikos J, Weinstein MC, Goldhaber SZ. Cost analysis of “bridging therapy” with low-molecular-weight heparin versus unfractionated heparin during temporary interruption of chronic anticoagulation. Am J Cardiol. 2004;93(4):509-511.
11. Inker LA, Astor BC, Fox CH, et al. KDOQI US commentary on the 2012 KDIGO clinical practice guideline for the evaluation and management of CKD. Am J Kidney Dis. 2014;63(5):713-735.
12. US Department of Veterans Affairs. Managerial Cost Accounting Financial User Support Reports: fiscal year 2018. https://www.herc.research.va.gov/include/page.asp?id=managerial-cost-accounting
13. Doherty JU, Gluckman TJ, Hucker WJ, et al. 2017 ACC Expert Consensus Decision Pathway for Periprocedural Management of Anticoagulation in Patients With Nonvalvular Atrial Fibrillation: a report of the American College of Cardiology Clinical Expert Consensus Document Task Force. J Am Coll Cardiol. 2017;69(7):871-898.
14. Kearon C, Kahn SR, Agnelli G, et al. Antithrombotic therapy for venous thromboembolic disease: American College of Chest Physicians Evidence-Based Clinical Practice Guidelines (8th Edition). Chest. 2008;133(6 suppl):454S-545S.
15. Charlson M, Szatrowski TP, Peterson J, Gold J. Validation of a combined comorbidity index. J Clin Epidemiol. 1994;47(11):1245-1251.
16. Lovenox [package insert]. Bridgewater, NJ: Sanofi-Aventis; December 2017.
17. Cruickshank MK, Levine MN, Hirsh J, Roberts R, Siguenza M. A standard heparin nomogram for the management of heparin therapy. Arch Intern Med. 1991;151(2):333-337.
18. Steinberg BA, Peterson ED, Kim S, et al; Outcomes Registry for Better Informed Treatment of Atrial Fibrillation Investigators and Patients. Use and outcomes associated with bridging during anticoagulation interruptions in patients with atrial fibrillation: findings from the Outcomes Registry for Better Informed Treatment of Atrial Fibrillation (ORBIT-AF). Circulation. 2015;131(5):488-494.
19. Verheugt FW, Steinhubl SR, Hamon M, et al. Incidence, prognostic impact, and influence of antithrombotic therapy on access and nonaccess site bleeding in percutaneous coronary intervention. JACC Cardiovasc Interv. 2011;4(2):191-197.
20. Bijsterveld NR, Peters RJ, Murphy SA, Bernink PJ, Tijssen JG, Cohen M. Recurrent cardiac ischemic events early after discontinuation of short-term heparin treatment in acute coronary syndromes: results from the Thrombolysis in Myocardial Infarction (TIMI) 11B and Efficacy and Safety of Subcutaneous Enoxaparin in Non-Q-Wave Coronary Events (ESSENCE) studies. J Am Coll Cardiol. 2003;42(12):2083-2089.
Fluoroscopically Guided Lateral Approach Hip Injection
Hip injections are performed as diagnostic and therapeutic interventions across a variety of medical subspecialties, including physical medicine and rehabilitation, pain medicine, sports medicine, orthopedic surgery, and radiology. Traditional image-guided intra-articular hip injection commonly uses an anterior-oblique approach: from a starting point on the anterior groin, the needle traverses the soft tissue anterior to the femoral neck to the target at the femoral head-neck junction.
In fluoroscopic procedures, a coaxial technique is used for safe and precise needle insertion: the X-ray beam is angled in line with the projected path of the needle from the skin entry point to the injection target. With the coaxial, en face technique (also called EF, parallel, hub view, down the barrel, or barrel view), the needle appears as a single radiopaque dot over the target injection site.1 This technique minimizes needle redirection to correct the injection path and minimizes disturbance of the surrounding tissue on the approach to the intended target.
A noncoaxial technique, as used in the anterior-oblique approach, intentionally directs the needle away from the skin entry point, with the needle barrel traversing the X-ray beam toward the injection target. The use of a noncoaxial technique is itself a clinical challenge of the anterior-oblique approach. Additional challenges of the anterior-oblique (also referred to as anterior) approach include body habitus and pannus, proximity to neurovascular structures, and patient positioning. By understanding the risks and benefits of the various technical approaches to a given clinical goal, trainees are better able to select the technique most appropriate for a varied patient population.
Common risks of all intra-articular interventions include bleeding, infection, and pain. The risk of damage to nearby structures is often discussed as part of a standard informed consent process, as the femoral vein, artery, and nerve lie in close anatomical proximity to the target injection site. Prior studies examining the risk of complications from intra-articular hip injections commonly conclude that, despite a relatively low-risk profile for skilled interventionalists, avoiding needle placement in the medial 50% of the femoral head on anteroposterior imaging is recommended.2
The anterior technique is a commonly described approach and can be used for both ultrasound-guided and fluoroscopically guided hip injections.3 With ultrasound guidance, the anterior technique can be performed with in-plane, direct visualization of the needle throughout the procedure. With fluoroscopic guidance, the anterior approach is performed out-of-plane, using a noncoaxial technique, requiring the interventionalist to rely on tactile and anatomic guidance to reach the target injection site. The anterior approach for hip injection is one of the few interventions in which a coaxial technique is not used, making instruction less concrete and potentially more challenging for learners because the needle path is not directly visualized in plane with the X-ray beam.
Technical guidance and detailed instruction for the lateral approach are infrequently described in fluoroscopic interventional texts. A lateral approach to hip injection was referenced as early as the 1970s, noting the advantage of visualizing the hip joint for needle placement when hardware is in place, but without detail on the technique.4 A more recent article described a lateral approach technique with the patient in a lateral decubitus position, which presents limitations in consistent fluoroscopic imaging and can be a challenging static position for the patient to maintain.5
This retrospective review of anterior-oblique and lateral approach procedures aims to demonstrate that there is no significant difference between the approaches in radiation exposure, rate of successful intra-articular injection, or complication rate. If shown to be noninferior, the lateral approach may be a valuable interventional skill for those performing hip injections, giving providers the option to access the joint with either technique. The approach could also be added to the instructional plan of practitioners who provide technical instruction to trainees within their health care system.
Methods
The institutional review board at the VA Ann Arbor Healthcare System reviewed and approved this study. Fluoroscopically guided hip injections were performed by 1 of 5 interventional pain physician staff members at the facility. For the study cases, interventional pain fellows performed the procedures under the direct supervision of board-certified physicians, who included both physiatrists and anesthesiologists. Images were reviewed and evaluated without corresponding patient biographic data.
For the lateral approach, the patient is positioned supine on the fluoroscopy table. In anteroposterior and lateral views, trajectory lines are drawn using a long metal marking rod held adjacent to the patient. With pulsed low-dose fluoroscopy, transverse lines are drawn to identify the midpoint of the femoral head in the lateral view (Figure 1A, x-axis) and the most direct line from skin to the lateral femoral head-neck junction target (Figure 1B, z-axis). Also in the lateral view, the z-axis line drawn on the skin is used to confirm that this transverse plane crosses the overlapping femoral heads (Figure 1A, y-axis).
The intersection of these transverse and coronal plane lines identifies the starting point for the most direct path from skin to the injection target at the femoral head-neck junction. Using the coaxial technique in the lateral view, the needle is introduced and advanced under intermittent fluoroscopy to the lateral joint target. Continuing in this view, the interventionalist can ensure that advancing the needle to its osseous endpoint will place the tip at the midpoint of the femoral head at the target on the lateral surface, avoiding inadvertent advancement of the needle anterior or posterior to the femoral head. Final needle placement is then confirmed in the anteroposterior view (Figure 2A), and contrast enhancement is used to confirm intra-articular spread (Figure 2B).
Cases included in the study were performed over an 8-month period in 2017. Case images recorded in IntelliSpace PACS Radiology software (Andover, MA) were included by creating a list of all cases performed and documented using the major joint injection procedure code. The cases reviewed began with the most recent cases. Two research team members (1 radiologist and 1 interventional pain physician) reviewed the series of saved images for each patient and the associated procedure report. The research team members documented and recorded de-identified study data in Microsoft Excel (Redmond, WA).
Imaging reports, using the saved images and the associated procedure report, were classified for technical approach (anterior, lateral, or inconclusive), success of joint injection as evidenced by appropriate contrast enhancement within the joint space (successful, unsuccessful, or incomplete images), documented use of sedation (yes, no), patient positioning (supine, prone), radiation exposure dose, radiation exposure time, and additional comments, such as “notable pannus” or “hardware present” to annotate significant findings on imaging review.
Statistical Analysis
The distribution of 2 outcomes used to compare rates of complication, radiation dose, and exposure time was checked using the Shapiro-Wilk test. Power analysis determined that inclusion of 30 anterior and 30 lateral cases results in adequate power to detect a 1-point mean difference, assuming a standard deviation of 1.5 in each group. Both radiation dose and exposure time were found to be nonnormally distributed (W = 0.65, P < .001; W = 0.86, P < .001; respectively). Median and interquartile range (IQR) of dose and time in seconds for anterior and lateral approaches were computed. Median differences in radiation dose and exposure time between anterior and lateral approaches were assessed with the k-sample test of equality of medians. All analyses were conducted using Stata Version 14.1 (College Station, TX).
Results
Between June 2017 and January 2018, 88 cases were reviewed as performed, with 30 anterior and 30 lateral approach cases included in this retrospective comparison study. A total of 28 cases were excluded from the study for using an inconclusive approach, multiple or bilateral procedures, cases without recorded dose and time data, and inadequately saved images to provide meaningful data (Figure 3).
Rate of successful intervention with needle placement confirmed within the articular space on contrast enhancement was not significantly different in the study groups with 96.7% (29 of 30) anterior approach cases reported as successful, 100% (30 of 30) lateral approach cases reported as successful. Overhanging pannus in the viewing area was reported in 5 anterior approach cases and 4 lateral cases. Hardware was noted in 2 lateral approach cases, none in anterior approach cases. Sedation was used for 3 of the anterior approach cases and none of the lateral approach cases.
Patients undergoing the lateral approach received a higher median radiation dose than did those undergoing the anterior approach, but this was not statistically significant (P = .07) (Table). Those undergoing the lateral approach also had a longer median exposure time than did those undergoing the anterior approach, but this also was not statistically significant (P = .3). With no immediate complications reported in any of the studied interventions, there was no difference in complication rates between anterior and lateral approach cases.
Discussion
Pain medicine fellows who have previously completed residency in a variety of disciplines, often either anesthesiology or physical medicine and rehabilitation, perform fluoroscopically guided procedures and benefit from increased experience with coaxial technique as this improves needle depth and location awareness. Once mastered, this skill set can be applied to and useful for multiple interventional pain procedures. Similar technical instruction with an emphasis on coaxial technique for hip injections as performed in the anterior or anterolateral approach can be used in both fluoroscopic and ultrasound-guided procedures, including facet injection, transforaminal epidural steroid injection, and myriad other procedures performed to ameliorate pain. There are advantages to pursuing a similar approach with all image-guided procedures. Evaluated in this comparison study is an alternative technique that has potential for risk reduction benefit with reduced proximity to neurovascular structures, which ultimately leads to a safer procedure profile.
Using a lateral approach, the interventionalist determines a starting point, entering the skin at a greater distance from any overlying pannus and the elevated concentration of gram-negative and gram-positive bacteria contained within the inguinal skin.6 A previous study demonstrated improved success of intra-articular needle tip placement without image guidance in patients with body mass index (BMI) < 30.7 A prior study of anterior approach using anatomic landmarks as compared to lateral approach demonstrated the anterior approach pierced or contacted the femoral nerve in 27% of anterior cases and came within 5 mm of 60% of anterior cases.2 Use of image guidance, whether ultrasound, fluoroscopy, or computed tomography (CT) is preferred related to reduced risk of contact with adjacent neurovascular structures. Anatomic surface landmarks have been described as an alternative injection technique, without the use of fluoroscopy for confirmatory initial, intraprocedure, and final placement.8 Palpation of anatomic structures is required for this nonimage-guided technique, and although similar to the described technique in this study, the anatomically guided injection starting point is more lateral than the anterior approach but not in the most lateral position in the transverse plane that is used for this fluoroscopically guided lateral approach study.
Physiologic characteristics of subjects and technical aspects of fluoroscopy both can be factors in radiation dose and exposure times for hip injections. Patient BMI was not included in the data collection, but further study would seek to determine whether BMI is a significant risk for any increased radiation dose and exposure times using lateral approach injections. Use of lateral images for fluoroscopy requires penetration of X-ray beam through more tissue compared with that of anterior-posterior images. Further study of these techniques would benefit from comparing the pulse rate of fluoroscopic images and collimation (or focusing of the radiation beam over a smaller area of tissue) as factors in any observed increase in total radiation dose and exposure times.
Improving the safety profile of this procedure could have a positive impact on the patient population receiving fluoroscopic hip injections, both within the VA Ann Arbor Health System and elsewhere. While the study population was limited to the VA patient population seeking subspecialty nonsurgical joint care at a single tertiary care center, this technique is generalizable and can be used in most patients, as hip pain is a common condition necessitating nonoperative evaluation and treatment.
Radiation Exposures
As our analysis demonstrates, mean radiation dose exposure for each group was consistent with low (≤ 3 mSv) to moderate (> 3-20 mSv) annual effective doses in the general population.7 Both anterior and lateral median radiation dose of 1 mGy and 3 mGy, respectively, are within the standard exposure for radiographs of the pelvis (1.31 mGy).9 It is therefore reasonable to consider a lateral approach for hip injection, given the benefits of direct coaxial approach and avoiding needle entry through higher bacteria-concentrated skin.
The lateral approach did have increased radiation dose and exposure time, although it was not statistically significantly greater than the anterior approach. The difference between radiation dose and time to perform either technique was not clinically significant. One potential explanation for this is that the lateral technique has increased tissue to penetrate, which can be reduced with collimation and other fluoroscopic image adjustments. Additionally, as trainees progress in competency, fewer images should need to be obtained.7 We hypothesize that as familiarity and comfort with this technique increase, the number of images necessary for successful injection would decrease, leading to decreased radiation dose and exposure time. We would expect that in the hands of a board-certified interventionalist, radiation dose and exposure time would be significantly decreased as compared to our current dataset, and this is an area of planned further study. With our existing dataset, the majority of procedures were performed with trainees, with inadequate information documented for comparison of dose over time and procedural experience under individual physicians.
Notable strengths of this study are the direct comparison of the anterior approach when compared to the lateral approach with regard to radiation dose and exposure time, which we have not seen described in the literature. A detailed description of the technique may result in increased utilization by other providers. Data were collected from multiple providers, as board-certified pain physicians and board-eligible interventional pain fellows performed the procedures. This variability in providers increases the generalizability of the findings, with a variety of providers, disciplines, years of experiences, and type of training represented.
Limitations
Limitations include the retrospective nature of the study and the relatively small sample size. However, even with this limitation, it is notable that no statistically significant differences were observed in mean radiation dose or fluoroscopy exposure time, making the lateral approach, at minimum, a noninferior technique. Combined with the improved safety profile, this technique is a viable alternative to the traditional anterior-oblique approach. Further study should be performed, such as a prospective, randomized control trial investigating the 2 techniques and following pain scores and functional ability after the procedure.
Conclusion
Given the decreased procedural risk related to proximity of neurovascular structures and coaxial technique for needle advancement, lateral approach for hip injection should be considered by those in any discipline performing fluoroscopically guided procedures. Lateral technique may be particularly useful in technically challenging cases and when skin entry at the anterior groin is suboptimal, as a noninferior alternative to traditional anterior method.
1. Cianfoni A, Boulter DJ, Rumboldt Z, Sapton T, Bonaldi G. Guidelines to imaging landmarks for interventional spine procedures: fluoroscopy and CT anatomy. Neurographics. 2011;1(1):39-48.
2. Leopold SS, Battista V, Oliverio JA. Safety and efficacy of intraarticular hip injection using anatomic landmarks. Clin Orthop Relat Res. 2001;(391):192-197.
3. Dodré E, Lefebvre G, Cockenpot E, Chastanet P, Cotten A. Interventional MSK procedures: the hip. Br J Radiol. 2016;89(1057):20150408.
4. Hankey S, McCall IW, Park WM, O’Connor BT. Technical problems in arthrography of the painful hip arthroplasty. Clin Radiol. 1979;30(6):653-656.
5. Yasar E, Singh JR, Hill J, Akuthota V. Image-guided injections of the hip. J Nov Physiother Phys Rehabil. 2014;1(2):39-48.
6. Aly R, Maibach HI. Aerobic microbial flora of intertrigenous skin. Appl Environ Microbiol. 1977;33(1):97-100.
7. Fazel R, Krumholz HM, Wang W, et al. Exposure to low-dose ionizing radiation from medical imaging procedures. N Engl J Med. 2009;361(9):849-857.
8. Masoud MA, Said HG. Intra-articular hip injection using anatomic surface landmarks. Arthosc Tech. 2013;2(2):e147-e149.
9. Ofori K, Gordon SW, Akrobortu E, Ampene AA, Darko EO. Estimation of adult patient doses for selected x-ray diagnostic examinations. J Radiat Res Appl Sci. 2014;7(4):459-462.
Hip injections are performed as diagnostic and therapeutic interventions across a variety of medical subspecialties, including but not limited to physical medicine and rehabilitation, pain medicine, sports medicine, orthopedic surgery, and radiology. Traditional image-guided intra-articular hip injection commonly uses an anterior-oblique approach: from a starting point on the anterior groin, the needle traverses soft tissue anterior to the femoral neck to its target at the femoral head-neck junction.
In fluoroscopic procedures, a coaxial technique is used for safe and precise needle placement. The X-ray beam is angled in line with the projected path of the needle from skin entry point to injection target. With the coaxial, en face technique (also called EF, parallel, hub view, down the barrel, or barrel view), the needle appears as a single radiopaque dot over the target injection site.1 This technique minimizes needle redirection to correct the injection path and minimizes disturbance of surrounding tissue on the approach to the intended target.
A noncoaxial technique, as used in the anterior-oblique approach, intentionally directs the needle from the skin entry point across the X-ray beam toward the injection target. The need for a noncoaxial technique is itself a clinical challenge of the anterior-oblique (also referred to as anterior) approach; additional challenges include body habitus and pannus, proximity to neurovascular structures, and patient positioning. By understanding the risks and benefits of the varied technical approaches available to accomplish a clinical goal, trainees are better able to select the technique most appropriate for a varied patient population.
Common risks to patients for all intra-articular interventions include bleeding, infection, and pain. Risk of damage to the femoral vein, artery, and nerve, which lie in close anatomical proximity to the target injection site, is typically discussed as part of a standard informed consent process. When prior studies have examined the risk of complications resulting from intra-articular hip injections, a common conclusion is that, despite a relatively low-risk profile for skilled interventionalists, avoiding needle placement over the medial 50% of the femoral head on antero-posterior imaging is recommended.2
The anterior technique is a commonly described approach and can be used for both ultrasound-guided and fluoroscopically guided hip injections.3 Using ultrasound guidance, the anterior technique can be performed with in-plane direct visualization of the needle throughout the procedure. With fluoroscopic guidance, the anterior approach is performed out-of-plane, using a noncoaxial technique, which requires the interventionalist to rely on tactile and anatomic guidance to reach the target injection site. The anterior approach for hip injection is one of the few interventions in which a coaxial technique is not used, making instruction for a learner less concrete and potentially more challenging because the needle path is not visualized in plane with the X-ray beam.
Technical guidance and detailed instruction for the lateral approach are infrequently described in fluoroscopic interventional texts. The lateral approach to hip injection was referenced as early as the 1970s, noting its advantage for visualizing the hip joint during needle placement when hardware is in place, but without detail on the technique itself.4 A more recent article described a lateral approach performed with the patient in the lateral decubitus position, which presents limitations for consistent fluoroscopic imaging and can be a challenging static position for the patient to maintain.5
This retrospective review of anterior-oblique and lateral approach procedures aims to demonstrate that there is no significant difference in radiation exposure, rate of successful intra-articular injection, or complication rate between the techniques. If shown to be noninferior, the lateral approach may be a valuable interventional skill for those performing hip injections: it gives the provider the option of accessing the joint by either route, and it can be added to the instructional plan of practitioners providing technical instruction to trainees within their health care system.
Methods
The institutional review board at the VA Ann Arbor Healthcare System reviewed and approved this study. Fluoroscopically guided hip injections at the VA Ann Arbor Healthcare System are performed by 1 of 5 interventional pain physician staff members. For the study cases, interventional pain fellows performed the procedures under the direct supervision of these board-certified physicians; supervising physicians included both physiatrists and anesthesiologists. Images were reviewed and evaluated without corresponding patient biographic data.
For cases using the lateral approach, patients were positioned supine on the fluoroscopy table. In anterior-posterior and lateral views, trajectory lines are drawn on the skin using a long metal marking rod held adjacent to the patient. With pulsed low-dose fluoroscopy, lines are drawn to identify the midpoint of the femoral head in the lateral view (Figure 1A, x-axis) and the most direct path from skin to the lateral femoral head-neck junction target (Figure 1B, z-axis). The z-axis line marked on the skin is then checked in the lateral view to confirm that this transverse plane crosses the overlapping femoral heads (Figure 1A, y-axis).
The intersection of these transverse and coronal plane lines identifies the starting point for the most direct approach from skin to the injection target at the femoral head-neck junction. Using the coaxial technique in the lateral view, the needle is introduced and advanced to the lateral joint target with intermittent fluoroscopic images. Continuing in this view, the interventionalist can ensure that advancing the needle to its osseous endpoint will place the tip at the midpoint of the femoral head on the lateral surface, avoiding inadvertent advancement anterior or posterior to the femoral head. Final needle placement is then confirmed in the antero-posterior view (Figure 2A), and contrast enhancement is used to confirm intra-articular spread (Figure 2B).
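As a purely illustrative abstraction of the landmark geometry described above (not part of the clinical protocol), the short Python sketch below treats the two marked skin lines as a transverse plane and a coronal plane and computes the skin entry point and straight-line coaxial path to a hypothetical target; every coordinate is assumed for illustration only and does not come from the study.

import numpy as np

# Hypothetical coordinates in centimeters; axes loosely follow the figure labels
# (x: anterior-posterior, y: medial-lateral, z: cranial-caudal). Values are invented.
target = np.array([0.0, 4.0, 0.0])   # assumed femoral head-neck junction target
transverse_plane_z = 0.0             # level of the femoral head midpoint (lateral view)
coronal_plane_x = 0.0                # plane of the most direct skin-to-target line
lateral_skin_y = 12.0                # assumed lateral skin surface

# The two marked lines cross on the lateral skin surface at the needle entry point.
skin_entry = np.array([coronal_plane_x, lateral_skin_y, transverse_plane_z])

# Coaxial technique: the X-ray beam is aligned with the straight needle path.
path = target - skin_entry
depth_cm = np.linalg.norm(path)
beam_direction = path / depth_cm
print(f"Needle depth ~{depth_cm:.1f} cm along beam direction {beam_direction}")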
Cases included in the study were performed over an 8-month period in 2017. Cases were identified by generating a list of all procedures documented under the major joint injection procedure code and retrieving the corresponding images recorded in IntelliSpace PACS Radiology software (Andover, MA). Review proceeded from the most recent cases backward. Two research team members (1 radiologist and 1 interventional pain physician) reviewed the series of saved images and the associated procedure report for each patient and recorded de-identified study data in Microsoft Excel (Redmond, WA).
Using the saved images and the associated procedure report, each case was classified by technical approach (anterior, lateral, or inconclusive); success of joint injection, as evidenced by appropriate contrast enhancement within the joint space (successful, unsuccessful, or incomplete images); documented use of sedation (yes or no); patient positioning (supine or prone); radiation exposure dose; and radiation exposure time. Additional comments, such as “notable pannus” or “hardware present,” annotated significant findings on imaging review.
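A minimal sketch of this data capture, assuming hypothetical field names and an example record (the study recorded data in a spreadsheet rather than in code), might look like the following.

from dataclasses import dataclass, fields, astuple
from typing import Optional
import csv

@dataclass
class CaseReview:
    # De-identified fields abstracted from each saved image series and procedure report.
    approach: str                     # "anterior", "lateral", or "inconclusive"
    injection_success: str            # "successful", "unsuccessful", or "incomplete images"
    sedation: bool                    # documented use of sedation
    positioning: str                  # "supine" or "prone"
    dose_mGy: Optional[float]         # radiation exposure dose
    exposure_time_s: Optional[float]  # radiation exposure time
    comments: str = ""                # eg, "notable pannus", "hardware present"

# Hypothetical example record, written out as a CSV row.
example = CaseReview("lateral", "successful", False, "supine", 3.0, 25.0, "hardware present")
with open("case_reviews.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow([fld.name for fld in fields(CaseReview)])
    writer.writerow(astuple(example))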
Statistical Analysis
The distributions of 2 of the outcomes used to compare the approaches, radiation dose and exposure time, were checked using the Shapiro-Wilk test. A power analysis determined that inclusion of 30 anterior and 30 lateral cases would provide adequate power to detect a 1-point mean difference, assuming a standard deviation of 1.5 in each group. Both radiation dose and exposure time were nonnormally distributed (W = 0.65, P < .001 and W = 0.86, P < .001, respectively). Medians and interquartile ranges (IQRs) of dose and of time in seconds were computed for the anterior and lateral approaches. Differences in median radiation dose and exposure time between the approaches were assessed with the k-sample test of equality of medians. All analyses were conducted using Stata Version 14.1 (College Station, TX).
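To make the analysis steps concrete, the sketch below reproduces the same sequence (normality check, median and IQR, and a k-sample test of equality of medians) in Python with SciPy rather than Stata; the dose values are invented for illustration and are not the study data.

import numpy as np
from scipy import stats

# Invented radiation doses (mGy) for two groups of cases; not the study data.
anterior = np.array([0.6, 0.8, 1.0, 1.0, 1.2, 1.5, 2.0, 3.5, 5.0, 9.0])
lateral = np.array([1.5, 2.0, 2.5, 3.0, 3.0, 3.5, 4.0, 6.0, 8.0, 12.0])

# Shapiro-Wilk test of normality for each distribution.
w_ant, p_ant = stats.shapiro(anterior)
w_lat, p_lat = stats.shapiro(lateral)

def median_iqr(x):
    # Median and interquartile range, reported when data are nonnormal.
    q1, med, q3 = np.percentile(x, [25, 50, 75])
    return med, (q1, q3)

# Mood's median test, a k-sample test of equality of medians.
stat, p_value, grand_median, table = stats.median_test(anterior, lateral)
print(median_iqr(anterior), median_iqr(lateral), round(p_value, 3))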
Results
Between June 2017 and January 2018, 88 performed cases were reviewed; 30 anterior and 30 lateral approach cases were included in this retrospective comparison study. The remaining 28 cases were excluded because of an inconclusive approach, multiple or bilateral procedures, missing dose and time data, or images inadequately saved to provide meaningful data (Figure 3).
The rate of successful intervention, with needle placement confirmed within the articular space on contrast enhancement, was not significantly different between the study groups: 96.7% (29 of 30) of anterior approach cases and 100% (30 of 30) of lateral approach cases were reported as successful. Overhanging pannus in the viewing area was reported in 5 anterior approach cases and 4 lateral approach cases. Hardware was noted in 2 lateral approach cases and no anterior approach cases. Sedation was used for 3 anterior approach cases and no lateral approach cases.
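The report does not name the test used to compare success rates; as one reasonable illustration for such sparse 2 x 2 data, Fisher's exact test applied to the reported counts gives a clearly nonsignificant result.

from scipy.stats import fisher_exact

# Reported counts: anterior 29/30 successful, lateral 30/30 successful.
table = [[29, 1],   # anterior: successful, unsuccessful
         [30, 0]]   # lateral:  successful, unsuccessful
odds_ratio, p_value = fisher_exact(table)
print(f"Fisher exact P = {p_value:.2f}")  # approximately 1.0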
Patients undergoing the lateral approach received a higher median radiation dose than did those undergoing the anterior approach, but this was not statistically significant (P = .07) (Table). Those undergoing the lateral approach also had a longer median exposure time than did those undergoing the anterior approach, but this also was not statistically significant (P = .3). With no immediate complications reported in any of the studied interventions, there was no difference in complication rates between anterior and lateral approach cases.
Discussion
Pain medicine fellows, who have completed residency in a variety of disciplines, most often anesthesiology or physical medicine and rehabilitation, perform fluoroscopically guided procedures and benefit from increased experience with the coaxial technique, as it improves awareness of needle depth and location. Once mastered, this skill set can be applied to multiple interventional pain procedures. Similar technical instruction emphasizing coaxial technique for hip injections, whether performed via an anterior or anterolateral approach, can be used in both fluoroscopic and ultrasound-guided procedures, including facet injection, transforaminal epidural steroid injection, and myriad other procedures performed to ameliorate pain. There are advantages to pursuing a similar approach across all image-guided procedures. This comparison study evaluates an alternative technique with potential for risk reduction through reduced proximity to neurovascular structures, which may ultimately yield a safer procedure profile.
Using a lateral approach, the interventionalist selects a starting point that enters the skin at a greater distance from any overlying pannus and from the elevated concentration of gram-negative and gram-positive bacteria within the inguinal skin.6 A previous study demonstrated improved success of intra-articular needle tip placement without image guidance in patients with a body mass index (BMI) < 30.7 A prior study comparing an anterior approach using anatomic landmarks with a lateral approach found that the anterior approach pierced or contacted the femoral nerve in 27% of anterior cases and came within 5 mm of the nerve in 60% of anterior cases.2 Use of image guidance, whether ultrasound, fluoroscopy, or computed tomography (CT), is preferred because it reduces the risk of contact with adjacent neurovascular structures. Anatomic surface landmarks have been described as an alternative injection technique that does not use fluoroscopy for initial, intraprocedure, or final placement confirmation.8 Palpation of anatomic structures is required for this nonimage-guided technique. Although similar to the technique described in this study, its starting point is more lateral than that of the anterior approach but not at the most lateral position in the transverse plane used in this fluoroscopically guided lateral approach study.
Physiologic characteristics of subjects and technical aspects of fluoroscopy can both influence radiation dose and exposure time for hip injections. Patient BMI was not included in the data collection; further study would seek to determine whether BMI contributes to any increased radiation dose and exposure time with lateral approach injections. Lateral fluoroscopic images require the X-ray beam to penetrate more tissue than anterior-posterior images do. Further study of these techniques would benefit from examining the pulse rate of fluoroscopic imaging and collimation (focusing of the radiation beam over a smaller area of tissue) as factors in any observed increase in total radiation dose and exposure time.
Improving the safety profile of this procedure could have a positive impact on the patient population receiving fluoroscopic hip injections, both within the VA Ann Arbor Healthcare System and elsewhere. Although the study population was limited to VA patients seeking subspecialty nonsurgical joint care at a single tertiary care center, the technique is generalizable and can be used in most patients, as hip pain is a common condition necessitating nonoperative evaluation and treatment.
Radiation Exposures
As our analysis demonstrates, the radiation dose for each group was consistent with low (≤ 3 mSv) to moderate (> 3-20 mSv) annual effective doses in the general population.7 The anterior and lateral median radiation doses of 1 mGy and 3 mGy, respectively, are on the order of the standard exposure for a radiograph of the pelvis (1.31 mGy).9 It is therefore reasonable to consider a lateral approach for hip injection, given the benefits of a direct coaxial approach and of avoiding needle entry through skin with higher bacterial concentrations.
The lateral approach did have a higher radiation dose and longer exposure time, although neither was statistically significantly greater than for the anterior approach, and the differences between the techniques were not clinically significant. One potential explanation is that the lateral technique requires the beam to penetrate more tissue; the associated dose can be reduced with collimation and other fluoroscopic image adjustments. Additionally, as trainees progress in competency, fewer images should need to be obtained.7 We hypothesize that as familiarity and comfort with this technique increase, the number of images necessary for successful injection will decrease, lowering radiation dose and exposure time. We would expect that in the hands of a board-certified interventionalist, radiation dose and exposure time would be significantly lower than in our current dataset, and this is an area of planned further study. In the existing dataset, the majority of procedures were performed with trainees, and the information documented was inadequate to compare dose against procedural experience over time for individual physicians.
Notable strengths of this study include the direct comparison of the anterior and lateral approaches with regard to radiation dose and exposure time, which we have not seen described in the literature. A detailed description of the technique may encourage its use by other providers. Data were collected from multiple providers, as both board-certified pain physicians and board-eligible interventional pain fellows performed the procedures. This variability increases the generalizability of the findings, with a range of providers, disciplines, years of experience, and types of training represented.
Limitations
Limitations include the retrospective nature of the study and the relatively small sample size. Even so, it is notable that no statistically significant differences were observed in median radiation dose or fluoroscopy exposure time, supporting the lateral approach as, at minimum, a noninferior technique. Combined with its potential safety advantages, this technique is a viable alternative to the traditional anterior-oblique approach. Further study should be performed, such as a prospective randomized controlled trial comparing the 2 techniques and following pain scores and functional ability after the procedure.
Conclusion
Given the decreased procedural risk related to proximity of neurovascular structures and the use of a coaxial technique for needle advancement, the lateral approach to hip injection should be considered by practitioners in any discipline performing fluoroscopically guided procedures. The lateral technique may be particularly useful in technically challenging cases and when skin entry at the anterior groin is suboptimal, serving as a noninferior alternative to the traditional anterior method.
1. Cianfoni A, Boulter DJ, Rumboldt Z, Sapton T, Bonaldi G. Guidelines to imaging landmarks for interventional spine procedures: fluoroscopy and CT anatomy. Neurographics. 2011;1(1):39-48.
2. Leopold SS, Battista V, Oliverio JA. Safety and efficacy of intraarticular hip injection using anatomic landmarks. Clin Orthop Relat Res. 2001;(391):192-197.
3. Dodré E, Lefebvre G, Cockenpot E, Chastanet P, Cotten A. Interventional MSK procedures: the hip. Br J Radiol. 2016;89(1057):20150408.
4. Hankey S, McCall IW, Park WM, O’Connor BT. Technical problems in arthrography of the painful hip arthroplasty. Clin Radiol. 1979;30(6):653-656.
5. Yasar E, Singh JR, Hill J, Akuthota V. Image-guided injections of the hip. J Nov Physiother Phys Rehabil. 2014;1(2):39-48.
6. Aly R, Maibach HI. Aerobic microbial flora of intertrigenous skin. Appl Environ Microbiol. 1977;33(1):97-100.
7. Fazel R, Krumholz HM, Wang W, et al. Exposure to low-dose ionizing radiation from medical imaging procedures. N Engl J Med. 2009;361(9):849-857.
8. Masoud MA, Said HG. Intra-articular hip injection using anatomic surface landmarks. Arthrosc Tech. 2013;2(2):e147-e149.
9. Ofori K, Gordon SW, Akrobortu E, Ampene AA, Darko EO. Estimation of adult patient doses for selected x-ray diagnostic examinations. J Radiat Res Appl Sci. 2014;7(4):459-462.