Hospital medicine resident training tracks: Developing the hospital medicine pipeline

Joseph R. Sweigart, MD
Division of Hospital Medicine, University of Kentucky

The field of hospital medicine (HM) is rapidly expanding in the areas of clinical medicine, administration, and quality improvement (QI).1 Emerging with this growth is a gap in the traditional internal medicine (IM) training and skills needed to be effective in HM.1,2 These skills include clinical and nonclinical aptitudes, such as process improvement, health care economics, and leadership.1-3 However, resident education on these topics must compete with other required curricular content in IM residency training.2,4 Few IM residencies offer focused HM training that emphasizes key components of successful HM careers.3,5

Within the past decade, designated HM tracks within IM residency programs have been proposed as a potential solution. Initially, calls for such tracks focused on gaps in the clinical competencies required of hospitalists.1 Tracks have since evolved to also include skills required to drive high-value care, process improvement, and scholarship. Designated HM tracks address these areas through greater breadth of curricula, additional time for reflection, participation in group projects, and active application to clinical care.4 We conducted a study to identify themes that could inform the ongoing evolution of dedicated HM tracks.

METHODS

Programs were initially identified through communication among professional networks. The phrases “hospital medicine residency track” and “internal medicine residency hospitalist track” were used in broader Google searches, as there is no database of such tracks. Searches were performed quarterly during the 2015–2016 academic year. The top 20 hits were manually filtered to identify tracks affiliated with major academic centers. IM residency program websites provided basic information for programs with tracks. We excluded tracks focused entirely on QI6 because, though a crucial part of HM, QI training alone is probably insufficient for preparing residents for success as hospitalists on residency completion. Similarly, IM residencies with stand-alone HM clinical rotations without longitudinal HM curricula were excluded.

Semistructured interviews with track directors were conducted by e-mail or telephone for all tracks except one, the details of which are published.7 We tabulated data and reviewed qualitative information to identify themes among the different tracks. As this study did not involve human participants, Institutional Review Board approval was not needed.

RESULTS

We identified 11 HM residency training programs at major academic centers across the United States: Cleveland Clinic, Stanford University, Tulane University, University of California Davis, University of California Irvine, University of Colorado, University of Kentucky, University of Minnesota, University of New Mexico, Virginia Commonwealth University, and Wake Forest University (Table 1). We reviewed the websites of about 10 other programs, but none suggested the existence of a track. Additional programs we contacted reported having no current track.

Table 1. Demographic and structural characteristics of current hospital medicine tracks

Track Participants and Structure

HM tracks mainly target third-year residents (Table 1). Some extend into the second year of residency, and 4 have opportunities for intern involvement, including a separate match number at Colorado. Tracks accept up to 12 residents per class. Two programs, at Colorado and Virginia, are part of IM programs in which all residents belong to a track (eg, HM, primary care, research).

HM track structures vary widely and are heavily influenced by the content delivery platforms of their IM residency programs. Several HM track directors emphasized the importance of fitting into existing educational frameworks to ensure access to residents and to minimize the burden of participation. Four programs deliver the bulk of their nonclinical content in dedicated blocks; 6 others use brief recurring sessions to deliver smaller aliquots longitudinally (Table 1). The number of protected hours for content delivery ranges from 10 to more than 40 annually. All tracks use multiple content delivery modes, including didactic sessions and journal clubs. Four tracks employ panel discussions to explore career options within HM. Several also use online platforms, including discussions, readings, and modules.

Quality Improvement

The vast majority of curricula prominently feature experiential QI project involvement (Table 2). These mentored longitudinal projects allow applied delivery of content, such as QI methods and management skills. Four tracks use material from the Institute for Healthcare Improvement.8 Several also offer dedicated QI rotations that immerse residents in ongoing QI efforts.

Table 2. Curricular content delivered in current hospital medicine tracks

Institutional partnerships support these initiatives at several sites. The Minnesota track is a joint venture of the university and Regions Hospital, a nonprofit community hospital. The Virginia track positions HM residents to lead university-wide interdisciplinary QI teams. For project support, the Colorado and Kentucky tracks partner with local QI resources—the Institute for Healthcare Quality, Safety, and Efficiency at Colorado and the Office of Value and Innovation in Healthcare Delivery at Kentucky.

Health Care Economics and Value

Many programs leverage the rapidly growing emphasis on health care “value” as an opportunity for synergy between IM programs and HM tracks. Examples include involving residents in documentation improvement efforts and providing didactic instruction on topics such as health care finance. The New Mexico and Wake Forest tracks offer elective rotations on health care economics. Several track directors mentioned successfully expanding health care value curricula from the HM track into the IM residency program at large, which provides a measurable service to the residency program while ensuring content delivery and freeing additional time for track activities.

Scholarship and Career Development

Most programs provide targeted career development for residents. Six tracks provide sessions on job procurement skills, such as curriculum vitae preparation and interviewing (Table 2). Many also provide content on venues for disseminating scholarly activity. The Colorado, Kentucky, New Mexico, and Tulane programs feature content on abstract and poster creation. Leadership development is addressed in several tracks through dedicated track activities or participation in discrete, outside-track events. Specifically, Colorado offers a leadership track for residents interested in hospital administration, Cleveland has a leadership journal club, Wake Forest enrolls HM residents in leadership training available through the university, and Minnesota sends residents to the Society of Hospital Medicine’s Leadership Academy (Table 2).

Clinical Rotations

Almost all tracks include a clinical rotation, typically pairing residents directly with hospitalist attendings to encourage autonomy and mentorship. Several also offer elective rotations in various disciplines within HM (Table 2). The Kentucky and Virginia tracks incorporate working with advanced practice providers into their practicums. The Cleveland, Minnesota, Tulane, and Virginia tracks offer HM rotations in community hospitals or postacute settings.

HM rotations also pair clinical experiences with didactic education on relevant topics (eg, billing and coding). The Cleveland, Minnesota, and Virginia tracks developed clinical rotations that mirror the common 7-days-on/7-days-off schedule, with nonclinical obligations, such as seminars linking specific content to clinical experiences, scheduled during the off weeks.

DISCUSSION

Our investigation into the current state of HM training found that HM track curricula focus largely on QI, health care economics, and professional development. This focus likely developed in response to hospitalists’ increasing engagement in related endeavors. HM tracks have dynamic and variable structures, reflecting an evolving field and the need to fit into existing IM residency program structures. Similarly, the content covered in HM tracks is tightly linked to perceived opportunities within IM residency curricula. The heterogeneity of content suggests the breadth and ambiguity of necessary competencies for aspiring hospitalists. One of the 11 tracks has not had any residents enroll within the past few years—a testament to the continued effort necessary to sustain such tracks, including curricular updates and recruiting. Conversely, many programs now share track content with the larger IM residency program, suggesting HM tracks may be near the forefront of medical education in some areas.

Our study had several limitations. As we are unaware of any databases of HM tracks, we discussed tracks with professional contacts, performed Internet searches, and reviewed IM residency program websites. Our search, however, was not exhaustive; despite our best efforts, we may have missed or mischaracterized some track offerings. Nevertheless, we think that our analysis represents the first thorough compilation of HM tracks and that it will be useful to institutions seeking to create or enhance HM-specific training.

As the field continues to evolve, we are optimistic about the future of HM training. We suspect that HM residency training tracks will continue to expand. More work is needed so these tracks can adjust to the changing HM and IM residency program landscapes and supply well-trained physicians for the HM workforce.

Acknowledgment

The authors thank track directors Alpesh Amin, David Gugliotti, Rick Hilger, Karnjit Johl, Nasir Majeed, Georgia McIntosh, Charles Pizanis, and Jeff Wiese for making this study possible.

Disclosure

Nothing to report.

References

1. Glasheen JJ, Siegal EM, Epstein K, Kutner J, Prochazka AV. Fulfilling the promise of hospital medicine: tailoring internal medicine training to address hospitalists’ needs [published correction appears in J Gen Intern Med. 2008;23(11):1931]. J Gen Intern Med. 2008;23(7):1110-1115.
2. Arora V, Guardiano S, Donaldson D, Storch I, Hemstreet P. Closing the gap between internal medicine training and practice: recommendations from recent graduates. Am J Med. 2005;118(6):680-685.
3. Glasheen JJ, Goldenberg J, Nelson JR. Achieving hospital medicine’s promise through internal medicine residency redesign. Mt Sinai J Med. 2008;75(5):436-441.
4. Wiese J. Residency training: beginning with the end in mind. J Gen Intern Med. 2008;23(7):1122-1123.
5. Glasheen JJ, Epstein KR, Siegal E, Kutner JS, Prochazka AV. The spectrum of community-based hospitalist practice: a call to tailor internal medicine residency training. Arch Intern Med. 2007;167(7):727-728.
6. Patel N, Brennan PJ, Metlay J, Bellini L, Shannon RP, Myers JS. Building the pipeline: the creation of a residency training pathway for future physician leaders in health care quality. Acad Med. 2015;90(2):185-190.
7. Kumar A, Smeraglio A, Witteles R, et al. A resident-created hospitalist curriculum for internal medicine housestaff. J Hosp Med. 2016;11(9):646-649.
8. Institute for Healthcare Improvement website. http://www.ihi.org. Accessed December 15, 2015.

Journal of Hospital Medicine. 2017;12(3):173-176. © 2017 Society of Hospital Medicine

Correspondence: Joseph R. Sweigart, MD, Internal Medicine, Albert B. Chandler Hospital, University of Kentucky, 800 Rose St, MN602, Lexington, KY 40536-0294; Telephone: 859-323-6047; Fax: 859-257-3873; E-mail: [email protected]

Delayed ICU Transfer Affects Mortality and Length of Stay

Clinical Question: Can an objective measurement of critical illness inform intensive care unit (ICU) transfer timeliness?

Background: Early intervention has shown mortality benefit in many critical illness syndromes, yet heterogeneity in timing of ICU transfer exists. Previous studies examining ICU transfer timeliness have mostly focused on subjective criteria.

Study Design: Retrospective observational cohort study.

Setting: Medical-surgical units at five hospitals including the University of Chicago and NorthShore University HealthSystem in Illinois.

Synopsis: All medical-surgical ward patients between November 2008 and January 2013 were scored using eCART, a previously validated objective scoring system, to identify when critical illness developed. Of those, 3,789 patients reached the predetermined threshold for critical illness. Transfers occurring more than six hours after the threshold was crossed were considered delayed. Patients with delayed transfer had a statistically significant increase in length of stay (LOS) and in-hospital mortality (33.2% versus 24.5%; P < 0.001), and the mortality increase was linear, with a 3% increase in odds for each additional hour of transfer delay (P < 0.001). The rate of change of the eCART score did influence time of transfer, and the authors suggest that rapid changes were more likely to be recognized. They postulate that routine implementation of eCART or a similar objective score may lead to earlier recognition of the need for ICU transfer and thus improve mortality and LOS, and they suggest this as a topic for future trials.
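As a rough illustration of how the reported per-hour effect accumulates, the sketch below compounds the 3% per-hour increase in mortality odds (an odds ratio of 1.03 per hour of delay) multiplicatively. The log-linear form and the example delays are assumptions made for illustration, not results reported by the study.

# Illustration only: compounds the reported 3% per-hour increase in mortality
# odds under an assumed log-linear model; the delays are hypothetical examples.
def cumulative_odds_ratio(hours_delayed, or_per_hour=1.03):
    return or_per_hour ** hours_delayed

for hours in (1, 6, 12, 24):
    print(f"{hours:2d} h delay -> cumulative mortality odds ratio {cumulative_odds_ratio(hours):.2f}")
# Approximate output: 1 h -> 1.03, 6 h -> 1.19, 12 h -> 1.43, 24 h -> 2.03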

Bottom Line: Delayed ICU transfer negatively affects LOS and in-hospital mortality. Objective criteria may identify more appropriate timing of transfer. Clinical trials to investigate this are warranted.

Citation: Churpek MM, Wendlandt B, Zadravecz FJ, Adhikari R, Winslow C, Edelson DP. Association between intensive care unit transfer delay and hospital mortality: a multicenter investigation [published online ahead of print June 28, 2016]. J Hosp Med. doi:10.1002/jhm.2630.

Short Take

Intranasal Live Attenuated Influenza Vaccine Not Recommended

The Centers for Disease Control and Prevention recommends against use of the nasal spray live attenuated influenza vaccine. This is based on data showing poor effectiveness in prior years.

Citation: ACIP votes down use of LAIV for 2016-2017 flu season [press release]. CDC website.

The Hospitalist. 2016(10).

IV Fluid Can Save Lives in Hemodynamically Stable Patients with Sepsis

Clinical Question: Does increased fluid administration improve outcomes in patients with sepsis and intermediate lactate levels?

Background: The Surviving Sepsis Campaign bundle, which reduces mortality among ED patients, targets patients with hypotension or lactate levels >4 mmol/L. No similar optimal treatment strategy exists for patients with less severe sepsis, even though such patients are more common in hospitalized populations.

Study Design: Retrospective study of a quality improvement bundle.

Setting: 21 community-based hospitals in the Kaiser Permanente Northern California system.

Synopsis: This study evaluated implementation of a treatment bundle for 18,122 hemodynamically stable sepsis patients presenting to the ED with lactate levels between 2 and 4 mmol/L during the 12 months prior to and after bundle implementation. The bundle included antibiotic administration within three hours, repeated lactate levels within four hours, and 30 mL/kg or ≥2 L of intravenous fluids within three hours of initial lactate result. Patients with kidney disease and/or heart failure were separately evaluated because of the perceived risk of fluid administration.
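To make the fluid component of the bundle concrete, the short sketch below computes the weight-based target of 30 mL/kg described above; per the text, giving at least 2 L within three hours also satisfies the bundle. The helper function and the example weights are hypothetical illustrations, not part of the study protocol.

# Illustration only: the bundle's weight-based fluid volume (30 mL/kg).
# The helper and example weights are hypothetical; giving >=2 L is the
# alternative threshold described in the text.
def weight_based_fluid_ml(weight_kg):
    return 30 * weight_kg

for weight_kg in (50, 70, 100):
    volume_ml = weight_based_fluid_ml(weight_kg)
    print(f"{weight_kg} kg patient -> {volume_ml} mL of intravenous fluid over three hours")
# 50 kg -> 1500 mL, 70 kg -> 2100 mL, 100 kg -> 3000 mL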

Treatment after bundle implementation was associated with an adjusted hospital mortality odds ratio of 0.81 (95% CI, 0.66–0.99; P = 0.04). Significant reductions in hospital mortality were observed in patients with heart failure and/or kidney disease (P < 0.01) but not in patients without these conditions (P > 0.4). This finding correlated with increased fluid administration to patients with heart failure and/or kidney disease after bundle implementation. Because this was not a randomized controlled study, the results are subject to bias and confounding.

Bottom Line: Increased fluid administration was associated with lower hospital mortality in patients with kidney disease and/or heart failure presenting with sepsis and intermediate lactate levels.

Reference: Liu V, Morehouse JW, Marelich GP, et al. Multicenter implementation of a treatment bundle for patients with sepsis and intermediate lactate values. Am J Respir Crit Care Med. 2016;193(11):1264-1270.

Short Take

New Framework for Learners’ Clinical Reasoning

A qualitative study involving 37 emergency medicine residents found that clinical reasoning through individual cases progresses from case framing (phase 1) to pattern recognition (phase 2), then self-monitoring (phase 3).

Citation: Adams E, Goyder C, Heneghan C, Brand L, Ajjawi R. Clinical reasoning of junior doctors in emergency medicine: a grounded theory study [published online ahead of print June 23, 2016]. Emerg Med J. doi:10.1136/emermed-2015-205650.

The Hospitalist. 2016(10).

Real-World Safety and Effectiveness of Oral Anticoagulants for Afib

Clinical Question: Which oral anticoagulants are safest and most effective in nonvalvular atrial fibrillation?

Background: Use of direct oral anticoagulants (DOACs) has been increasing since their introduction and widespread marketing. While dosing is a challenge for warfarin, certain medical conditions limit the use of DOACs. Choosing the optimal oral anticoagulant becomes more difficult as patient complexity increases.

Study Design: Nationwide observational cohort study.

Setting: Three national Danish databases, from August 2011 to October 2015.

Synopsis: The authors reviewed data from 61,678 patients with nonvalvular atrial fibrillation who were new to oral anticoagulants. The study compared the efficacy, safety, and patient characteristics of DOACs and warfarin. Ischemic stroke, systemic embolism, and death were evaluated separately and as a composite measure of efficacy. Any bleeding, intracranial bleeding, and major bleeding were measured as safety outcomes. Patients taking DOACs were younger and had lower CHA2DS2-VASc and HAS-BLED scores. No significant difference in risk of ischemic stroke was identified between DOACs and warfarin. Rivaroxaban was associated with lower rates of ischemic stroke and systemic embolism but had bleeding rates similar to those with warfarin. Rates of any bleeding and major bleeding were lowest with dabigatran and apixaban. All-cause mortality was lowest in the dabigatran group and highest in the warfarin group.
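For readers unfamiliar with the stroke-risk score mentioned above, the sketch below computes a standard CHA2DS2-VASc score from its published components; it is an illustrative helper, not code or data from the study, and the example patient is hypothetical.

# Illustration only: components of the standard CHA2DS2-VASc stroke-risk score.
# This helper and the example patient are hypothetical, not from the study.
def cha2ds2_vasc(age, female, chf, hypertension, diabetes, prior_stroke_tia, vascular_disease):
    score = 0
    score += 1 if chf else 0                                    # Congestive heart failure
    score += 1 if hypertension else 0                           # Hypertension
    score += 2 if age >= 75 else (1 if 65 <= age < 75 else 0)   # Age: 75+ scores 2, 65-74 scores 1
    score += 1 if diabetes else 0                               # Diabetes mellitus
    score += 2 if prior_stroke_tia else 0                       # Prior stroke/TIA/thromboembolism
    score += 1 if vascular_disease else 0                       # Vascular disease
    score += 1 if female else 0                                 # Sex category (female)
    return score

# Example: a 72-year-old woman with hypertension and diabetes scores 4.
print(cha2ds2_vasc(age=72, female=True, chf=False, hypertension=True,
                   diabetes=True, prior_stroke_tia=False, vascular_disease=False))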

Limitations were the retrospective, observational study design, with an average follow-up of only 1.9 years.

Bottom Line: All DOACs appear to be safer and more effective alternatives to warfarin. Oral anticoagulant selection should be based on the individual patient’s clinical profile.

Citation: Larsen TB, Skjoth F, Nielsen PB, Kjaeldgaard JN, Lip GY. Comparative effectiveness and safety of non-vitamin K antagonist oral anticoagulants and warfarin in patients with atrial fibrillation: propensity weighted nationwide cohort study. BMJ. 2016;353:i3189.

Short Take

Mortality and Long-Acting Opiates

This retrospective cohort study raises questions about the safety of long-acting opioids for chronic noncancer pain. When compared with anticonvulsants or antidepressants, the adjusted hazard ratio was 1.64 for total mortality.

Citation: Ray W, Chung CP, Murray KT, Hall K, Stein CM. Prescription of long-acting opioids and mortality in patients with chronic noncancer pain. JAMA. 2016;315(22):2415-2423.

The Hospitalist. 2016(10).

Prescribing Naloxone for Patients on Long-Term Opioid Therapy

Clinical Question: Does naloxone co-prescription for patients on long-term opioids for pain prevent opioid-related adverse events?

Background: Unintentional opioid overdose is a major public health issue. Studies have shown that providing naloxone to at-risk patients reduces overdose mortality. The CDC recommends considering a naloxone prescription for high-risk patients. This study focused on patient education and provider prescribing habits rather than on simply making naloxone available.

Study Design: Non-randomized interventional study.

Setting: Six safety-net primary-care clinics in San Francisco.

Synopsis: The authors identified 1,985 adults on long-term opioid therapy, 759 of whom were prescribed naloxone. Providers were encouraged to prescribe naloxone along with opioids, and patients were educated on use of the intranasal naloxone device. Outcomes included opioid-related emergency department (ED) visits and prescribed opioid dosage. Patients on higher opioid doses and those with opioid-related ED visits in the prior 12 months were more likely to be prescribed naloxone. Compared with patients who were not prescribed naloxone, those who received naloxone had 47% fewer ED visits per month in the first six months and 63% fewer ED visits over 12 months. Limitations include the lack of randomization and the restriction to a single safety-net system.
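
The "percent fewer visits" framing corresponds to an incidence rate ratio (IRR) below 1. The short Python sketch below uses hypothetical visit counts and person-months (not figures from the study, which relied on adjusted regression models) purely to show how an IRR maps onto a statement like "47% fewer ED visits per month."

```python
# Illustrative only: hypothetical counts showing how an incidence rate ratio
# (IRR) translates into "X% fewer ED visits per month." The study itself used
# adjusted models, so these numbers are not taken from its data.

def visits_per_month(visits: int, person_months: float) -> float:
    """Crude ED-visit rate per person-month of follow-up."""
    return visits / person_months

naloxone_rate = visits_per_month(visits=40, person_months=4_500)
comparison_rate = visits_per_month(visits=125, person_months=7_400)

irr = naloxone_rate / comparison_rate
print(f"IRR = {irr:.2f} -> {100 * (1 - irr):.0f}% fewer visits per month")
# IRR = 0.53 -> 47% fewer visits per month
```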

Hospitalists can identify high-risk patients and consider co-prescribing naloxone to reduce ED visits and perhaps readmissions. Further studies focusing on patients discharged from the hospital are needed.

Bottom Line: Naloxone prescription in patients on long-term opioid treatment may prevent opioid-related ED visits.

Citation: Coffin PO, Behar E, Rowe C, et al. Nonrandomized intervention study of naloxone coprescription for primary care patients receiving long-term opioid therapy for pain. Ann Intern Med. 2016;165(4):245-252.


Palliative Care May Improve End-of-Life Care for Patients with ESRD, Cardiopulmonary Failure, Frailty

Clinical Question: Is there a difference in family-rated quality of care for patients dying with different serious illnesses?

Background: End-of-life care has focused largely on cancer patients. However, other conditions lead to more deaths than cancer in the United States.

Study Design: A retrospective cross-sectional study.

Setting: 146 inpatient Veterans Affairs (VA) facilities.

Synopsis: This study included 57,753 patients who died in inpatient facilities with a diagnosis of cancer, dementia, end-stage renal disease (ESRD), cardiopulmonary failure (heart failure or chronic obstructive pulmonary disease), or frailty. Measures included palliative care consultations, do-not-resuscitate (DNR) orders, death in inpatient hospice, death in the intensive care unit (ICU), and family-reported quality of end-of-life care. Palliative care consultations were provided to 73.5% of patients with cancer and 61.4% of patients with dementia, significantly more often than for patients with other diagnoses (P < .001).

Approximately one-third of patients with diagnoses other than cancer or dementia died in the ICU, more than double the rate among patients with cancer or dementia (P < .001). Rates of excellent family-rated end-of-life care were similar for patients with cancer and dementia (59.2% and 59.3%) but lower for the other conditions (P = .02 compared with patients with cancer). This difference was mediated by palliative care consultation, setting of death, and DNR status. Difficulty defining frailty and restriction to the VA system are limitations of this study.

Bottom Line: Increasing access to palliative care, goals-of-care discussions, and preferred setting of death may improve overall quality of end-of-life care.

Citation: Wachterman MW, Pilver C, Smith D, Ersek M, Lipsitz SR, Keating NL. Quality of end-of-life care provided to patients with different serious illnesses. JAMA Intern Med. 2016;176(8):1095-1102. doi:10.1001/jamainternmed.2016.1200.


How Should Hospitalists Manage Elderly Patients with Dysphagia?

The Case

A 74-year-old man with Alzheimer’s dementia presents with urinary tract infection (UTI), hypovolemia, and hypernatremia. He also has chronic dysphagia with a history of aspiration pneumonia and has been on thickened liquids at home for the past five months. As his infection is treated, he improves and requests water to drink.

Background

Dysphagia is a common problem, particularly among elderly patients. Its exact prevalence is unknown, but it is estimated to affect up to 30% of the elderly population. Dysphagia is defined as difficulty or discomfort in swallowing and is traditionally classified as either oropharyngeal or esophageal in origin. Normal aging and chronic illness may decrease connective tissue elasticity, muscle mass, and oral secretions, all of which impair swallowing performance.1 Stroke is a common predisposing condition.2 Dysphagia predisposes patients to dehydration, malnutrition, and electrolyte derangements. The most feared immediate complication is aspiration pneumonia (AP) resulting from impaired clearance of oral secretions.3

The diagnosis of dysphagia is clinical, and assessments from patients and family are often sufficient. The optimal test to assess the severity of dysphagia is a bedside swallow evaluation using small amounts of water.1 Video-assisted fluoroscopic examinations can identify problem areas within the oropharynx and esophagus and may help determine the etiology of dysphagia.

What evidence supports various treatment options for dysphagia?

Access to Water

Water is a thin liquid with low viscosity, which allows for rapid transit through the oropharynx. In debilitated and elderly patients, thin liquids easily reach the epiglottis and enter the trachea before pharyngeal muscles compensate. As such, access to water and other thin liquids is often restricted in patients suspected to have dysphagia.4

However, allowing access to water improves patient satisfaction, reduces the development of dehydration, and does not increase the incidence of AP. Bedside interventions such as correct positioning, chin-tuck and sipping techniques, and attention to oral hygiene are recommended before less palatable options such as thickened liquids.1 The Frazier water protocol may provide logistical guidance for facilities interested in improving access to water for patients with dysphagia.

Liquid Modification

Many clinicians manage dysphagia by restricting access to all thin liquids. In the hospital setting, where video fluoroscopy and speech therapy are readily available, clinicians frequently use modified diets with thickened liquids to minimize the risk of aspiration despite the lack of high-quality evidence supporting liquid modification.2 Patients associate thickened liquids and restricted diets with reduced quality of life, and compliance studies have shown that only a minority of patients remain compliant with thickened liquids at five days. In addition, thickening liquids has not been shown to decrease the risk of AP or improve nutritional status, and it may actually cause harm by increasing the risk of dehydration and UTI.4

Tube Feeding

In patients with severe dysphagia in whom conservative management is not feasible or has failed, maintaining adequate nutrition can be a challenge. There are encouraging data on nutritionally enriching and modifying the texture of solid foods.1 Alternative routes of enteral nutrition are also often considered, most commonly nasogastric tubes, post-pyloric feeding tubes, and percutaneous endoscopic gastrostomy (PEG) tubes. In theory, bypassing the pharynx and esophagus could result in fewer aspiration events and less AP.3 However, nasogastric, post-pyloric, and PEG feeding have not been shown to decrease the risk of AP, and for patients with advanced dementia, no randomized trials have demonstrated a mortality benefit with tube feeding.4 Tube feeding also carries a small procedural risk, a high incidence of associated diarrhea, and an association with electrolyte derangements such as hypernatremia. The decision to pursue tube feeding should be weighed carefully for every patient and is heavily influenced by the etiology and anticipated duration of dysphagia.

Selective Digestive Decontamination

Selective digestive decontamination (SDD) is a protocol-based treatment that aims to eradicate potentially pathogenic gut flora, particularly aerobic gram-negative organisms, in critically ill patients to reduce the impact of aspiration events. The use of SDD and the available literature center firmly on critically ill, ventilated patients. Subsequent studies have demonstrated recolonization after protocol cessation, and long-term effects remain undefined.5 Until it is studied in broader populations and shown to have clinical benefit, employing SDD in non-critically ill patients with dysphagia remains unsupported.

Multimodal Approach

Many rehabilitation centers incorporate a therapist-driven swallowing treatment program. Evidence suggests patient and family counseling alone may not be effective, so these programs variably incorporate diet/liquid modification, strengthening exercises, sensory processing techniques, and even neuromuscular electrical stimulation for muscle building.1 Accordingly, these programs are resource-intensive.

Management

Dysphagia remains a significant clinical problem for hospitalized patients. The existing literature and practice guidelines generally support a “less is more” approach. Though liquid/diet modification is common practice, it is not based in solid evidence and may contribute to unnecessary tube feeding. The best current evidence supports allowing access to water and ice chips. The ideal management plan for each patient will differ and should incorporate patient and family preferences in a multidisciplinary approach.

Back to the Case

Our patient requests water. He coughs after drinking during a bedside swallow evaluation. The risks of aspiration and AP are explained, and he expresses his understanding. He reiterates his choice to be allowed access to water because it is important to his quality of life. The speech therapy team is consulted and provides instruction on chin-tuck positioning, oral care, and timing water between meals rather than with food. He does well for the remainder of the hospital stay; by the time of discharge, his electrolytes have corrected, and he is much more comfortable now that he is allowed to drink water. He is discharged home and encouraged to continue these conservative measures.

Bottom Line

Evidence to support many common interventions for dysphagia is lacking; patients with dysphagia are best managed via a multidisciplinary, multimodal approach that provides access to water whenever possible. TH


Vijay G. Paryani, MD, is an internal medicine resident in the department of internal medicine at the University of Kentucky. Joseph R. Sweigart, MD, is a hospitalist and assistant professor of hospital medicine in the division of hospital medicine at the University of Kentucky. Laura C. Fanucchi, MD, is a hospitalist and assistant professor of hospital medicine in the division of hospital medicine at the University of Kentucky.

References

  1. Karagiannis MJ, Chivers L, Karagiannis TC. Effects of oral intake of water in patients with oropharyngeal dysphagia. BMC Geriatr. 2011;11(2):9.
  2. Foley N, Teasell R, Salter K, Kruger E, Martino R. Dysphagia treatment post stroke: a systematic review of randomized controlled trials. Age Ageing. 2008;37(3):258-264.
  3. Marik PE. Aspiration pneumonitis and aspiration pneumonia. N Engl J Med. 2001;344(9):665-671.
  4. Loeb MB, Becker M, Eady A, Walker-Dilks C. Interventions to prevent aspiration pneumonia in older adults: a systematic review. J Am Geriatr Soc. 2003;51(7):1018-1022.
  5. Gosney M, Martin MV, Wright AE. The role of selective decontamination of the digestive tract in acute stroke. Age Ageing. 2006;35(1):42-47.

Key Takeaways

  1. Free access to ice chips and water is beneficial and important to quality of life in the setting of dysphagia.
  2. Minimal evidence exists to support many common interventions for dysphagia.
  3. Thickening liquids has sparse, low-quality evidence supporting use and some evidence of harm, including increased rate of dehydration, UTI, and chronic thirst.
  4. Multidisciplinary approaches incorporating patient and family preferences are optimal.

Additional Reading

  1. Frazier water protocol. KentuckyOne Health website.
  2. Karagiannis M, Karagiannis TC. Oropharyngeal dysphagia, free water protocol and quality of life: an update from a prospective clinical trial. Hell J Nucl Med. 2014;17 Suppl 1:26-9.
  3. Loeb MB, Becker M, Eady A, Walker-Dilks C. Interventions to prevent aspiration pneumonia in older adults: a systematic review. J Am Geriatr Soc. 2003;51(7):1018-1022.


Updated Guideline for Acute Diarrheal Infection

Clinical Question: What are current recommendations for diagnosis, management, and prevention of acute gastrointestinal infection in immune-competent adults?

Background: Acute diarrheal infection is a leading cause of healthcare visits and lost quality of life. The Centers for Disease Control and Prevention estimates that 47.8 million cases occur annually, with a burden on the healthcare economy of $150 million.

Study Design: American College of Gastroenterology (ACG) practice guideline.

Setting: Expert panel.

Synopsis: Stool diagnostic studies may be used for dysentery, moderate-to-severe disease, and symptoms lasting more than seven days (strong recommendation, low level of evidence). Traditional diagnostic methods fail to reveal the etiology in most cases (strong recommendation, low level of evidence). Treatment with probiotics or prebiotics is not recommended (strong recommendation, moderate level of evidence). Bismuth subsalicylate may be considered for prophylaxis against traveler's diarrhea (strong recommendation, high level of evidence). Short-term antibiotic chemoprophylaxis also may be considered for high-risk groups (strong recommendation, high level of evidence). Empiric antimicrobial therapy is not recommended except in cases of traveler's diarrhea (strong recommendation, high level of evidence). Loperamide may be used as an adjunct to antibiotics for traveler's diarrhea (strong recommendation, moderate level of evidence).

Bottom Line: ACG acute diarrheal illness guidelines have been updated. Few recommendations are strong, and very few have high levels of evidence.

Citation: Riddle MS, DuPont HL, Conner BA. ACG clinical guideline: diagnosis, treatment, and prevention of acute diarrheal infections in adults. Am J Gastroenterol. 2016;111(5):602-622.


Risk-Assessment Models Are Unreliable Predictors of Venous Thromboembolism

Clinical Question: Do risk-assessment models (RAMs) accurately predict which hospitalized medical patients are at risk for venous thromboembolism (VTE)?

Background: Predicting which patients are at high risk for VTE is important. Several models exist, but limited data support their generalizability and accuracy in medical inpatients.

Study Design: Retrospective cohort.

Setting: Hospitals participating in the Michigan Hospital Medicine Safety Consortium (MHMSC).

Synopsis: Data collected through the MHMSC for selected medical patients were applied to the Kucher, Padua, predictive IMPROVE, and Intermountain DVT risk-assessment models. Patients were classified as "low risk" or "at risk" by each RAM. Follow-up data came from chart extraction (100% of patients) and 90-day post-discharge telephone calls (58% of patients). The primary outcome was image-confirmed, hospital-associated VTE, including proximal upper- or lower-extremity DVT and pulmonary embolism. The RAMs classified fewer than 20% of patients as "at risk," and the incidence of VTE was less than 1%. In this external validation study, the Kucher RAM discriminated least well and the Intermountain RAM performed best, but none reproduced the results of the original derivation studies.
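
For readers unfamiliar with how these RAMs dichotomize patients, the sketch below implements one of them, the Padua Prediction Score, using the point weights and "at risk" threshold (score of 4 or higher) as they are commonly published; the weights are general background for illustration, not data from this validation study.

```python
# A minimal sketch of the Padua Prediction Score, one RAM evaluated in this
# study, using commonly published weights; shown only to illustrate how such
# models dichotomize medical inpatients into "at risk" vs "low risk."

PADUA_WEIGHTS = {
    "active_cancer": 3,
    "previous_vte": 3,
    "reduced_mobility": 3,
    "known_thrombophilia": 3,
    "recent_trauma_or_surgery": 2,   # within the past month
    "age_70_or_older": 1,
    "heart_or_respiratory_failure": 1,
    "acute_mi_or_ischemic_stroke": 1,
    "acute_infection_or_rheumatologic_disorder": 1,
    "obesity_bmi_30_or_higher": 1,
    "ongoing_hormonal_treatment": 1,
}

def padua_score(risk_factors: set) -> int:
    """Sum the points for the risk factors present in a given patient."""
    return sum(PADUA_WEIGHTS[factor] for factor in risk_factors)

def padua_classification(risk_factors: set) -> str:
    """Dichotomize, as the validation study did, into 'at risk' vs 'low risk'."""
    return "at risk" if padua_score(risk_factors) >= 4 else "low risk"

# Example: an immobile 78-year-old admitted with pneumonia scores 3 + 1 + 1 = 5.
example = {"reduced_mobility", "age_70_or_older",
           "acute_infection_or_rheumatologic_disorder"}
print(padua_classification(example))  # "at risk"
```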

This study was limited by the retrospective design, subjectivity of some risk factors (such as immobility), and inability to obtain 90-day telephone follow-up in all patients. Lastly, the binary approach (“at risk” versus “low risk”) may not align with the original derivation studies in which each factor was evaluated independently.

Bottom Line: The incidence of VTE is low in medical inpatients, and current RAMs may not accurately identify at-risk patients.

Citation: Greene MT, Spyropoulos AC, Chopra V, et al. Validation of risk assessment models of venous thromboembolism in hospitalized medical patients. Am J Med. 2016;129(9):1001.e9-1001.e18. doi:10.1016/j.amjmed.2016.03.031.


Acute HIV Causes Transient Neurologic Findings

Clinical Question: How common are neurologic findings in acute HIV infection?

Background: The incidence of neurologic findings with acute HIV is unknown.

Study Design: Cohort study.

Setting: Bangkok, Thailand.

Synopsis: In this study, 134 patients with acute HIV infection were identified after presenting for voluntary HIV testing, and 5 others were enrolled through an ongoing local study. All 139 participants underwent structured neurologic evaluations at enrollment (a median of 19 days after presumed exposure) and again at 4 and 12 weeks. Combination antiretroviral therapy (cART) was initiated immediately after the initial evaluation.

The cohort was 93% male, with a mean age under 30 years. Fifty-three percent of participants had at least one neurologic finding within 12 weeks of diagnosis. One-third of the findings (33%) were cognitive, predominantly problems with concentration (24% of patients) and memory (16% of patients); another third (34%) were motor findings, and 11% were neuropathies. Forty-nine percent of the neurologic findings were present at diagnosis. Symptoms were mostly mild, although one patient developed fulminant Guillain-Barré syndrome. Patients with neurologic findings had higher plasma viral loads at diagnosis (mean log10 HIV RNA, 5.9 versus 5.4; P = 0.006). Participants with and without neurologic findings had similar cerebrospinal fluid viral loads (mean log10 HIV RNA, 3.7 versus 3.1; P = 0.14) and CD4 counts (339 versus 381 cells/mm3; P = 0.46). Neurologic findings resolved within one month of starting cART in 90% of patients. Study limitations include the lack of a control cohort and potential confounding from illicit drug use among participants.
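
Because the viral loads above are reported on a log10 scale, the absolute size of the difference can be easier to appreciate after conversion to copies/mL. The short worked calculation below uses only the two reported plasma means and is illustrative arithmetic, not additional study data.

# Convert the reported mean plasma log10 HIV RNA values to copies/mL.
# The two means are the only inputs; everything else is derived arithmetic.
with_findings = 10 ** 5.9       # about 794,000 copies/mL
without_findings = 10 ** 5.4    # about 251,000 copies/mL

# A 0.5 log10 difference corresponds to roughly a 3-fold higher viral load.
fold_difference = with_findings / without_findings
print(f"{with_findings:,.0f} vs {without_findings:,.0f} copies/mL "
      f"(~{fold_difference:.1f}-fold higher with neurologic findings)")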

Bottom Line: Acute HIV infection commonly causes mild neurologic problems, which remit with treatment.

Citation: Hellmuth J, Fletcher JL, Valcour V, et al. Neurologic signs and symptoms frequently manifest in acute HIV infection. Neurology. 2016;87(2):148-154.
