An update on the current standard for ultrasound education in fellowship
Point-of-care ultrasound (POCUS) is an essential part of ICU care. It has been demonstrated to improve patient safety and outcomes through procedural guidance (Brass P, et al. Cochrane Database Syst Rev. 2015 Jan 9;1:CD006962) and aid in accurate and timely diagnosis of cardiopulmonary failure (Lichtenstein DA, Mezière GA. Chest. 2008 Jul;134[1]:117-25). Due in part to increasing affordability and portability of ultrasound technologies, the use of POCUS has become seemingly ubiquitous and will continue to increase in coming years. According to expert groups representing 12 critical care societies worldwide, general critical care ultrasound and basic critical care echocardiography should be mandatory training for ICU physicians (Expert Round Table on Ultrasound in ICU. Intensive Care Med. 2011 Jul;37[7]:1077-83).
Currently, POCUS is not universally taught to pulmonary and critical care medicine (PCCM) fellows; and when training does exist, curricula are not standardized. This is in part due to the broadly worded requirements set forth by the ACGME for pulmonary disease and critical care medicine. The totality of ACGME common program requirements regarding ultrasound training is as follows: 1. “Fellows must demonstrate competence in procedural and technical skills, including ... use of ultrasound techniques to perform thoracentesis and place intravascular and intracavitary tubes and catheters”; and 2. “Fellows must demonstrate knowledge of imaging techniques commonly employed in the evaluation of patients with pulmonary disease or critical illness, including the use of ultrasound” (ACGME Program Requirements for Graduate Medical Education in Pulmonary Disease and Critical Care Medicine).
In comparison, recently updated ACGME common program requirements for ultrasound in emergency medicine and anesthesiology residencies are robust and detailed. Requirements for anesthesiology residency training include: “... competency in using surface ultrasound ... and transthoracic echocardiography to guide the performance of invasive procedures and to evaluate organ function and pathology ... understanding the principles of ultrasound, including the physics of ultrasound transmission, ultrasound transducer construction, and transducer selection for specific applications, to include being able to obtain images with an understanding of limitations and artifacts ... obtaining standard views of the heart and inferior vena cava with transthoracic echocardiography allowing the evaluation of myocardial function, estimation of central venous pressure, and gross pericardial/cardiac pathology (eg, large pericardial effusion) ... using transthoracic ultrasound for the detection of pneumothorax and pleural effusion ... using surface ultrasound to guide vascular access (both central and peripheral) ... describing techniques, views, and findings in standard language” (ACGME Program Requirements for Graduate Medical Education in Anesthesiology).
Herein lies a stark contrast in what is required of programs that train physicians to care for unstable and critically ill patients. Current requirements leave graduates of PCCM training programs vulnerable to completing ACGME milestones without being adequately prepared to evaluate patients in a modern ICU setting. Increasingly, hospital credentialing committees expect PCCM graduates to be suitably trained in ultrasound. Regrettably, under current PCCM fellowship training requirements, there is no assurance that such training has occurred or that it is standardized.
There is currently no national standard for competency assessment or requirements for credentialing in POCUS for critical care physicians. However, multiple national and international critical care societies, including CHEST, have published consensus statements and recommendations outlining the areas of competence expected in critical care ultrasound (Mayo PH, et al. Chest. 2009 Apr;135[4]:1050-60; Expert Round Table on Ultrasound in ICU. Intensive Care Med. 2011 Jul;37[7]:1077-83). The PCCM ACGME requirements should be updated to reflect such recommendations, thereby placing greater emphasis on ultrasound teaching requirements and standardized curricula. Regardless of the current ACGME program requirements, it is incumbent upon critical care training programs to provide competency-based education in this now “standard of care” technology.
Barriers to universal POCUS training exist. Fellowship programs may lack faculty trained and confident in ultrasound, as well as the time and funding to successfully develop and sustain an ultrasound curriculum (Eisen LA, et al. Crit Care Med. 2010;38[10]:1978-83; Patrawalla P, et al. J Intensive Care Med. 2019 Feb 12: [Epub ahead of print]).
Although access to adequate quality and quantity of ultrasound machines is less often a problem than in the past, many institutions lack archival and image review software that allows for quality assurance of image acquisition, and some still may not have a faculty member with expertise and ability to champion the cause.
In an attempt to mitigate local faculty gaps, national and regional solutions have been developed for ultrasonography education. CHEST has educated more than 1,400 learners in its Ultrasound Essentials course since 2013. Grassroots efforts have also led to the development of courses specifically designed to teach incoming PCCM fellows. Using a collaborative and cost-effective model, these regional programs pool faculty and experts in the field to train multiple fellowship programs simultaneously. The first of these was created over a decade ago in New York City (Patrawalla P, et al. J Intensive Care Med. 2019 Feb 12: [Epub ahead of print]).
Currently, there are at least four regional annual ultrasound courses directed at teaching PCCM fellows. These courses are typically held over multiple days and encompass the basics of critical care ultrasound, including vascular, thoracic, abdominal, cardiac, and procedural imaging. By our estimation, these four courses provide a basic ultrasonography education to approximately two-thirds of first-year pulmonary and critical care fellows in the United States. In addition to training fellows, these programs serve as a platform for the development of local faculty experts, so that training can continue at their home institutions.
Introductory courses are highly effective (Dinh VA, et al. Crit Care Res Pract. 2015 Aug 5:675041; Patrawalla P, et al. J Intensive Care Med. 2019 Feb 12: [Epub ahead of print]), but ongoing education, assessment, and quality assurance are required to achieve sustained competence. Ideally, training in POCUS should entail a dedicated, intensive introduction to the competencies of critical care ultrasound (such as the regional courses or CHEST ultrasound courses noted above), followed by a formal curriculum within the PCCM fellowship program. This curriculum should afford the trainee exposure to critically ill patients in an environment with adequate ultrasound equipment and a method to record studies. The trainee then interprets the acquired studies in clinical context. Preferably, the program will afford the trainee real-time quality assurance of image acquisition and interpretation by a program champion. Quality assurance can be provided on site or remotely using fixed-interval review sessions. Lastly, the program should have internal milestones to evaluate when a trainee has reached competency to perform these tasks independently. The completion of training should include a letter to any future employer attesting to the trainee’s acquisition of these skills and ability to apply them safely while caring for the critically ill. This robust education is occurring in many centers across the country. PCCM fellowship programs owe it to their trainees, and patients, to ensure that competency-based critical care ultrasound training is robust, standardized, and supported.
Nutrition support during adult critical illness
Many critically ill patients you care for cannot maintain volitional oral intake. Therefore, nutrition support, through enteral or parenteral routes, remains a cornerstone in ensuring our critically ill patients receive substrates like glucose and protein. To understand the supportive role of nutrition during critical illness, let’s identify and contextualize the different phases of critical illness.
Phases of critical illness
The European Society for Parenteral and Enteral Nutrition’s (ESPEN) 2018 critical care nutrition guideline incorporates stages of critical illness in making nutrition recommendations (Singer P et al. Clin Nutr. 2019;38:48-79). The first week of critical illness is the acute phase, marked by catabolism and metabolic and hemodynamic instability. The late phase follows and is marked either by rehabilitation and anabolism or by chronic critical illness. The acute phase is further divided into an early acute phase (days 1-2) and a late acute phase (days 3-7). These time-points are arbitrary and merely serve as placeholders: no objective marker exists to distinguish the phases, and transition periods will differ for each patient.
Acute phase
Conditions that define critical illness, such as circulatory shock, respiratory failure, and trauma, are stressors that lead to two key acute-phase perturbations that nutrition may have a role in altering:
The first is hypercatabolism. These conditions activate neuroendocrine, inflammatory/immune, adipokine, and GI tract hormone pathways that increase serum glucagon, cortisol, and catecholamines, thereby promoting glycogenolysis, gluconeogenesis, insulin resistance, protein catabolism, and restricted/impaired anabolism.
The second is gut dysfunction. In health, cross-talk signaling between commensal bacteria, the epithelium, and the immune system maintains gut barrier function, for example by promoting tight-junction protein production. Acute critical illness pathophysiology loosens epithelial tight junctions, and the gut barrier is breached, creating an opportunity for downstream migration of pancreatic enzymes and cytokines. Furthermore, the microbiome morphs into a virulent pathobiome, which induces gut-derived inflammation.
When, where, and how much should we feed critically ill patients?
Since the acute phase of critical illness begins a series of events leading to negative energy balance and gut dysfunction, you might find early nutrition provision intuitive. Indeed, the 2016 ASPEN/SCCM and 2018 ESPEN critical care nutrition guidelines recommend early (within 24-48 hours of ICU admission) enteral nutrition (EN), delivered into the stomach, for all critically ill patients unable to maintain volitional intake. Meta-analyses of randomized controlled trials (RCTs) conducted between 1979 and 2013 show early EN reduces both mortality and infectious complications, compared with no early nutrition (McClave SA et al. JPEN. 2016;40:159-211).
RCT level data do not show superiority of EN over parenteral nutrition (PN). Nonetheless, early EN is recommended over PN because it maintains epithelial barrier function and supports immunity.
What is the optimal nutrition dose? The 2016 ASPEN/SCCM guideline recommends reaching >80% of the estimated energy goal within 48-72 hours in patients with high nutrition risk, while the 2018 ESPEN guideline suggests hypocaloric nutrition (not exceeding 70% of the prescribed energy goal) during the early acute phase. The latter recommendation is based on meta-analyses of RCTs conducted between 2011 and 2017, which show no mortality difference between hypocaloric and isocaloric nutrition.
Biologically plausible rationale for starting hypocaloric, as opposed to full-dose, nutrition during the acute phase of critical illness includes: (a) the acute phase represents a period of hemodynamic instability and mitochondrial dysfunction, and full-dose EN may lead to feeding intolerance and impaired substrate utilization, respectively; (b) in those with risk factors (like pre-existing malnutrition), starting full-dose nutrition may precipitate refeeding syndrome; and (c) endogenous glucose production occurs during the acute phase, and full-dose nutrition may worsen hyperglycemia.
Therefore, during the early acute phase of critical illness, hypocaloric feeding using an isosmotic formula, with a slow up-titration to goal rate thereafter, while monitoring for feeding intolerance and refeeding syndrome is a reasonable starting point.
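To make these targets concrete, the following is a minimal sketch of early acute-phase energy dosing. It assumes a simple weight-based energy estimate of 25 kcal/kg/day as a stand-in for indirect calorimetry; the estimating factor and the example patient weight are illustrative assumptions, not values from the cited guidelines.

```python
def early_acute_phase_energy_target(weight_kg: float,
                                    kcal_per_kg: float = 25.0,
                                    hypocaloric_fraction: float = 0.70) -> dict:
    """Illustrative energy targets for the early acute phase of critical illness.

    Assumes a weight-based energy estimate (kcal_per_kg, hypothetical here)
    when indirect calorimetry is unavailable, and caps early intake at 70%
    of the prescribed goal, per the 2018 ESPEN suggestion discussed above.
    """
    goal_kcal = weight_kg * kcal_per_kg                # full prescribed energy goal
    early_cap_kcal = goal_kcal * hypocaloric_fraction  # hypocaloric early target
    return {"goal_kcal_per_day": round(goal_kcal),
            "early_acute_cap_kcal_per_day": round(early_cap_kcal)}

# Example: a 70-kg patient -> 1750 kcal/day goal, early cap of ~1225 kcal/day
print(early_acute_phase_energy_target(70))
```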
What is the role of parenteral nutrition in critical illness?
PN can be exclusive or supplemental (in a patient already receiving EN). Historically, providers may have been reluctant to utilize PN for fear of infectious morbidity; however, contemporary pragmatic-design RCTs demonstrate safety with exclusive PN (Harvey SE et al. N Engl J Med. 2014;371:1673-84). When your patient has a contraindication to EN or does not tolerate it despite a trial of small bowel feeding, meta-analyses have shown a mortality benefit of early exclusive PN in malnourished patients, as compared with no nutrition (Braunschweig C et al. Am J Clin Nutr. 2001;74:534-42).
As for supplemental PN (SPN), the 2016 ASPEN/SCCM guideline recommends withholding it until day 7 in all critically ill patients, while the 2018 ESPEN guideline recommends its use on a case-by-case basis. Since then, two trials have informed SPN use. The EAT-ICU trial showed no difference in 6-month physical function between the EN group and the early goal-directed nutrition group, which included SPN to achieve the estimated energy requirement during the first week of critical illness (Allingstrup MJ et al. Intensive Care Med. 2017;43:1637-47). The TOP-UP trial compared EN alone with EN plus SPN in nutritionally high-risk patients (ie, those who stand to have more complications as a result of undernutrition) and found those with a BMI < 25 kg/m2 and those with a NUTRIC score >5 who received SPN atop EN had improved 30-day mortality, as compared with EN alone (Wischmeyer P et al. Crit Care. 2017;21:142). Mortality was a secondary outcome, however, and further study of SPN in nutritionally high-risk patients is warranted. Until further data are available, SPN should probably be restricted during the acute phase of critical illness.
Protein may be the most important substrate
Proteolysis is the rule during critical illness, and amino acids are liberated from skeletal muscle breakdown. Using ultrasound, Puthucheary et al found a 17.7% reduction in rectus femoris cross-sectional area in 63 critically ill adults and identified muscle cellular infiltration at ICU day 10, suggesting critical illness leads to both quantitative and qualitative muscle defects (Puthucheary Z et al. JAMA. 2013;310:1591-1600).
Since survivorship from critical illness is increasing, acquired loss of muscle mass may contribute to post-ICU physical functioning impairments. Thus, protein may be the most important substrate to deliver during critical illness. The 2016 ASPEN/SCCM guideline recommends 1.2-2.0 g/kg actual body weight (ABW)/day in nonobese critically ill patients.
Unfortunately, the optimal protein dose and the timing of intake are unknown. Observational studies suggest benefit with both lower and higher doses, which creates equipoise for protein dose. The signal may be lost in heterogeneity, and observational data suggest a higher protein dose may benefit patients with high nutritional risk. In terms of timing, one observational study found that a lower (<0.8 g/kg/d) protein dose before day 3 followed by a higher (>0.8 g/kg/d) dose thereafter was associated with a mortality benefit (Koekkoek WAC et al. Clin Nutr. 2019;38:883-890).
Until stronger data are available to guide optimal protein dose and timing, it is reasonable to observe the 2016 ASPEN/SCCM guideline protein recommendation of at least 1.2 g/kg/day. The 2018 ESPEN guideline recommends a similar dose of 1.3 g/kg/day.
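As a worked example of these ranges, here is a minimal sketch that computes a daily protein target from actual body weight; the range is the 2016 ASPEN/SCCM recommendation quoted above, and the 80-kg example patient is hypothetical.

```python
def daily_protein_target(actual_body_weight_kg: float,
                         g_per_kg_low: float = 1.2,
                         g_per_kg_high: float = 2.0) -> tuple:
    """Daily protein target range (grams/day) for a nonobese critically ill
    adult, per the 2016 ASPEN/SCCM recommendation of 1.2-2.0 g/kg ABW/day."""
    return (actual_body_weight_kg * g_per_kg_low,
            actual_body_weight_kg * g_per_kg_high)

# Example: an 80-kg patient -> 96-160 g protein/day (start at >= 96 g/day)
low, high = daily_protein_target(80)
print(f"{low:.0f}-{high:.0f} g/day")
```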
Future research and summary
Many questions remain unanswered and present opportunities for future research. Priorities for critical care nutrition research include studying the impact of combined nutrition and exercise in the acute and late phases of critical illness and identifying best tools to differentiate responses to caloric and protein intake.
In summary, critical illness has acute and late phases. The acute phase is a hypercatabolic state leading to negative energy and nitrogen balance and gut dysfunction. Take-home points for nutrition support in the acute phase of critical illness are:
1. It is reasonable to start early hypocaloric EN with an isosmotic formula with slow up-titration over the first week of critical illness while monitoring for refeeding syndrome and feeding intolerance.
2. Use exclusive PN in ICU patients with pre-existing malnutrition when EN is contraindicated or not tolerated.
3. Supplemental PN should probably be restricted during the acute phase of critical illness.
4. Optimal protein dose and timing are unknown. It is reasonable to start with at least 1.2 g/kg ABW/day in non-obese patients.
Dr. Patel is with the Department of Medicine, Division of Pulmonary, Critical Care, and Sleep Medicine, Medical College of Wisconsin, Milwaukee, Wisconsin.
Dr. Rice is with the Department of Medicine, Division of Pulmonary, Critical Care, and Sleep Medicine, Vanderbilt University, Nashville, Tennessee.
CPAP vs noninvasive ventilation for obesity hypoventilation syndrome
The conventional approach to treating hypoventilation has been noninvasive ventilation (NIV). Continuous positive airway pressure (CPAP), by contrast, does not augment alveolar ventilation but improves gas exchange by maintaining upper airway patency and increasing functional residual capacity.
To understand this rationale, it is important to first review the pathophysiology of obesity hypoventilation syndrome (OHS). The hallmark of OHS is a resting daytime awake arterial PaCO2 of 45 mm Hg or greater in an obese patient (BMI > 30 kg/m2) in the absence of any other identifiable cause. To recognize why some but not all obese subjects develop OHS, it is important to understand the different components of pathophysiology that contribute to hypoventilation: (1) obesity-related reduction in functional residual capacity (FRC) and lung compliance, with a resultant increase in work of breathing; (2) central hypoventilation related to leptin resistance and reduction in respiratory drive, with REM hypoventilation; and (3) upper airway obstruction caused by upper airway fat deposition, along with low FRC contributing to pharyngeal airway narrowing and increased airway collapsibility (Masa JF, et al. Eur Respir Rev. 2019;28:180097).
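The defining threshold above can be condensed into a short sketch. This is illustrative only: the boolean flag for excluding other causes stands in for the clinical judgment the definition actually requires, and the example values are hypothetical.

```python
def meets_ohs_definition(bmi_kg_m2: float,
                         awake_paco2_mm_hg: float,
                         other_cause_of_hypoventilation: bool) -> bool:
    """OHS hallmark: awake resting PaCO2 >= 45 mm Hg in an obese patient
    (BMI > 30 kg/m2) with no other identifiable cause of hypoventilation."""
    return (bmi_kg_m2 > 30.0
            and awake_paco2_mm_hg >= 45.0
            and not other_cause_of_hypoventilation)

# Example: BMI 38, awake PaCO2 52 mm Hg, no alternative explanation -> True
print(meets_ohs_definition(38.0, 52.0, other_cause_of_hypoventilation=False))
```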
CPAP vs NIV for OHS
Let us examine some of the studies that have compared the short-term efficacy of CPAP vs NIV in patients with OHS. In a small randomized controlled trial (RCT), the effectiveness of CPAP and NIV was compared in 36 patients with OHS (Piper AJ, et al. Thorax. 2008;63:395). Reduction in PaCO2 at 3 months was similar between the two groups; however, patients with persistent nocturnal desaturation despite optimal CPAP were excluded from the study. In another RCT of 60 patients with OHS, enrolled either in stable condition or after an episode of acute-on-chronic hypercapnic respiratory failure, CPAP and NIV showed similar improvements at 3 months in daytime PaCO2, quality of life, and sleep parameters (Howard ME, et al. Thorax. 2017;72:437).
In one of the largest RCTs, the Spanish Pickwick study randomized 221 patients with OHS and an AHI >30/h to NIV, CPAP, or lifestyle modification (Masa JF, et al. Am J Respir Crit Care Med. 2015;192:86). PAP therapy consisted of either NIV, with in-lab titration of bilevel PAP targeted to a tidal volume of 5-6 mL/kg of actual body weight, or CPAP. Lifestyle modification served as the control group. The primary outcome was the change in PaCO2 at 2 months. Secondary outcomes were symptoms, health-related quality of life (HRQOL), polysomnographic parameters, spirometry, and 6-minute walk distance (6MWD). Mean AHI was 69/h, and mean PAP settings for NIV and CPAP were 20/7.7 cm H2O and 11 cm H2O, respectively. NIV provided the greatest improvement in PaCO2 and serum HCO3 as compared with the control group, but not relative to the CPAP group. CPAP improved PaCO2 as compared with the control group only after adjustment for PAP use. Spirometry, 6MWD, and some HRQOL measures improved slightly more with NIV than with CPAP. Improvement in symptoms and polysomnographic parameters was similar between the two groups.
In another study by the same group (Masa JF, et al. Thorax. 2016;71:899), 86 patients with OHS and mild OSA (AHI <30/h) were randomized to NIV or lifestyle modification. Mean AHI was 14/h, and mean baseline PaCO2 was 49 ± 4 mm Hg. The NIV group, with a mean PAP adherence of 6 hours, showed greater improvement in PaCO2 as compared with lifestyle modification (6 mm Hg vs 2.8 mm Hg). The authors concluded that NIV was better than lifestyle modification in patients with OHS and mild OSA.
To determine the long-term clinical effectiveness of CPAP vs NIV, patients in the Pickwick study who were initially assigned to either the CPAP or NIV treatment group were continued on their respective treatments, while subjects in the control group were re-randomized at 2 months to either CPAP or NIV (Masa JF, et al. Lancet. 2019;393:1721). All subjects (CPAP n=107; NIV n=97) were followed for a minimum of 3 years. CPAP and NIV settings (pressure targeted to a desired tidal volume) were determined by in-lab titration without transcutaneous CO2 monitoring, with daytime adjustment of PAP to improve oxygen saturation. The primary outcome was the number of hospitalization days per year. Mean CPAP pressure was 10.7 cm H2O, and mean NIV pressures were 19.7/8.18 cm H2O with an average respiratory rate of 14/min. Median PAP use and adherence >4 h were similar between the two groups (CPAP 6.0 h, adherence >4 h in 67% vs NIV 6.0 h, adherence >4 h in 61%). Median duration of follow-up was 5.44 years (IQR 4.45-6.37 years) for both groups. Mean hospitalization days per patient-year were similar between the two groups (CPAP 1.63 vs NIV 1.44 days; adjusted RR 0.78, 95% CI 0.34-1.77; p=0.561). Overall mortality, adverse cardiovascular events, and arterial blood gas parameters were similar between the two groups, suggesting equal efficacy of CPAP and NIV in this group of stable patients with OHS and an AHI >30/h. Given the lower complexity and cost of CPAP vs NIV, the authors concluded that CPAP may be the preferred PAP treatment modality until more studies are available.
An accompanying editorial (Murphy PB, et al. Lancet. 2019;393:1674) noted that because this study was powered for superiority of NIV (a 20% reduction in hospitalization compared with CPAP) rather than for noninferiority, superiority could not be shown given the low event rate for hospitalization (NIV 1.44 vs CPAP 1.63 days). It is also possible that optimal NIV titration was not achieved, since transcutaneous CO2 monitoring was not used. Furthermore, because the study enrolled only patients with OHS and AHI >30/h, the results may not be applicable to patients with OHS and AHI <30/h, who are more likely to have central hypoventilation and comorbidities and who may benefit from NIV compared with CPAP.
Novel modes of bi-level PAP therapy
There are limited data on the use of the newer bi-level PAP modalities, such as volume-targeted pressure support (PS) ventilation with fixed or auto-EPAP. The use of intelligent volume-assured pressure support ventilation (iVAPS) vs standard fixed pressure support ventilation in select patients with OHS (n=18) showed equivalent control of chronic respiratory failure with no worsening of sleep quality and better PAP adherence (Kelly JL, et al. Respirology. 2014;19:596). In another small randomized, double-blind, crossover study, conducted on two consecutive nights in 11 patients with OHS, auto-adjusting EPAP was noninferior to fixed EPAP (10.8 vs 11.8 cm H2O), with no difference in sleep quality or patient preference (McArdle N. Sleep. 2017;40:1). Although the data are limited, these small studies suggest that newer PAP modalities, such as variable PS delivering a target volume combined with auto-EPAP, could offer the potential to initiate bi-level PAP therapy in outpatients without in-lab titration, as sketched below. More studies are needed before bi-level PAP therapy can be safely initiated in outpatients with OHS.
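Conceptually, volume-assured pressure support is a feedback loop: the device compares delivered tidal volume against a target and nudges the pressure support within preset limits. The deliberately simplified sketch below illustrates that idea only; the parameter names, step size, and limits are our own assumptions, not any vendor's algorithm:

```python
def adjust_pressure_support(ps_cmh2o: float,
                            measured_vt_ml: float,
                            target_vt_ml: float,
                            ps_min: float = 4.0,
                            ps_max: float = 20.0,
                            step: float = 0.5) -> float:
    """One iteration of a volume-targeted pressure support loop:
    raise PS when tidal volume falls short of the target, lower it
    when volume overshoots, always staying within the preset window."""
    if measured_vt_ml < target_vt_ml:
        ps_cmh2o += step
    elif measured_vt_ml > target_vt_ml:
        ps_cmh2o -= step
    return max(ps_min, min(ps_max, ps_cmh2o))
```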
Summary
In summary, how can we deliver the most effective PAP therapy for patients with OHS? Can we use a phenotype-dependent approach to PAP treatment options? The answer is probably yes. The recently published ATS Clinical Practice Guideline (Am J Respir Crit Care Med. 2019;200:e6-e24) suggests PAP therapy for stable ambulatory patients with OHS rather than no PAP therapy; patients with OHS and AHI >30/h (approximately 70% of patients with OHS) can initially be started on CPAP instead of NIV, and those with persistent nocturnal desaturation despite optimal CPAP can be switched to NIV. Data are limited on the use of CPAP in patients with OHS and AHI <30/h, and these patients can be started on NIV (see the sketch below). PAP adherence of more than 5-6 h per night and weight loss using a multidisciplinary approach should be encouraged for all patients with OHS.
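That phenotype-driven triage can be captured in a few lines. The sketch below is our own restatement of the guideline logic summarized above, not an excerpt from the ATS document:

```python
def initial_pap_choice(ahi_per_h: float,
                       persistent_desaturation_on_cpap: bool = False) -> str:
    """Phenotype-based starting PAP modality for stable ambulatory OHS,
    restating the triage logic discussed in the text."""
    if ahi_per_h >= 30:
        # Most patients with OHS: start with CPAP, escalate if needed.
        if persistent_desaturation_on_cpap:
            return "switch to NIV (bilevel PAP)"
        return "start CPAP"
    # AHI <30/h: central hypoventilation predominates; limited data favor NIV.
    return "start NIV (bilevel PAP)"
```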
Dr. Dewan is Professor and Program Director, Sleep Medicine; Division of Pulmonary, Critical Care and Sleep Medicine; Chief, Pulmonary Section VA Medical Center; Creighton University, Omaha, Nebraska.
The emerging role of quantitative CT scans in ILD
The role of imaging in interstitial lung disease (ILD) is of paramount importance. With the growth of high-resolution chest computed tomography (HRCT) imaging techniques, we are able to visualize nuances between individual ILDs more critically. HRCT is an essential component of an initial ILD evaluation and has also become part of the armamentarium of tools used for routine management of these patients. HRCT technology has evolved over the years, most recently with the advent of quantitative HRCT (qCT), which employs texture-based classification to identify and quantify different radiographic findings. qCT scanning has slowly been emerging as a new player in the ILD world. What exactly is qCT, and what role can, and will, it serve for our patients with ILD?
Quantitative CT scanning was introduced in the 1980s, but only within the last 15 years has its use for ILD taken form. Human interpretation of CTs is fraught with subjectivity, based on the interpreting radiologist’s training, experience, and individual visual perception of images. This can result in significant variability in radiographic interpretations and, ultimately, affects a patient’s diagnosis, disease monitoring, treatment, and prognosis. Semiquantitative visual scoring by radiologists is highly variable, especially in areas with limited availability of chest radiologists. qCT employs an automated histogram signature technique that utilizes density- and texture-based analysis of the lung parenchyma. Using machine learning on pathologically confirmed datasets, computer programs were trained, with input from specialized thoracic radiologists, to classify commonly found radiographic abnormalities into four major groups: ground glass, reticular, honeycombing, and emphysema. These categories are then quantified and spatially depicted in the analysis output (Bartholmai, et al. J Thorac Imaging. 2013;28[5]:298). Various computer programs have been built to streamline the process and expedite the interpretation of an individual’s HRCT scan. The most familiar program, CALIPER (Computer-Aided Lung Informatics for Pathology Evaluation and Ratings), has been used in multiple research studies of qCT in ILD and IPF. Each patient’s CT scan is uploaded to the program, and a breakdown of the patient’s lungs into each category is presented. Not only is each abnormality quantified and precisely defined, it is also color-coded by segment to help with visual interpretation by the physician.
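To make the idea concrete, the toy sketch below bins lung voxels by simple Hounsfield-unit density. This is only a minimal illustration, not CALIPER's method: the bin boundaries are rough assumptions for demonstration, and density alone cannot separate honeycombing from other reticular change, which is precisely why trained texture classifiers are needed.

```python
import numpy as np

# Illustrative HU bins only; real systems such as CALIPER use trained
# texture-based classifiers, not simple density thresholds.
HU_BINS = {
    "emphysema":           (-1024, -950),
    "normally_aerated":    (-950, -700),
    "ground_glass":        (-700, -500),
    "reticular_or_denser": (-500, 100),
}

def quantify_lung_hu(lung_voxels_hu: np.ndarray) -> dict[str, float]:
    """Return the percentage of segmented lung voxels falling in each
    HU bin, a toy stand-in for texture-based qCT quantification."""
    total = lung_voxels_hu.size
    return {
        name: 100.0 * np.count_nonzero(
            (lung_voxels_hu >= lo) & (lung_voxels_hu < hi)) / total
        for name, (lo, hi) in HU_BINS.items()
    }
```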
The benefit of qCT lies not only in the automated, objective evaluation of interstitial lung disease, but also in its possible use in prognostication and mortality prediction. Neither use has been fully validated yet, but growing evidence shows a promising role in both realms. Thus far, there have been some studies correlating PFT data with qCT findings. A follow-up study of the Scleroderma Lung Study II examined qCT changes over 24 months and correlated those findings with PFTs and patient-reported outcomes. Patients in this study were treated with either cyclophosphamide (CYC) for 1 year followed by placebo for 1 year, or mycophenolate mofetil (MMF) for 2 years. A large portion of patients receiving CYC or MMF showed a significant correlation between improved or stable qCT scores and their FVC and TLC. Neither CYC nor MMF was superior on qCT scores, aligning with the parent study’s finding of noninferiority of MMF compared with CYC (Goldin, et al. Ann Am Thorac Soc. 2018 Nov;15[11]:1286). Interestingly, improvement in ground glass is often viewed by physicians as positive, since this finding is typically thought to represent active inflammation. However, if qCT determines that the fibrosis score actually increases over time despite an improvement in ground glass, this may more accurately reflect the development of subtle fibrosis that is not easily appreciated by the human eye (Goldin, et al. Ann Am Thorac Soc. 2018 Nov;15[11]:1286). In this context, it is feasible that parenchymal changes occur before deterioration on PFTs. Diffusing capacity for carbon monoxide (DLCO) correlates largely with the extent of lung involvement on qCT, but DLCO is not a specific biomarker for predicting severity of ILD (eg, pulmonary hypertension or anemia can confound DLCO). Forced vital capacity (FVC) may also confound CT correlation in certain diseases (eg, muscle weakness or extrathoracic restriction from skin disease in systemic sclerosis). PFT data as a clinical endpoint in research studies may eventually be replaced by qCT’s more consistent and precise detection of disease modification.
IPF has been an interesting area of exploration for the role of qCT in disease monitoring and possible prognostication. It is known that the presence of honeycombing on HRCT is associated with increased mortality, and patients with a progressive fibrotic ILD have mortality rates similar to those with IPF (Adegunsoye, et al. Ann Am Thorac Soc. 2019 May;16[5]:580). The ability to correlate radiographic findings with mortality could potentially become an important marker of clinical deterioration, especially in patients who are unable to perform PFTs. It can also be beneficial in those with coexistent emphysema, since PFTs may be confounded by this overlap. Nakagawa and colleagues proposed a computer-aided method for qCT analysis of honeycombing in patients with IPF; the algorithm includes specific parameters to exclude emphysematous lesions on imaging. The percent honeycomb area (%HA) was correlated with a composite physiologic index (CPI) derived from PFTs (calculated from FEV1, FVC, and DLCO). This tool can accurately quantify the percentage of honeycombing and aid in monitoring IPF. Using this protocol, Nakagawa and colleagues demonstrated a significant correlation with 3-year mortality, with a marked difference found when using a cutoff value of 4.8% (Nakagawa, et al. PLoS One. 2019 Mar;14[3]:e0214278). Furthermore, patient survival in IPF has been compared against the CALIPER program and PFTs. Mortality was significantly associated with pulmonary vessel volume (PVV), an innovative measure quantifying the volume of the pulmonary arteries and veins, which may become a new parameter for disease monitoring. Using qCT in addition to PFTs provides more tangible evidence to help monitor patients with IPF, guide treatment decisions, and plan for transplant or palliative care. The role of PVV in qCT has yet to be fully elucidated, but it appears promising (Jacob, et al. Eur Respir J. 2017;49[1]. doi: 10.1183/13993003.01011-2016).
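For reference, the composite physiologic index mentioned above is, in its commonly cited form (Wells AU, et al. Am J Respir Crit Care Med. 2003;167:962), computed from percent-predicted PFT values, with higher values indicating more extensive fibrotic disease:

```latex
\mathrm{CPI} = 91.0
  \;-\; 0.65 \times (\%\ \mathrm{predicted\ DLCO})
  \;-\; 0.53 \times (\%\ \mathrm{predicted\ FVC})
  \;+\; 0.34 \times (\%\ \mathrm{predicted\ FEV_1})
```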
Despite the positive outlook for qCT, major issues limit its widespread use. During image acquisition, there is a lack of consistency and quality control, stemming from the many different CT scanner manufacturers, reconstruction methods, radiation doses, and differences in image noise and patients’ inspiratory efforts. The Radiological Society of North America (RSNA) is attempting to address this issue by creating a standardized protocol for collecting images used for qCT (Castillo-Saldana, et al. J Thorac Imaging. 2019 Aug 7. doi: 10.1097/RTI.0000000000000440). In order to move forward with adoption of qCT, a standardized approach to acquiring and handling images needs to be established.
Quantitative CT is an exciting new prospect for the care of patients with ILD. As these patients, and their management, become more complex, expanding the physician’s toolbox is much needed. It will be fascinating to see how the role of qCT takes shape over the coming years.
Dr. D’Annunzio is with Westmed Medical Group, Rye, N.Y.; Dr. Nayar is a Pulmonary/Critical Care Fellow at NYU School of Medicine; and Dr. Patel is with Columbia University Medical Center.
Should PEEP be titrated based on esophageal pressures?
Application of basic physiologic principles at the bedside has changed the approach to the treatment of patients with acute respiratory distress syndrome (ARDS) and refractory hypoxemia. The current standard of care for patients with ARDS includes a low tidal volume ventilation strategy (6 mL/kg of ideal body weight), keeping plateau pressures below 30 cm H2O (Brower RG, et al. N Engl J Med. 2000;342[18]:1301), keeping driving pressures below 15 cm H2O, and applying adequate positive end-expiratory pressure (PEEP) to keep the alveoli open without overdistension (Villar J, et al. Crit Care Med. 2006;34[5]:1311). However, despite awareness of the importance of this intervention, there is currently no consensus regarding the best method to determine ideal PEEP at the individual patient level.
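As a concrete illustration of those targets, the arithmetic is straightforward. In the sketch below, the helper names are ours; the predicted body weight calculation is the standard ARDSNet (Devine) formula, and driving pressure is plateau pressure minus PEEP:

```python
def predicted_body_weight_kg(height_cm: float, male: bool) -> float:
    """ARDSNet predicted body weight (Devine formula, metric units)."""
    base = 50.0 if male else 45.5
    return base + 0.91 * (height_cm - 152.4)

def lung_protective_check(height_cm: float, male: bool,
                          tidal_volume_ml: float,
                          plateau_cmh2o: float, peep_cmh2o: float) -> dict:
    """Check a ventilator prescription against the targets in the text:
    ~6 mL/kg predicted body weight, plateau <30 cm H2O, and a driving
    pressure (plateau minus PEEP) below 15 cm H2O."""
    pbw = predicted_body_weight_kg(height_cm, male)
    driving_pressure = plateau_cmh2o - peep_cmh2o
    return {
        "vt_ml_per_kg_pbw": tidal_volume_ml / pbw,
        "plateau_below_30": plateau_cmh2o < 30,
        "driving_pressure_cmh2o": driving_pressure,
        "driving_pressure_below_15": driving_pressure < 15,
    }
```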
A thorough understanding of basic physiologic concepts regarding respiratory pressures is of paramount importance in formulating an opinion. The transpulmonary pressure (or lung-distending pressure) is the gradient created by the difference between alveolar pressure (PA) and pleural pressure (PPL). To prevent lung collapse at end-expiration, PA must remain higher than PPL such that the gradient remains outward, preventing end-expiratory collapse and atelectrauma. To accomplish that, it is necessary to know the end-expiratory PA and PPL. Esophageal balloon pressures (PES) represent central thoracic pressures and, despite positional and regional variations, are a good surrogate for the average “effective” PPL (Baedorf KE, et al. Med Klin Intensivmed Notfmed. 2018;113[Suppl 1]:13).
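Written out, the relationships just described are as follows. During a no-flow hold, airway pressure approximates alveolar pressure, so the esophageal measurement yields a clinically usable transpulmonary pressure:

```latex
P_L = P_A - P_{PL}
\qquad \text{and, at zero flow,} \qquad
P_L \approx P_{aw} - P_{ES}
```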
Understanding that PES represents a practical PPL makes it easier to appreciate the potential usefulness of an esophageal balloon to titrate PEEP. The objective of PEEP titration is to prevent de-recruitment, maintain alveolar aeration, and improve the functional size of aerated alveoli. If the applied PEEP is lower than the PPL, the dependent lung regions will collapse. On the other hand, if PEEP is much higher than the PPL, the lung will be overdistended, causing barotrauma and hemodynamic compromise.
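That objective translates into a simple bedside rule, sketched below in the spirit of the EPVent strategy of keeping end-expiratory transpulmonary pressure in a modestly positive (roughly 0-10 cm H2O) window (Talmor D, et al. N Engl J Med. 2008;359:2095). The step size, limits, and function names are our own illustrative assumptions, not the trial protocol:

```python
def suggest_peep_adjustment(peep_cmh2o: float,
                            pes_end_exp_cmh2o: float,
                            target_low: float = 0.0,
                            target_high: float = 10.0,
                            step: float = 2.0) -> float:
    """Single step of an esophageal-pressure-guided PEEP adjustment:
    keep end-expiratory transpulmonary pressure (PEEP minus PES) in a
    modestly positive window. Using total PEEP as the end-expiratory
    airway pressure is a simplifying assumption for illustration."""
    p_l_end_exp = peep_cmh2o - pes_end_exp_cmh2o
    if p_l_end_exp < target_low:    # outward gradient lost: alveolar collapse
        return peep_cmh2o + step
    if p_l_end_exp > target_high:   # gradient too large: overdistension risk
        return peep_cmh2o - step
    return peep_cmh2o
```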
The question is: Should we use esophageal balloons?
Yes, we should.
A single-center randomized controlled trial (EPVent) compared PEEP titration to achieve a positive end-expiratory transpulmonary pressure (PL) with standard-of-care lung-protective ventilation (Talmor D, et al. N Engl J Med. 2008;359:2095). The PEEP titration group used significantly higher levels of PEEP, with improved oxygenation and lung compliance. However, there was no significant difference in ventilator-free days or mortality between the groups.
Obese patients are also likely to benefit from PEEP titration guided by an esophageal balloon, as they often have higher levels of intrinsic PEEP. The application of higher levels of PEEP to compensate for intrinsic PEEP may help reduce the work of breathing and prevent tidal recruitment-derecruitment and atelectasis. Additionally, low to negative transpulmonary pressures, measured using the actual values of PES in obese patients and obese animal models, predicted lung collapse and tidal opening and closing (Fumagalli J, et al. Crit Care Med. 2017;45[8]:1374). It is useful to remember that the lung and chest wall are mechanically in series, so their elastances (the reciprocals of compliance) add: 1/Crs = 1/CL + 1/Ccw, where Crs is the compliance of the respiratory system, CL the lung compliance, and Ccw the chest wall compliance. In obese patients, the stiff chest wall contributes much more to overall respiratory system mechanics, whereas the clinician is really interested in CL. At the bedside, esophageal manometry can be very useful to distinguish the contributions of CL and Ccw to the total Crs, as shown in the sketch below.
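A minimal sketch of that bedside partitioning, assuming an end-inspiratory hold with measured tidal volume, plateau pressure, total PEEP, and end-inspiratory/end-expiratory esophageal pressures (variable names are ours; the relations are the standard series-mechanics equations):

```python
def partition_compliance(vt_ml: float,
                         plateau: float, total_peep: float,
                         pes_end_insp: float, pes_end_exp: float) -> dict:
    """Partition respiratory system compliance into lung and chest wall
    components (pressures in cm H2O, volume in mL)."""
    d_paw = plateau - total_peep        # respiratory system pressure swing
    d_pes = pes_end_insp - pes_end_exp  # chest wall (pleural) pressure swing
    d_pl = d_paw - d_pes                # transpulmonary pressure swing
    return {
        "Crs_ml_per_cmH2O": vt_ml / d_paw,
        "Ccw_ml_per_cmH2O": vt_ml / d_pes,
        "CL_ml_per_cmH2O": vt_ml / d_pl,
    }

# Example: VT 420 mL, plateau 25, PEEP 10, PES 9 -> 6 gives
# Crs = 28, Ccw = 140, CL = 35, and indeed 1/28 = 1/140 + 1/35.
result = partition_compliance(420, 25, 10, 9, 6)
```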
No, we shouldn’t.
A second randomized controlled trial (EPVent-2), by the same group, compared PEEP titration guided by esophageal pressure with empirical PEEP titration in patients with moderate to severe ARDS (Beitler JR, et al. JAMA. 2019;321[9]:846). The primary outcome, a ranked composite of death and mechanical ventilator-free days through day 28, did not differ between the groups.
Additionally, placement of an esophageal balloon is challenging and operator-dependent. The balloon portion of the esophageal catheter should be positioned in the lower third of the esophagus, behind the heart. Placement is typically performed by inserting the catheter into the stomach to a depth of about 60 cm, gently pressing on the abdomen, and observing a sudden increase in pressure on the ventilator screen. The catheter is then withdrawn to about 40 cm while looking for cardiac oscillations and a change in the pressure waveform (Talmor D, et al. N Engl J Med. 2008;359:2095). One can see how easy it would be to position the esophageal balloon incorrectly, and a misplaced balloon will not provide accurate PES values and can potentially cause harm.
Final answer: It depends on each individual patient.
Arguments for and against using an esophageal balloon to titrate PEEP in patients with ARDS and refractory hypoxemia are ongoing. Even the two most cited and applied trials on the matter (EPVent and EPVent-2) reported contradictory results. However, when analyzed in depth, both showed better oxygenation with the use of an esophageal balloon. EPVent had improvement in oxygenation as its primary endpoint, and the improvement was significant in the esophageal balloon group. EPVent-2 included oxygenation goals, in the form of the need for rescue therapies for refractory hypoxemia, as secondary endpoints; patients in its esophageal balloon group required prone positioning less frequently, used pulmonary vasodilators less often, and generated fewer ECMO consultations. Even though these trials did not show a mortality benefit, both showed an oxygenation benefit.
The ideal single tool that would indicate the “perfect” PEEP for each patient remains to be described. Until then, PEEP titration guided by the ARDSNet PEEP tables, while maintaining a plateau pressure below 30 cm H2O and considering a driving pressure below 15 cm H2O, should be the clinician’s goal. In patients at the extremes of height and body weight, and/or with conditions that increase intra-abdominal pressure, such as ascites, a well-placed esophageal balloon with the patient supine might be beneficial.
The truth of the matter is that PEEP should be titrated by a trained intensivist, in conjunction with the multidisciplinary ICU team, at the patient’s bedside, taking into consideration each individual’s unique physiologic and pathophysiologic characteristics at that moment.
Dr. Gallo de Moraes is Assistant Professor of Medicine, and Dr. Oeckler is Assistant Professor of Medicine, Division of Pulmonary and Critical Care, Mayo Clinic, Rochester, Minnesota.
Application of basic physiology principles at bedside has changed the approach to the treatment of patients with acute respiratory distress syndrome (ARDS) and refractory hypoxemia. Current standard of care for patients with ARDS includes a low tidal volume ventilation strategy (6 mL/kg of ideal body weight), keeping plateau pressures below 30 cm H2O (Brower RG, et al. N Engl J Med. 2000;342[18]:1301), driving pressures below 15 cm H2O and adequate positive end-expiratory pressures (PEEP) to keep the alveoli open without overdistension (Villar J, et al. Crit Care Med. 2006;34[5]:1311). However, at this time, despite the awareness of the importance of this intervention, there is no consensus regarding the best method to determine ideal PEEP at the individual patient level.
A thorough understanding of the basic physiologic concepts regarding respiratory pressures is of paramount importance to be able to formulate an opinion. The transpulmonary pressure (or lung distending pressure) is the gradient caused by the difference between alveolar (PA) and pleural pressure (PPL). In order to prevent lung collapse at end-expiration, PA must remain higher than PPL such that the gradient remains outward, preventing end-expiratory collapse and atelectotrauma. To accomplish that, it is necessary to know the end-expiratory PA and PPL. Esophageal balloon pressures (PES) represent central thoracic pressures, but, despite positional and regional variations, they are a good surrogate for average “effective” PPL (Baedorf KE, et al. Med Klin Intensivmed Notfmed. 2018;113[Suppl 1]:13).
Understanding that the value of the PES represents a practical PPL makes it easier to appreciate the potential usefulness of an esophageal balloon to titrate PEEP. The objective of PEEP titration is to prevent de-recruitment, maintain alveolar aeration, and improve the functional size of aerated alveoli. If the applied PEEP is lower than the PPL, the dependent lung regions will collapse. On the other hand, if PEEP is higher than the PPL, the lung would be overdistended, causing barotrauma and hemodynamic compromise.
The question is: Should we use esophageal balloons?Yes, we should.
A single center randomized control trial (EPVent) compared PEEP titration to achieve a positive PL vs standard of care lung protective ventilation (Talmor D, et al. N Engl J Med. 2008;359:2095). The PEEP titration group used significantly higher levels of PEEP, with improved oxygenation and lung compliance. However, there was no significant difference in ventilator-free days or mortality between the groups.
Obese patients are also likely to benefit from PEEP titration guided by an esophageal balloon, as they often have higher levels of intrinsic PEEP. Therefore, the application of higher levels of PEEP to compensate for the higher levels of intrinsic PEEP may help reduce work of breathing and prevent tidal recruitment-de-recruitment and atelectasis. Additionally, low to negative transpulmonary pressures measured using the actual values of PES in obese patients and obese animal models predicted lung collapse and tidal opening and closing (Fumagalli J, et al. Crit Care Med. 2017;45[8]:1374). It is useful to remember that the compliance of the respiratory system (Crs) is the total of the sum of the compliance of the chest wall (Ccw) and the lung compliance (CL). In obese patients, Ccw has a much more significant contribution to the total Crs, and the clinician should be really interested in the CL. At the bedside, esophageal manometry can be very useful to distinguish the contribution of CL and Ccw to the total Crs.
No, we shouldn’t.
Another randomized controlled trial (EPVent-2), by the same group, compared PEEP titration guided by esophageal pressure with empirical PEEP titration, in patients with moderate to severe ARDS (Beitler JR, et al. JAMA. 2019;321[9]:846). The primary outcomes of interest, death, and mechanical ventilator-free days through day 28 were not different between the groups.
Additionally, placement of an esophageal balloon is challenging and operator-dependent. The balloon portion of the esophageal catheter should be positioned in the lower third of the esophagus, behind the heart. Catheter placement is typically performed by inserting it into the stomach to a depth of about 60 cm, and gently pressing on the abdomen and observing a sudden increase in pressure on the ventilator screen. It is then withdrawn to about 40 cm, while looking for cardiac oscillations and pressure change (Talmor D, et al. N Engl J Med. 2008;359:2095). One can see how easily it would be to insert the esophageal balloon incorrectly. A misplaced balloon won’t provide accurate PES and can potentially cause harm.
Final answer: It depends on each individual patient.
Arguments for and against using an esophageal balloon to titrate PEEP in patients with ARDS and refractory hypoxemia are ongoing. Even the two most cited and applied trials on the matter (EPVent and EPVent-2) reported contradictory results. However, when analyzed in depth, both showed better oxygenation with the use of esophageal balloon. EPVent had improvement in oxygenation as its primary endpoint, and it was significant in the esophageal balloon group. EPVent-2 had oxygenation goals, in the form of need for rescue therapies for refractory hypoxemia, as secondary endpoints. Nonetheless, the patients in the esophageal balloon group in EPVent-2 required prone positioning less frequently, had lower use of pulmonary vasodilators, and a lower rate of ECMO consultations. Even though those trials did not show a mortality benefit, both showed an oxygenation benefit.
The ideal single tool that would indicate the “perfect “PEEP for each patient remains to be described. Until then, PEEP titration guided by a combination of ARDSnet PEEP tables, while maintaining a plateau pressure below 30 cm H2O and considering a driving pressure below 15 cm H2O should be a clinician’s goal. In patients in the extremes of height and body weight, and/or with conditions that would increase intra-abdominal pressure, such as ascites, a well-placed esophageal balloon while patient is supine might be beneficial.
The truth of the matter is, PEEP should be titrated by a trained intensivist in conjunction with the multidisciplinary ICU team, at patients’ bedside taking into consideration each individual’s unique physiologic and pathophysiologic characteristics at that moment.
Dr. Gallo de Moraes is Assistant Professor of Medicine, and Dr Oeckler is Assistant Professor of Medicine, Division of Pulmonary and Critical Care, Mayo Clinic, Rochester, Minnesota.
Noninvasive ventilation: Redefining insurance guidelines
Noninvasive ventilation (NIV) supports a patient’s breathing without the immediate need for tracheotomy or intubation. The Centers for Medicare & Medicaid Services (CMS) defines respiratory assist devices (RADs) as bi-level devices with back-up respiratory rate capability that provide noninvasive modes of ventilation for respiratory insufficiency or sleep-related respiratory disorders in a home or hospital setting (21 CFR 868.5895). These devices are smaller and can be fitted with an external battery if needed, but they are limited by their inability to offer a daytime ventilatory mode (ie, mouthpiece ventilation). RADs have been in the DMEPOS Competitive Bidding Program since 2011 (like PAP devices for sleep apnea syndromes), which imposes a 13-month capped rental: the patient receives the device, supplies, and services for 13 months, after which the patient owns the device and CMS pays for supplies separately (https://www.dmecompetitivebid.com/cbic/cbic.nsf/DocsCat/Home).
On the other hand, CMS defines home mechanical ventilators (HMVs) as life-supporting/sustaining devices for patients of all age groups used in various settings, including but not limited to home, hospital, institutional settings, and transportation, or wherever portability is needed. These ventilators offer greater portability through external and internal batteries, provision of mouthpiece ventilation, and at least six pressure modes and three volume modes. Currently, ventilators fall under the “frequent and substantial servicing” payment category [42 U.S.C. § 1395m(a)(3)]. Under this provision, the patient never owns the device; instead, the device, ancillary supplies, clinical support (trained respiratory therapists), and servicing are included in monthly payments that can continue indefinitely. Thus, ventilators have both higher reimbursement rates and uncapped rental periods; beneficiaries not only pay higher monthly co-payments for these devices but also pay them over a longer rental period. Nonetheless, these services are vital in keeping a certain subset of patients comfortable at home and out of higher cost settings. The populations that directly benefit include patients with polio, amyotrophic lateral sclerosis, muscular dystrophies, spinal muscular atrophy, thoracic restrictive disorders, and chronic hypercapnic respiratory failure due to COPD, to name a few. HMV has thus been vital in “freeing” these frail and vulnerable patient populations from their hospital beds, improving quality of life as well as mortality.
With technologic advancements, HMVs, especially noninvasive pressure support ventilators, can now deliver multiple modes, including CPAP, RAD modes, and ventilator modes. This creates a potential for abuse when the durable medical equipment supplier bills CMS for a ventilator but, clinically, a lower cost CPAP, auto bi-level PAP, or RAD device is indicated. The 2016 report from the Office of Inspector General (OIG) noted that CMS paid 85 times more claims for noninvasive pressure support ventilators in 2015 than in 2009, with payments rising from $3.8 million to $340 million (https://tinyurl.com/y3ckskrb). The expenditure increase from 2014 to 2015 alone accounted for 47% of the entire $337 million increase from 2009 to 2015. However, the report could not establish that reduced prices for CPAP devices and RADs under the Competitive Bidding Program were driving the increased billing for ventilators. It did find that the diagnoses used for these claims had shifted dramatically from neuromuscular diseases to other chronic respiratory conditions.
In January 2016, CMS consolidated billing codes for ventilators and reduced the reimbursement amount for noninvasive pressure support ventilators. After this change, between 2015 and 2016, the median monthly rental rate for these products decreased from $1,561 to $1,055, a reduction of 32% (https://tinyurl.com/y3ckskrb). CMS is now proposing to include HMVs in the Competitive Bidding Program to curb misuse and reduce cost. However, adding home ventilators to competitive bidding risks eliminating the vital services that keep a very vulnerable and frail population out of higher cost facilities. Were that to happen, CMS would face increased costs from frequent emergency department visits, intubations, intensive care unit stays, and admissions to long-term care and skilled nursing facilities, while the quality of life of these patients would suffer. This addition would have serious unintended consequences for Medicaid recipients, especially the pediatric population.
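A back-of-the-envelope check of the figures quoted above also illustrates why the uncapped rental category is so much more costly than a capped one (the 36-month duration below is a hypothetical example, not a CMS figure):

# Figures from the text: median monthly ventilator rental fell from $1,561 (2015) to $1,055 (2016).
before, after = 1561, 1055
print(f"reduction: {(before - after) / before:.0%}")   # ~32%, matching the text

# Unlike the 13-month capped RAD rental, HMV payments are uncapped and continue monthly.
months = 36                                            # hypothetical 3-year course
print(f"cumulative HMV rental over {months} months at the 2016 rate: ${months * after:,}")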
As a clinical guide, RADs are used for clinical conditions similar to those treated with HMVs, but are meant for less severe respiratory compromise. Ideally, providing a RAD should be governed by the physician’s clinical judgment rather than rigid qualification criteria; instead, the current RAD coverage policy is not only cumbersome but also includes unnecessary qualification criteria, and as a result it pushes patients toward more costly ventilators. Unfortunately, CMS policies have not kept up with the technological advances of noninvasive ventilation. This has led to increased costs and utilization of noninvasive ventilators. In our opinion, including noninvasive ventilators in competitive bidding to reduce costs is not the solution.
CMS needs to work with medical providers, beneficiaries, and various stakeholders to revise the current respiratory assist device and home mechanical ventilator guidelines in order to ensure that the appropriate patient is eligible for the correct device, without putting a very vulnerable patient population at risk.
Dr. Sahni is Clinical Assistant Professor, Division of Pulmonary, Critical Care, and Sleep Medicine at the University of Illinois at Chicago; Dr. Wolfe is Associate Professor of Medicine (Pulmonary & Critical Care) and Neurology (Sleep Medicine), Northwestern University, Chicago, Illinois.
Changing clinical practice to maximize success of ICU airway management
Airway management is a complex process that, if not performed in a proper and timely manner, may result in significant morbidity or mortality. The risk of intubation failure and associated adverse events is higher in critically ill patients due to differences in patient condition, environment, and practitioner experience. Even when controlling for provider experience, intubating conditions are worse and success rates are lower in the ICU compared with the controlled environment of the operating room (Taboada M, et al. Anesthesiology. 2018;129[2]:321). Furthermore, the risk of injury and adverse events increases with the number of intubation attempts during an emergency (Sakles JC, et al. Acad Emerg Med. 2013;20[1]:71). Unfortunately, the paucity of high-grade evidence leads practitioners to rely on practice patterns developed during training and predicated on common-sense airway management principles. The difficulty in evaluating airway management in the critically ill lies in the multi-step and complex nature of the process, encompassing pre-intubation, intubation, and post-intubation activities (Fig 1). Several recent publications have the potential to change airway management practice in the ICU. We will address the latest information on preoxygenation, use of neuromuscular blockade (NMB), and checklists in this setting.
Preoxygenation: Overrated?
Rapid-sequence intubation (RSI) is a technique intended to minimize the time from induction to intubation and to reduce the risk of aspiration, primarily by avoiding ventilation. Avoidance of bag-mask ventilation during this apneic period is common because of concerns that positive pressure can produce gastric insufflation and regurgitation, which may lead to aspiration. To attenuate the risk of critical desaturation, preoxygenation is classically provided before induction of anesthesia in operative and procedural areas. Although the benefit can be seen in patients undergoing elective intubation, critically ill patients often cannot significantly raise their blood oxygen content despite preoxygenation with 100% oxygen delivered via face mask. As a result, oxygen saturation can drop precipitously during ICU intubation, especially if multiple or prolonged attempts are required. These factors all contribute to the risk of hypoxemia and cardiac arrest during ICU intubations (De Jong A, et al. Crit Care Med. 2018;46[4]:532), which has led to debate about the avoidance of ventilation during RSI in the critically ill. Recently, Casey and colleagues (Casey JD, et al. N Engl J Med. 2019;380[9]:811) evaluated the use of bag-mask ventilation (BMV) during RSI. In this ICU study, intubations were randomized to either BMV or no ventilation after induction. The frequency of critical desaturation was lower in the patients receiving BMV after induction, without a concomitant increase in the frequency of aspiration. Although not powered to evaluate a difference in the incidence of aspiration, this study supports the use of BMV during the apneic phase of intubation to decrease the risk of critical desaturation.
Neuromuscular blockade: Yes or no?
Awake intubation, with or without sedation, is often employed for managing the airway in high-risk patients. This technique allows the patient to maintain spontaneous ventilation in the event of repeated intubation attempts and carries a lower risk of hypotension. However, many critically ill patients cannot be managed in this manner because of lack of cooperation, the need for emergent airway management, or practitioner inexperience with the technique. As a result, many of these patients require an induction agent with concomitant administration of a neuromuscular blocking agent (NMB) to optimize intubating conditions. Historically, however, avoidance of NMBs in emergent airway scenarios was not uncommon among attending physicians and trainees (Schmidt UH, et al. Anesthesiology. 2008;109[6]:973). The American College of Chest Physicians (CHEST) Difficult Airway Course faculty also recommended against using NMB because of the high risk of failure to ventilate/oxygenate: without NMB, the patient might be allowed to recover to spontaneous ventilation. This approach is taken in the American Society of Anesthesiologists Practice Guidelines for Management of the Difficult Airway but is not necessarily applicable to the critically ill (Apfelbaum JL, et al. Anesthesiology. 2013;118[2]:251-70). In a “can’t intubate, can’t oxygenate” (CICO) event, the critically ill patient in extremis may not tolerate an attempt to return to spontaneous ventilation, because spontaneous ventilation may have been inadequate to begin with.
In 2010, Jaber and colleagues demonstrated a lower incidence of hypoxemia and severe hemodynamic collapse with the implementation of an intubation bundle that included the use of NMBs for all rapid-sequence inductions (Jaber S, et al. Intensive Care Med. 2010;36:248). The safety of using paralytics in critically ill patients was later investigated by Wilcox and colleagues in a prospective, observational study that suggested a decrease in the incidence of hypoxemia and complications when NMB was employed (Wilcox SR, et al. Crit Care Med. 2012;40[6]:1808). Although the Wilcox study was hypothesis-generating by the nature of its design, it was consistent with both Jaber’s findings and a more recent observational study by Mosier and colleagues (Mosier JM, et al. Ann Am Thorac Soc. 2015;12[5]:734). Furthermore, there is no evidence that NMBs worsen bag-mask ventilation in the critically ill. NMBs in addition to induction agents may optimize intubating conditions, reduce complications, and allow for placement of a supraglottic airway device or a surgical airway in the event of CICO (Higgs A, et al. Br J Anaesth. 2018;120[2]:323).
Checking the checklists
Checklists are another intervention with the potential to improve outcomes and reduce adverse events. Airway management is a complex process with many opportunities for failure, so having reminders or checklists available to the provider may encourage the use of best practices. Jaber demonstrated that a straightforward, 10-point intubation bundle reduced the incidence of severe complications associated with emergent intubation in the ICU. In the 4th National Audit Project of the Royal College of Anaesthetists and Difficult Airway Society, the use of checklists was recommended as a method to reduce adverse events and increase successful airway management (Cook TM, et al. Br J Anaesth. 2011;106[5]:632). Several mnemonics have been developed to aid the practitioner, including the ‘7 Ps’ in the Manual of Emergency Airway Management (Walls RM, et al. Manual of Emergency Airway Management. 2012) and APPROACH from the CHEST Airway Management Training Team. More recently, Janz and colleagues developed a checklist and compared it with usual practice in a multicenter study (Janz DR, et al. Chest. 2018;153[4]:816). Although the checklist was associated with improved provider compliance with airway assessment, preparation, and verbalization of a plan, it did not go far enough to include known interventions for optimizing preoxygenation and hemodynamic stability. Two elements that might be added are fluid and vasopressor administration during the pre-intubation and post-intubation periods, and preoxygenation with noninvasive ventilation; the former is associated with a lower incidence of hypotension, while the latter may reduce the incidence of severe hypoxemia in ICU intubations (Baillard C, et al. Am J Respir Crit Care Med. 2006;174[2]:171).
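To make the idea concrete, here is a minimal sketch of such a checklist as a simple data structure; the items paraphrase elements discussed in this article and are illustrative only, not a validated instrument:

# Illustrative peri-intubation checklist drawn from the elements named above.
CHECKLIST = {
    "pre-intubation": [
        "airway assessment performed and plan verbalized to the team",
        "equipment and drugs prepared (induction agent, NMB, rescue devices)",
        "preoxygenation optimized (consider noninvasive ventilation)",
        "fluids and/or vasopressors ready for peri-intubation hypotension",
    ],
    "post-intubation": [
        "tube position confirmed",
        "hemodynamics reassessed; fluids/vasopressors continued as needed",
    ],
}

def read_aloud(phase):
    # Print the items for one phase; in practice each is confirmed verbally.
    for item in CHECKLIST[phase]:
        print(f"[ ] {item}")

read_aloud("pre-intubation")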
Keeping apprised of the evidence and adjusting practice are crucial for the clinician engaging in airway management, as they minimize the risk of harm while maximizing benefit to the patient. However, the methods to achieve these goals are not always intuitive, and definitive high-level evidence is sparse. The use of neuromuscular blockade and of BMV after induction has historically been controversial, but more recent evidence favors these approaches for RSI. Checklists or guidelines may help ensure that the necessary safety steps are followed, especially at institutions without experts in airway management. Over time, the hope is that our traditional practices will either be supported by quality evidence or give way to better techniques.
Dr. Tokarczyk is with the Department of Anesthesia, NorthShore University HealthSystem; and Clinical Assistant Professor, University of Chicago, Pritzker School of Medicine. Dr. Greenberg is Editor-in-Chief, Anesthesia Patient Safety Foundation (APSF) Newsletter; Vice Chairperson, Education, Department of Anesthesiology; Director of Critical Care Services, Evanston Hospital; NorthShore University HealthSystem; and Clinical Professor, Department of Anesthesiology Critical Care, University of Chicago, Pritzker School of Medicine.
Vaping in 2019: Risk vs. reward
The prevalence and popularity of electronic cigarettes, or “vaping,” have grown dramatically over the last several years in the United States. Although new studies targeting these products are appearing with increasing frequency, there remains a relative paucity of data regarding long-term risks. Proponents argue that electronic cigarettes can be used as a cessation tool for smokers or, failing that, as a safer replacement for traditional cigarettes. Opponents counter that the perception of safety could increase use among people who might otherwise never have smoked, leading to an overall increase in nicotine use and addiction. This is most readily seen in the adolescent population, where use has skyrocketed, raising concerns about how electronic cigarettes are marketed to youth, as well as their ease of access.
Basics of vaping (devices)
In its most basic form, an electronic cigarette consists of a battery that powers a heating coil. The coil heats a wick soaked in liquid (“vape juice”), converting the liquid into a vapor that is inhaled directly. There can be many variations on this simple theme. Early generation products resembled traditional cigarettes in size and shape and were marketed as smoking cessation aids. Newer devices have abandoned this look and strategy. Preloaded cartridges have been replaced by large tanks that the user can fill with the liquid of their choosing. Multiple tanks can be purchased for a single device, letting the user keep multiple flavors or nicotine doses on hand for quick changes, depending on preference or mood. Additionally, variable voltage settings produce different styles of vapor and/or “throat hit” (the desired burning vs smooth effect of the vapor on the oropharynx). This type of device invites experimentation: multiple flavors can be used in isolation or mixed together at various temperatures. It no longer resembles the classic cigarette, and the flavor and experience are more prominently promoted. One can see how such a device holds more appeal for a “never smoker” than the original products, and there is concern that it is being marketed as such with some success (Dinakar C, et al. N Engl J Med. 2016;375[14]:1372).
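The variable voltage settings mentioned above can be understood with basic circuit arithmetic: the power delivered to the coil scales with the square of the voltage, so small setting changes noticeably alter heat and vapor production. A brief sketch (the coil resistance and voltages are hypothetical examples):

# Coil power from Ohm's law: P = V^2 / R (values are illustrative).
def coil_power_watts(volts, coil_ohms):
    return volts ** 2 / coil_ohms

for volts in (3.3, 4.0, 4.8):
    print(f"{volts} V across a 1.5-ohm coil -> {coil_power_watts(volts, 1.5):.1f} W")
# 3.3 V -> 7.3 W, 4.8 V -> 15.4 W: a ~45% voltage increase roughly doubles the heat.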
E-liquid
Perhaps more important than the devices themselves is an understanding of the components of the liquid used to generate the inhaled aerosol.
Typically, four components are present:
• Propylene glycol
• Vegetable glycerin
• Flavoring
• Nicotine
The first two components are generally considered nontoxic, based on their use as food additives. However, inhalation is a novel route of entry, and the long-term effects on the respiratory tract are unclear.
The third component, “flavorings,” is a catch-all term for the hundreds of different flavors and styles of e-liquids available today, ranging from menthol to fruit or candy and everything in between. It is difficult to account for all the potential effects of the numerous flavorings being used, especially when some are combined by the end user to various degrees.
Nicotine is present in varying labeled doses. However, vaping style, experience, and the type of device used can dramatically affect how much is absorbed, making dosages difficult to predict. Additionally, labeled doses are prone to wide ranges of error (Schraufnagel DE, et al. Am J Respir Crit Care Med. 2014;190[6]:611).
What are the risks?
Cancer
A handful of known carcinogens can be found in inhaled vapor, including formaldehyde, acetaldehyde, acrolein, toluene, and nitrosamines. However, they are present in far lower concentrations than in traditional cigarettes (Goniewicz ML, et al. JAMA Netw Open. 2018;1[8]:e185937). This leads to the natural assumption that vaping, while not benign, poses a much lower cancer risk when compared with smoking. Whether that is borne out in the long term remains to be seen.
Pulmonary function
The long-term effect on pulmonary function is not known. Small studies have shown no significant changes in spirometry after acute exposure to vapor (Palazzolo DL. Front Public Health. 2013;1[56]:1-20). More data are needed in this area.
Wound healing
An animal study showed evidence of poor wound healing, extrapolated from skin-flap necrosis in rats. Exposure to vapor vs smoke yielded similar results, and both were worse than the sham arm (Troiano C, et al. JAMA Facial Plast Surg. 2019;21[1]:5). While it is difficult to know how to apply this clinically, it may be prudent to advise patients to abstain while preparing for elective surgery.
Cardiovascular/stroke
Much of the cardiovascular toxicity of cigarette use is tied to the myriad complex toxic particles in inhaled smoke, the vast majority of which are not present in e-cigarette vapor. While nicotine itself has known acute cardiovascular effects, including tachycardia and vasoconstriction, tolerance to these effects develops over time. Previous evaluations of the cardiovascular effects of nicotine replacement therapies and smokeless tobacco have had mixed results, but there appears to be a trend toward minimal cardiovascular risk with “cleaner” products, such as nicotine replacement therapy compared with smokeless tobacco (Benowitz NL, et al. Nat Rev Cardiol. 2017;14[8]:447). Whether this can be extrapolated to electronic cigarette use is unknown but encouraging.
Alternative toxicity
In addition to the above risks, which are framed relative to traditional smoking, vaping also introduces novel toxicities. There are case reports of lipoid pneumonia, ARDS, hypersensitivity pneumonitis, eosinophilic pneumonia, and diffuse alveolar hemorrhage. Burns from malfunctioning devices must also be considered, as there is a wide array of products available at differing levels of build quality.
Toxic oral ingestion of nicotine, especially by children, has led to increased calls to poison centers. For a small child, this can be fatal. Regulation of labels and containers could curtail this issue. But, public education regarding the toxicity of these substances when ingested in large quantities is also important. If there is a lack of understanding about this danger, then typical safeguards are easily overlooked by individual users.
Are there benefits?
Smoking cessation
Compared with other products, such as nicotine patches, gum, and pharmaceutical methods, e-cigarettes most closely mimic the actual experience of smoking. For some, the habit and ritual of smoking is as much a part of the addiction as nicotine. Vaping has the potential to help alleviate this difficult aspect of cessation. Data involving early generation products failed to show a significant advantage. Newer devices that are more pleasurable to use and offer more efficient nicotine delivery may be more effective. Indeed, a recent study demonstrated improved smoking cessation with second generation vaping devices compared with traditional methods (Hajek P, et al. N Engl J Med. 2019;380[7]:629). It will be interesting to see whether this result is reproducible and whether protocols can be established to maximize effectiveness.
As outlined above, it is difficult to make definitive conclusions or recommendations regarding electronic cigarette use at the present time. The risk of cancer and cardiopulmonary disease is likely to be significantly lower than with smoking but not eliminated. Use as a smoking cessation aid is starting to show promise. Even without cessation, ongoing vaping is likely to be safer than ongoing smoking. Two caveats remain. First, some patients, in an effort to quit smoking, may take up vaping but eventually become "dual users," a scenario associated with higher toxic exposure and possibly worse outcomes. Second, while vaping shows promise as a cessation tool, it should not yet replace better studied first-line agents; it should, perhaps, be reserved for patients who are motivated to quit but have failed more traditional methods. Finally, there continues to be concern that vaping could appeal to never smokers, given its perceived safety profile and ease of use in public places. This could lead to an overall increase in nicotine addiction, which would be a significant step backward.
Dr. Clark is Assistant Professor, Pulmonary and Critical Care Medicine, UT Southwestern Medical Center, Dallas, Texas.
Restless legs syndrome: Update on evaluation and treatment
Restless legs syndrome (RLS) is a very common disease, affecting about 10% of Caucasian adults; roughly one-third of those affected have symptoms severe enough to require treatment.
Although many patients still go undiagnosed or misdiagnosed, the diagnosis is easily established when all five diagnostic criteria, summarized by the acronym URGES, are met (a minimal checklist sketch follows the list):
1. Urge to move the legs associated with unpleasant leg sensations.
2. Rest induces symptoms.
3. Gets better with activity.
4. Evening and nighttime worsening.
5. Solely not accounted by another medical or behavioral condition.
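Because the diagnosis requires every criterion to be present, the checklist reduces to a simple conjunction. The following is a minimal illustrative sketch in Python; the field names and function are hypothetical, not part of any validated instrument.

# Minimal sketch: an RLS diagnosis requires ALL five URGES criteria.
# Field names are hypothetical and for illustration only.
URGES_CRITERIA = (
    "urge_to_move_with_unpleasant_leg_sensations",
    "rest_induces_symptoms",
    "gets_better_with_activity",
    "evening_or_nighttime_worsening",
    "solely_not_accounted_for_by_another_condition",
)

def meets_urges(findings: dict[str, bool]) -> bool:
    """True only when every URGES criterion is satisfied."""
    return all(findings.get(criterion, False) for criterion in URGES_CRITERIA)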
The diagnosis is based completely upon the history. However, supplemental tests can be helpful to rule out underlying conditions that increase the risk of RLS. Routine lab tests, such as serum creatinine (to rule out renal disease), TSH (to rule out thyroid disease), and a CBC/ferritin/iron with transferrin saturation (to rule out low iron stores), should be ordered if not done recently.
A polysomnographic sleep study should not be ordered unless there is a strong suspicion that sleep apnea is present. Even very frequent periodic limb movements (PLMs) are not that helpful in confirming the diagnosis of RLS, since they are nonspecific and often occur with drug treatment (SSRIs, SNRIs) and with many medical conditions, such as sleep apnea, narcolepsy, and REM sleep behavior disorder.
The paradigm for treating RLS was presented in a consensus article published in 2013 (Silber MH, et al. Mayo Clin Proc. 2013 Sep;88[9]:977), which recommended starting either an approved dopamine agonist (pramipexole, ropinirole, or rotigotine) or an alpha-2-delta ligand (gabapentin enacarbil, gabapentin, or pregabalin) as first-line treatment. Since 2013, there has been a gradual shift away from that paradigm. Although dopamine agonists provide excellent relief of RLS symptoms initially, with time they tend to markedly worsen RLS. This process, called RLS augmentation, has become one of the most common causes of refractory RLS and difficult-to-treat patients.
RLS augmentation typically begins a few months to several years after starting a short-acting dopamine agonist (DA) such as pramipexole or ropinirole. It presents with symptoms occurring a few hours earlier in the day than before the medication was started, symptoms becoming more intense with less rest time needed to trigger them, the drug losing both potency and duration of action, and spread of symptoms to other body parts (arms, trunk, and even head). Most physicians mistake this worsening of RLS for the natural progression of the disease and, thus, increase the dose of the DA, which provides temporary improvement. Further increases become progressively necessary until the patient is receiving very large doses, often exceeding 10 times the FDA maximum recommended doses. Eventually, further dose increments provide minimal additional benefit, leaving patients with severe, around-the-clock RLS symptoms causing extreme misery. Physicians should consider that augmentation may be occurring whenever a patient who has been on a stable dopamine agonist regimen for at least 6 months requests more medication.
The incidence of augmentation among patients taking short-acting DA drugs is about 7% to 8% per year, so that by 10 years the vast majority of these patients are experiencing augmentation. Pramipexole and ropinirole were approved for treating RLS more than 13 years ago, and currently over 75% of patients referred to national RLS experts are referred because of augmentation (although the referral diagnosis is often "refractory RLS"). Despite the concerns about augmentation, the short-acting DA drugs remain by far the most commonly prescribed medications for initial treatment of RLS.
To help educate physicians about RLS augmentation, a consensus article was published in 2016 presenting guidelines for the prevention and treatment of RLS augmentation (Garcia-Borreguero D, et al. Sleep Med. 2016;21:1-11). Since augmentation occurs only with dopaminergic drugs (with the exception of tramadol), using nondopaminergic drugs for first-line therapy of RLS would dramatically decrease the occurrence of augmentation. This is a clear shift from the earlier paradigm of choosing equally among the approved RLS drugs.
Unless contraindicated, the alpha-2-delta ligands should be the first consideration for treating new RLS patients. These drugs can be as effective as the DA drugs but cannot cause augmentation and do not cause impulse control disorders, which occur with the use of DAs. Furthermore, they reduce the insomnia and anxiety that are both associated with RLS. The use of these drugs may be limited by their side effects, which include CNS depressive effects (sedation, dizziness, decreased balance or cognition) and depression.
When the alpha-2-delta ligands cannot be used because of lack of efficacy, side effects, or cost, the DA drugs may then be appropriate. The rotigotine patch has the lowest incidence of augmentation, especially at the approved doses of up to 3 mg. If the rotigotine patch cannot be used (most often because of skin side effects or cost), then the short-acting DA drugs may be employed. Augmentation may be prevented or significantly delayed by starting these drugs at their lowest dose (0.125 mg for pramipexole and 0.25 mg for ropinirole) and increasing the dose as little as possible, never exceeding the approved RLS limits of 0.5 mg for pramipexole and 4 mg for ropinirole. My personal suggestion is not to exceed 0.25 mg for pramipexole and 1 mg for ropinirole, as augmentation is dose-related but may occur at even the lowest doses. When patients need and request increased treatment for their RLS, rather than increasing the dose of the DA, consider adding other medications, such as the alpha-2-delta ligands or even low-dose opioids.
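For quick reference, the starting doses and ceilings discussed above can be captured in a simple lookup. The sketch below is illustrative only: the drug names and milligram limits come from the text, while the data structure and function are hypothetical.

# Illustrative sketch of the short-acting DA dose ceilings discussed above (mg/day).
DA_DOSE_LIMITS_MG = {
    "pramipexole": {"start": 0.125, "suggested_max": 0.25, "fda_rls_max": 0.5},
    "ropinirole": {"start": 0.25, "suggested_max": 1.0, "fda_rls_max": 4.0},
}

def classify_da_dose(drug: str, dose_mg: float) -> str:
    """Compare a proposed dose against the conservative and FDA RLS ceilings."""
    limits = DA_DOSE_LIMITS_MG[drug]
    if dose_mg > limits["fda_rls_max"]:
        return "exceeds FDA-approved RLS maximum"
    if dose_mg > limits["suggested_max"]:
        return "within FDA limit but above the suggested conservative ceiling"
    return "within the suggested conservative ceiling"

print(classify_da_dose("pramipexole", 0.5))  # within FDA limit but above the suggested conservative ceiling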
Managing augmentation is typically a very challenging problem for both the physician and patient; this is described in detail in the augmentation article referenced above. Decreasing, or better yet eliminating, the short-acting DA is the preferred method for treating augmentation. However, upon elimination of the DA, there is a short period of 1 to 4 weeks (average of 10-12 days) when the RLS symptoms get dramatically worse. Patients typically experience extremely severe RLS symptoms around the clock and may not be able to sleep at all until the RLS calms down. Most often, only low-dose opioid treatment will enable them to get through this transition. The augmentation article (with its algorithm) may help physicians manage augmentation, but patients with severe augmentation may need referral to an RLS specialist who is experienced in this area and who is comfortable managing the disease with opioids.
Low iron levels are often associated with RLS, cause RLS symptoms to worsen, and increase the risk of augmentation (Allen RP, et al; International Restless Legs Syndrome Study Group [IRLSSG]. Sleep Med. 2018;41:27). We typically suggest that patients with ferritin levels under 100 mcg/L receive supplemental iron. However, oral iron absorption is very limited when the patient's ferritin is above 50 mcg/L, and, therefore, most patients may require IV iron to improve their RLS symptoms. There are several IV iron preparations, but only low-molecular-weight iron dextran, ferric carboxymaltose, and ferumoxytol are effective. When the ferritin level is increased to over 200 mcg/L, RLS symptoms may be dramatically improved.
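The ferritin thresholds above lend themselves to a simple triage rule. The following is a minimal sketch under the assumptions stated in the text (ferritin in mcg/L); the function name and return strings are hypothetical.

# Illustrative sketch of the iron-repletion thresholds discussed above.
def iron_strategy(ferritin_mcg_per_l: float) -> str:
    """Map a ferritin level to the supplementation approach described in the text."""
    if ferritin_mcg_per_l >= 100:
        return "supplemental iron not routinely suggested for RLS"
    if ferritin_mcg_per_l > 50:
        # Oral absorption is very limited above 50 mcg/L, so IV iron
        # (LMW iron dextran, ferric carboxymaltose, or ferumoxytol)
        # is more likely to raise ferritin toward the >200 mcg/L range.
        return "consider IV iron"
    return "oral iron reasonable; consider IV iron if symptoms persist"

for ferritin in (30, 75, 150):
    print(ferritin, "->", iron_strategy(ferritin))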
With the currently available treatment options, most patients should have their RLS symptoms well controlled without developing augmentation.
Dr. Buchfuhrer is with Stanford University, Department of Psychiatry and Behavioral Sciences in the School of Medicine, Division of Sleep Medicine, Stanford, Calif.
Endobronchial valves for lung volume reduction: What can we offer patients with advanced emphysema?
The global burden of COPD is considerable. In the United States, it is the third most common cause of death and is associated with over $50 billion in annual direct and indirect health-care expenditures (Guarascio AJ, et al. Clinicoecon Outcomes Res. 2013;5:235). For patients with severe emphysema and hyperinflation, dyspnea is often a quality of life (QOL)-limiting symptom (O'Donnell DE, et al. Ann Am Thorac Soc. 2017;14:S30). Few proven palliation options exist, particularly for patients with dyspnea refractory to smoking cessation, medical management with bronchodilators, and pulmonary rehabilitation. The recent Food and Drug Administration (FDA) approval of two endobronchial valves for lung volume reduction has established the increasing importance of bronchoscopy as a management tool in advanced COPD.
Why were these valves developed?
For decades, lung volume reduction has been investigated as a mechanical approach to counteract the physiologic effects of emphysematous hyperinflation. Its goal is to improve lung elastic recoil, respiratory muscle mechanical advantage and efficiency, and ventilation/perfusion matching. The landmark National Emphysema Treatment Trial (NETT), published in 2001 and 2003, demonstrated that in a select patient population (upper lobe-predominant emphysema and low exercise capacity), lung volume reduction surgery (LVRS) lowers mortality and improves QOL and exercise tolerance (Fishman A, et al. N Engl J Med. 2003;348:2059). Despite the encouraging results in this study subpopulation, LVRS is performed infrequently (Decker MR, et al. J Thorac Cardiovasc Surg. 2014;148:2651). Concern about its morbidity and the specialized nature of the procedure has hindered widespread adoption. Subsequently, endobronchial techniques have been developed as an alternative to surgical lung volume reduction.
How does bronchoscopic lung volume reduction (BLVR) benefit patients with emphysema?
Valves used for BLVR are removable one-way flow devices placed by flexible bronchoscopy into selected airways supplying emphysematous lung. The valves block air entry but allow the exit of secretions and trapped air. This results in atelectasis of the targeted lobe and a decrease in lung volume.
Which endobronchial valves are available in the United States?
In 2018, two valves were approved by the FDA for BLVR: the Zephyr® endobronchial valve (EBV; Pulmonx) (Fig 1) and the Spiration® Valve System (IBV; Olympus) (Fig 2). The Zephyr® EBV is a duckbill-shaped silicone valve mounted within a self-expanding nitinol (nickel-titanium alloy) stent. It comes in three sizes for airway diameters of 4-8.5 mm. The Spiration® IBV is an umbrella-shaped valve composed of six nitinol struts surfaced with polyurethane. Its four sizes accommodate airway diameters of 5-9 mm.
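As a simple illustration of the stated size ranges (not a clinical sizing tool; the ranges come from the text, and the helper function is hypothetical):

# Illustrative only: airway-diameter ranges (mm) supported by each valve system,
# as stated in the text. Not a substitute for the manufacturers' sizing guidance.
VALVE_RANGES_MM = {
    "Zephyr EBV (Pulmonx)": (4.0, 8.5),
    "Spiration IBV (Olympus)": (5.0, 9.0),
}

def compatible_systems(airway_diameter_mm: float) -> list[str]:
    """Return the valve systems whose stated range covers a measured diameter."""
    return [
        name
        for name, (low, high) in VALVE_RANGES_MM.items()
        if low <= airway_diameter_mm <= high
    ]

print(compatible_systems(4.5))  # ['Zephyr EBV (Pulmonx)']
print(compatible_systems(8.7))  # ['Spiration IBV (Olympus)']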
What’s the evidence behind BLVR?
Zephyr® Valves
The Endobronchial Valve for Emphysema Palliation Trial (VENT), the largest valve trial thus far, randomized patients with severe heterogeneous emphysema to unilateral Zephyr® valve placement or standard medical care (Sciurba FC, et al. N Engl J Med. 2010;363:1233). Overall improvement in spirometry and dyspnea scores was modest in the valve group. Post-hoc analysis identified an important subgroup with significant clinical benefit: patients with a complete fissure. This finding guided subsequent EBV studies toward patients with severe emphysema and absent collateral ventilation (CV).
Identifying a complete fissure on imaging is now used as a surrogate for assessing CV and is an integral part of the initial profiling of patients for EBV therapy (Koster TD, et al. Respiration. 2016;92[3]:150).
In the STELVIO trial, 68 patients were randomized to Zephyr® EBV placement or standard medical care (Klooster K, et al. N Engl J Med. 2015;373:2325). Those with EBV placement had significantly improved lung function and exercise capacity. TRANSFORM, a multicenter trial evaluating Zephyr® EBV placement in heterogeneous emphysema, showed similar results (Kemp SV, et al. Am J Respir Crit Care Med. 2017;196:1535).
The IMPACT trial randomized patients with homogeneous emphysema and no CV to EBV placement or standard medical therapy alone. It showed improvement in FEV1, QOL scores, and exercise tolerance in the EBV group. This study affirmed that the absence of CV, rather than the pattern of emphysema, correlates with clinical benefit from EBV therapy (Valipour A, et al. Am J Respir Crit Care Med. 2016;194[9]:1073). Finally, LIBERATE, a multicenter study of the Zephyr® EBV, examined its placement in patients with heterogeneous emphysema. This study demonstrated improvement in spirometry, QOL, and 6-minute walk test (6-MWT) distance over a longer period, 12 months, bolstering the findings of prior studies (Criner GJ, et al. Am J Respir Crit Care Med. 2018;198:1151). These results prompted the Zephyr® valve's FDA approval.
Spiration® Valves
Small trials have shown favorable results with the Spiration® IBV for BLVR, including a pilot multicenter cohort study of 30 patients with heterogeneous, upper-lobe emphysema who underwent valve placement (Wood DE, et al. J Thorac Cardiovasc Surg. 2007;133:65). In this trial, investigators found significant improvement in QOL scores, but no change in FEV1 or other physiologic parameters.
The EMPROVE trial is a multicenter, prospective, randomized, controlled study assessing BLVR with the Spiration® IBV. Six- and twelve-month data from the trial were presented in 2018 at the American Thoracic Society Conference and at the European Respiratory Society International Conference.
Collateral ventilation
Identifying patients in whom there is no CV between lobes is critical to success with BLVR. Collateral ventilation allows air to bypass the valve occlusion distally, thereby negating the desired effect of valve placement, lobar atelectasis. High-resolution computed tomography (HRCT) scanning combined with quantitative software can be used to assess emphysema distribution and fissure integrity. Additionally, a proprietary technology, the Chartis System®, can be employed during the procedure to estimate CV by measuring airway flow, resistance, and pressure in targeted balloon-occluded segments. Absence of CV based on Chartis evaluation was an inclusion criterion in the aforementioned valve studies.
Which patients with emphysema should be referred for consideration of valve placement?
The following criteria should be used in selecting patients for referral for BLVR (a simple screening sketch follows the list):
• FEV1 15%-45% of predicted at baseline
• Evidence of hyperinflation: TLC ≥ 100% predicted and RV ≥ 175% predicted
• Baseline post-pulmonary rehabilitation 6-MWT distance of 100-500 meters
• Clinically stable on < 20 mg prednisone (or equivalent) daily
• Nonsmoking for at least 4 months
• Integrity of at least 75% in one or both major fissures
• Ability to provide informed consent and to tolerate bronchoscopy
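A minimal screening sketch of these referral criteria follows. It is illustrative only: the thresholds come from the list above, while the field names and function are hypothetical, and it does not replace formal evaluation (quantitative fissure analysis, Chartis assessment).

# Illustrative sketch of the BLVR referral criteria listed above.
# Thresholds are from the text; field names are hypothetical.
from dataclasses import dataclass

@dataclass
class Candidate:
    fev1_pct_predicted: float
    tlc_pct_predicted: float
    rv_pct_predicted: float
    six_mwt_meters: float              # post-pulmonary rehabilitation
    prednisone_mg_daily: float         # or equivalent
    months_nonsmoking: float
    best_fissure_integrity_pct: float  # higher of the two major fissures
    can_consent_and_tolerate_bronch: bool

def meets_referral_criteria(c: Candidate) -> bool:
    """True when every referral criterion from the list above is satisfied."""
    return (
        15 <= c.fev1_pct_predicted <= 45
        and c.tlc_pct_predicted >= 100
        and c.rv_pct_predicted >= 175
        and 100 <= c.six_mwt_meters <= 500
        and c.prednisone_mg_daily < 20
        and c.months_nonsmoking >= 4
        and c.best_fissure_integrity_pct >= 75
        and c.can_consent_and_tolerate_bronch
    )

print(meets_referral_criteria(Candidate(30, 110, 200, 250, 5, 6, 90, True)))  # True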
Complications
The most common complication after valve placement is pneumothorax – a double-edged sword in that it typically indicates the achievement of atelectasis. In published trials, the frequency of pneumothorax varies. Some studies document rates below 10%. Others report rates of nearly 30% (Gompelmann D, et al. Respiration. 2014;87:485). In landmark trials, death related to pneumothorax occurred rarely. Most severe pneumothoraces occur within the first 72 hours after valve placement. This has prompted many centers to observe postprocedure patients in hospital for an extended period. Pneumonia and COPD exacerbations have also been reported after EBV placement. Therefore, in some trials, patients received prophylactic prednisolone and azithromycin. Other less common complications are hemoptysis, granulation tissue formation, and valve migration.
What’s ahead for BLVR?
Overall, valve technology for BLVR is an exciting option in the management of patients with severe emphysema and is now a staple of any advanced emphysema program. Key areas of future interest include management of patients with partial fissures, minimizing adverse procedural effects, and developing programs to optimize and streamline a multidisciplinary approach to timely and efficient referral, assessment, and intervention. As more patients with COPD undergo BLVR, one goal should be to create multi-institution prospective studies as well as registries to further delineate the optimal use of endobronchial valves for lung volume reduction.
Fig 1. Zephyr® Endobronchial Valve (Pulmonx)
Fig 2. Spiration® Valve System (Olympus)
The American College of Chest Physicians (CHEST) does not endorse or supp
The global burden COPD is considerable. In the United States, it is the third most common cause of death and is associated with over $50 billion in annual direct and indirect health-care expenditures (Guarascio AJ, et al. Clinicoecon Outcomes Res. 2013;5:235). For patients with severe emphysema with hyperinflation, dyspnea is often a quality of life (QOL)-limiting symptom (O’Donnell DE, et al. Ann Am Thorac Soc. 2017;14:S30). Few proven palliation options exist, particularly for patients with dyspnea refractory to smoking cessation, medical management with bronchodilators, and pulmonary rehabilitation. The recent Food and Drug Administration (FDA) approval of two endobronchial valves for lung volume reduction has established the increasing importance of bronchoscopy as a management tool in advanced COPD.
Why were these valves developed?
For decades, lung volume reduction has been investigated as a mechanical approach to counter-act the physiologic effects of emphysematous hyperinflation. Its goal is to improve lung elastic recoil, respiratory muscle mechanical advantage and efficiency, and ventilation/perfusion matching. The landmark National Emphysema Treatment Trial (NETT), published in 2001 and 2003, demonstrated that in a select patient population (upper lobe-predominant emphysema and low exercise capacity), lung volume reduction surgery (LVRS) lowers mortality and improves QOL and exercise tolerance (Fishman A et al. N Engl J Med. 2003;348:2059). Despite the encouraging results in this study subpopulation, LVRS is per-formed infrequently (Decker MR, et al. J Thorac Cardiovasc Surg. 2014;148:2651). Concern about its morbidity and the specialized nature of the procedure has hindered widespread adoption. Subsequently, endobronchial techniques have been developed as an alternative to surgical lung volume reduction.
How does bronchoscopic lung volume reduction (BLVR) benefit patients with emphysema?
The global burden of COPD is considerable. In the United States, it is the third most common cause of death and is associated with over $50 billion in annual direct and indirect health-care expenditures (Guarascio AJ, et al. Clinicoecon Outcomes Res. 2013;5:235). For patients with severe emphysema and hyperinflation, dyspnea is often a symptom that limits quality of life (QOL) (O’Donnell DE, et al. Ann Am Thorac Soc. 2017;14:S30). Few proven palliation options exist, particularly for patients with dyspnea refractory to smoking cessation, medical management with bronchodilators, and pulmonary rehabilitation. The recent Food and Drug Administration (FDA) approval of two endobronchial valves for lung volume reduction has established the increasing importance of bronchoscopy as a management tool in advanced COPD.
Why were these valves developed?
For decades, lung volume reduction has been investigated as a mechanical approach to counteract the physiologic effects of emphysematous hyperinflation. Its goal is to improve lung elastic recoil, respiratory muscle mechanical advantage and efficiency, and ventilation/perfusion matching. The landmark National Emphysema Treatment Trial (NETT), published in 2001 and 2003, demonstrated that in a select patient population (upper lobe-predominant emphysema and low exercise capacity), lung volume reduction surgery (LVRS) lowers mortality and improves QOL and exercise tolerance (Fishman A, et al. N Engl J Med. 2003;348:2059). Despite the encouraging results in this study subpopulation, LVRS is performed infrequently (Decker MR, et al. J Thorac Cardiovasc Surg. 2014;148:2651). Concern about its morbidity and the specialized nature of the procedure has hindered widespread adoption. Subsequently, endobronchial techniques have been developed as an alternative to surgical lung volume reduction.
How does bronchoscopic lung volume reduction (BLVR) benefit patients with emphysema?
Valves used for endoscopic lung volume reduction (ELVR), a term used interchangeably with BLVR, are removable one-way flow devices placed by flexible bronchoscopy into selected airways supplying emphysematous lung. The valves block air entry but allow the exit of secretions and trapped air. This results in atelectasis of the targeted lobe and a decrease in lung volume.
Which endobronchial valves are available in the United States?
In 2018, two valves were approved by the FDA for BLVR: the Zephyr® EBV (Pulmonx) (Fig 1) and the Spiration® Valve System (IBV) (Olympus) (Fig 2). The Zephyr® EBV is a duckbill-shaped silicone valve mounted within a self-expanding nitinol (nickel-titanium alloy) stent. It comes in three sizes for airways 4 - 8.5 mm in diameter. The Spiration® IBV is an umbrella-shaped valve composed of six nitinol struts surfaced with polyurethane. Its four sizes accommodate airway diameters of 5 - 9 mm.
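These published size ranges lend themselves to a simple compatibility check. The Python sketch below is illustrative only, assuming nothing beyond the overall diameter ranges quoted above; the function and variable names are invented here, and the per-size cutoffs that govern real valve selection are device-specific and not modeled.

```python
# Minimal sketch: which valve families' stated diameter ranges cover a
# measured airway diameter. Encodes only the overall ranges quoted in
# this article; per-size cutoffs and clinical sizing are NOT modeled.

VALVE_DIAMETER_RANGES_MM = {
    "Zephyr EBV (Pulmonx)": (4.0, 8.5),     # three sizes, 4 - 8.5 mm
    "Spiration IBV (Olympus)": (5.0, 9.0),  # four sizes, 5 - 9 mm
}

def compatible_valves(airway_diameter_mm: float) -> list:
    """Return valve families whose stated range covers the given diameter."""
    return [
        name
        for name, (low, high) in VALVE_DIAMETER_RANGES_MM.items()
        if low <= airway_diameter_mm <= high
    ]

if __name__ == "__main__":
    for d in (4.5, 7.0, 8.8):
        print(f"{d} mm airway -> {compatible_valves(d)}")
```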
What’s the evidence behind BLVR?
Zephyr® Valves
The Endobronchial Valve for Emphysema Palliation Trial (VENT), the largest valve trial thus far, randomized patients with severe heterogeneous emphysema to receive unilateral Zephyr® valve placement or standard medical care (Sciurba FC, et al. N Engl J Med. 2010;363:1233). Overall improvement in spirometry and dyspnea scores was modest in the valve group. Post hoc analysis identified an important subgroup with significant clinical benefit: patients with a complete interlobar fissure. This finding guided subsequent EBV studies toward patients with severe emphysema and absent collateral ventilation (CV).
Identifying a complete fissure on imaging is now used as a surrogate for assessing CV and is an integral part of the initial profiling of patients for EBV therapy (Koster TD, et al. Respiration. 2016;92(3):150).
In the STELVIO trial, 68 patients were randomized to Zephyr® EBV placement or standard medical care (Klooster K, et al. N Engl J Med. 2015;373:2325). Those with EBV placement had significantly improved lung function and exercise capacity. TRANSFORM, a multicenter trial evaluating Zephyr® EBV placement in heterogeneous emphysema, showed similar results (Kemp SV, et al. Am J Respir Crit Care Med. 2017;196:1535).
The IMPACT trial randomized patients with homogeneous emphysema and absent CV to EBV placement or standard medical therapy alone, and it showed improvement in FEV1, QOL scores, and exercise tolerance in the EBV group. This study affirmed that the absence of CV, rather than the pattern of emphysema, correlates with clinical benefit from EBV therapy (Valipour A, et al. Am J Respir Crit Care Med. 2016;194(9):1073). Finally, LIBERATE, a multicenter study of the Zephyr® EBV, examined its placement in patients with heterogeneous emphysema. It demonstrated improvement in spirometry, QOL, and 6-minute walk test (6-MWT) distance over a longer follow-up period of 12 months, bolstering the findings of prior studies (Criner GJ, et al. Am J Respir Crit Care Med. 2018;198:1151). These results prompted the Zephyr® valve’s FDA approval.
Spiration® Valves
Small trials have shown favorable results with the Spiration® IBV for BLVR, including a pilot multicenter cohort study of 30 patients with heterogeneous, upper-lobe emphysema who underwent valve placement (Wood DE, et al. J Thorac Cardiovasc Surg. 2007;133:65). In this trial, investigators found significant improvement in QOL scores, but no change in FEV1 or other physiologic parameters.
The EMPROVE trial is a multicenter, prospective, randomized, controlled study assessing BLVR with the Spiration® IBV. Six- and twelve-month data from the trial were presented in 2018 at the American Thoracic Society Conference and at the European Respiratory Society International Conference.
Collateral Ventilation
Identifying patients in whom there is no CV between lobes is critical to success with BLVR. Collateral ventilation allows air to bypass the valve occlusion distally, thereby negating the desired effect of valve placement: lobar atelectasis. High-resolution computed tomography (HRCT) scanning combined with quantitative software can be used to assess emphysema distribution and fissure integrity. Additionally, a proprietary technology, the Chartis System®, can be employed during the procedure to estimate CV by measuring airway flow, resistance, and pressure in targeted balloon-occluded segments. Absence of CV on Chartis evaluation was an inclusion criterion in the aforementioned valve studies.
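The workflow just described is, in effect, a two-step gate: quantitative HRCT first, Chartis confirmation second. The Python sketch below makes that logic explicit under stated assumptions: the 75% fissure-integrity threshold is borrowed from the referral criteria listed below, the function name is invented, and the Chartis output is reduced to a single yes/no flag, whereas the real system reports flow, resistance, and pressure measurements.

```python
from typing import Optional

# Assumption: reusing the 75% threshold from the referral criteria listed
# later in this article as the HRCT screening cutoff for the target fissure.
FISSURE_INTEGRITY_THRESHOLD_PCT = 75.0

def cv_screen(fissure_integrity_pct: float,
              chartis_cv_absent: Optional[bool] = None) -> str:
    """Two-step collateral ventilation (CV) screen for a target lobe.

    Step 1: quantitative HRCT fissure integrity (percent complete).
    Step 2: intraprocedural Chartis assessment, simplified to a boolean.
    """
    if fissure_integrity_pct < FISSURE_INTEGRITY_THRESHOLD_PCT:
        return "exclude: incomplete fissure suggests collateral ventilation"
    if chartis_cv_absent is None:
        return "proceed to intraprocedural Chartis assessment"
    if chartis_cv_absent:
        return "candidate for valve placement"
    return "exclude: Chartis detected collateral ventilation"
```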
Which patients with emphysema should be referred for consideration of valve placement?
The following criteria should be used in selecting patients for referral for BLVR (a simple checklist sketch follows the list):
• FEV1 15% - 45% of predicted value at baseline
• Evidence of hyperinflation: TLC greater than or equal to 100% predicted and RV greater than or equal to 175% predicted
• Baseline post-pulmonary rehabilitation 6-MWT distance of 100 - 500 meters
• Clinically stable on < 20 mg prednisone (or equivalent) daily
• Nonsmoking for at least 4 months
• Integrity of one or both major fissures at least 75%
• Ability to provide informed consent and to tolerate bronchoscopy
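Because these criteria are all explicit thresholds, they can be expressed as a checklist. The Python sketch below encodes exactly the numbers listed above and nothing more; the class and field names are invented for illustration, and the final criterion (informed consent and bronchoscopy tolerance) is a clinical judgment this sketch cannot capture.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    fev1_pct_pred: float              # FEV1, % predicted
    tlc_pct_pred: float               # total lung capacity, % predicted
    rv_pct_pred: float                # residual volume, % predicted
    six_mwt_m: float                  # post-rehab 6-MWT distance, meters
    prednisone_mg_daily: float        # prednisone or equivalent, mg/day
    months_nonsmoking: float
    max_fissure_integrity_pct: float  # best of the two major fissures

def meets_referral_criteria(c: Candidate) -> bool:
    """Apply the published BLVR referral thresholds listed above."""
    return (
        15 <= c.fev1_pct_pred <= 45
        and c.tlc_pct_pred >= 100
        and c.rv_pct_pred >= 175
        and 100 <= c.six_mwt_m <= 500
        and c.prednisone_mg_daily < 20
        and c.months_nonsmoking >= 4
        and c.max_fissure_integrity_pct >= 75
    )
```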
Complications
The most common complication after valve placement is pneumothorax, a double-edged sword in that it typically indicates the achievement of atelectasis. The frequency of pneumothorax varies across published trials, with some studies documenting rates below 10% and others reporting rates of nearly 30% (Gompelmann D, et al. Respiration. 2014;87:485). In landmark trials, death related to pneumothorax occurred rarely. Most severe pneumothoraces occur within the first 72 hours after valve placement, which has prompted many centers to observe patients in the hospital for an extended postprocedure period. Pneumonia and COPD exacerbations have also been reported after EBV placement; for this reason, patients in some trials received prophylactic prednisolone and azithromycin. Other, less common complications are hemoptysis, granulation tissue formation, and valve migration.
What’s ahead for ELVR?
Overall, valve technology for BLVR is an exciting option in the management of patients with severe emphysema and is now a staple of any advanced emphysema program. Key areas of future interest include management of patients with partial fissures, minimizing adverse procedural effects, and developing programs that optimize and streamline a multidisciplinary approach to timely referral, assessment, and intervention. As more patients with COPD undergo ELVR, one goal should be to create multi-institution prospective studies and registries to further delineate the optimal use of endobronchial valves for lung volume reduction.
Fig 1. Zephyr® Endobronchial Valve (Pulmonx)
Fig 2. Spiration® Valve System (Olympus)