Enhanced recovery pathways in gynecology

Article Type
Changed
Wed, 01/02/2019 - 09:41

 

Enhanced recovery surgical principles were first described in the 1990s.1 These principles postulate that the body’s stress response to surgical injury and deviation from normal physiology is the source of postoperative morbidity. Thus, enhanced recovery programs are designed around perioperative interventions that mitigate and help the body cope with the surgical stress response.

Many of these interventions run counter to traditional perioperative care paradigms. Enhanced recovery protocols are diverse but have common themes of avoiding preoperative fasting and bowel preparation, early oral intake, limiting use of drains and catheters, multimodal analgesia, early ambulation, and prioritizing euvolemia and normothermia. Individual interventions in these areas are combined to create a master protocol, which is implemented as a bundle to improve surgical outcomes.

Dr. Paola A. Gehrig

Current components

Minimizing preoperative fasting, early postoperative refeeding, and preoperative carbohydrate-loading drinks are all key aspects of enhanced recovery protocols. “NPO after midnight” has been a longstanding rule due to the risk of aspiration with intubation. However, a Cochrane review found no evidence that a shortened period of fasting was associated with an increased risk of aspiration or related morbidity. Currently, the American Society of Anesthesiologists recommends only a 6-hour fast for solid foods and 2 hours for clear liquids.2,3

Preoperative fasting depletes glycogen stores, leading to insulin resistance and hyperglycemia, both of which are associated with postoperative complications and morbidity.4 Preoperative carbohydrate-loading drinks can reverse some of the effects of even a limited preoperative fast, including insulin resistance and hyperglycemia.5

Postoperative fasting should also be avoided. Early enteral intake decreases the time spent in a catabolic state and reduces insulin resistance. In gynecology patients, early refeeding is associated with a faster return of bowel function and a decreased length of stay, without an increase in postoperative complications.6 Notably, patients undergoing early feeding consistently experience more nausea and vomiting, but this is not associated with complications.7

The fluid management goal in enhanced recovery is to maintain perioperative euvolemia, as both hypovolemia and hypervolemia have negative physiologic consequences. When studied, fluid protocols designed to minimize the use of postoperative fluids have resulted in decreased cardiopulmonary complications, decreased postoperative morbidity, faster return of bowel function, and shorter hospital stays.8 Given the morbidity associated with fluid overload, enhanced recovery protocols recommend that minimal fluids be given in the operating room and that intravenous fluids be discontinued as quickly as possible, often at first oral intake and by postoperative day 1 at the latest.

Dr. Emma L. Barber
High-quality postoperative pain control is critical to achieving the goals of early mobilization and early feeding. Pain management strategies focus on decreasing the total amount of intravenous opioids by combining regional anesthetic techniques with multimodal pharmacologic pain management. Minimizing the surgical insult and other deviations from normal physiology is also important: avoiding peritoneal drains, forgoing routine use of nasogastric tubes, removing urinary catheters promptly, and using minimally invasive surgery or the smallest feasible laparotomy incision.

Engaging patients in their perioperative recovery, through education materials and clear expectations for postoperative tasks such as early refeeding, spirometry, and ambulation, is another important component of enhanced recovery. Patients become partners in achieving postoperative milestones, which results in improved outcomes such as decreased pain scores and shorter recoveries.

Evidence in gynecology

Enhanced recovery has been studied in many surgical disciplines including urology, colorectal surgery, hepatobiliary surgery, and gynecology. High-quality studies of abdominal and vaginal hysterectomy patients have consistently found a decrease in length of stay with no difference in readmission or postoperative complication rates.9 One study also found that an enhanced recovery program was associated with decreased nursing time required for patient care.10

For ovarian cancer patients, enhanced recovery is associated with decreased length of stay, decreased time to return of bowel function, and improved quality of life. It also reduces costs, by $257-$697 per vaginal hysterectomy patient and $5,410-$7,600 per ovarian cancer patient.11

Enhanced recovery protocols are safe, evidence based, and cost saving, and they are increasingly being adopted as clinicians and health systems become aware of their benefits.

References

1. Br J Anaesth. 1997 May;78(5):606-17.

2. Cochrane Database Syst Rev. 2003 Oct 20;(4):CD004423.

3. Anesthesiology. 1999 Mar;90(3):896-905.

4. J Am Coll Surg. 2012 Jan;214(1):68-80.

5. Clin Nutr. 1998 Apr;17(2):65-71.

6. Cochrane Database Syst Rev. 2007 Oct 17;(4):CD004508.

7. Obstet Gynecol. 1998 Jul;92(1):94-7.

8. Br J Surg. 2009 Apr;96(4):331-41.

9. Obstet Gynecol. 2013 Aug;122(2 Pt 1):319-28.

10. Qual Saf Health Care. 2009 Jun;18(3):236-40.

11. Gynecol Oncol. 2008 Feb;108(2):282-6.
 

Dr. Gehrig is professor and director of gynecologic oncology at the University of North Carolina at Chapel Hill. Dr. Barber is a third-year fellow in gynecologic oncology at the university. They reported having no financial disclosures relevant to this column. Email them at [email protected].


FDA approves atezolizumab for advanced NSCLC

Article Type
Changed
Fri, 01/04/2019 - 13:25

 



The Food and Drug Administration has approved the programmed death-ligand 1 (PD-L1) blocking antibody atezolizumab for the treatment of patients with metastatic non–small cell lung cancer (NSCLC) whose disease has progressed during or following platinum-containing chemotherapy.

The FDA previously approved atezolizumab (Tecentriq) for the treatment of locally advanced or metastatic urothelial carcinoma that has progressed after platinum-containing chemotherapy.



Approval for treatment of NSCLC was based on results from the phase III OAK and phase II POPLAR trials that enrolled a total of 1,137 patients with NSCLC whose disease had progressed on platinum-containing chemotherapy. In OAK, median overall survival for patients assigned to atezolizumab was 13.8 months, compared with 9.6 months for patients assigned to docetaxel, as recently reported at the European Society for Medical Oncology Congress.

In POPLAR, overall survival was 12.6 months for patients receiving atezolizumab versus 9.7 months for those assigned to docetaxel, as reported at the European Cancer Congress in 2015.

The most common (greater than or equal to 20%) adverse reactions in patients treated with atezolizumab were fatigue, decreased appetite, dyspnea, cough, nausea, musculoskeletal pain, and constipation, according to the FDA website. The most common (greater than or equal to 2%) grade 3-4 adverse events in patients treated with atezolizumab were dyspnea, pneumonia, hypoxia, hyponatremia, fatigue, anemia, musculoskeletal pain, AST increase, ALT increase, dysphagia, and arthralgia. Clinically significant immune-related adverse events for patients receiving atezolizumab have included pneumonitis, hepatitis, colitis, and thyroid disease.

The recommended dose is 1,200 mg administered as an intravenous infusion over 60 minutes every 3 weeks until disease progression or unacceptable toxicity.

Patients with EGFR or ALK genomic tumor aberrations should not receive atezolizumab before having disease progression on FDA-approved therapy for these aberrations, the FDA said.

Full prescribing information is available on the FDA website.


Psychological stress did not harm IVF outcomes

Article Type
Changed
Fri, 01/18/2019 - 16:17

 

Self-reported psychological distress and biological markers of stress did not affect the chances of pregnancy after in vitro fertilization, according to a single-center prospective cohort study of 186 women.


Patients with infertility reported significantly worse depressive symptoms at baseline compared with oocyte donors, with average Beck Depression Inventory scores of 11, compared with 2.3 (P less than .01). This difference persisted at the time of vaginal oocyte retrieval. First-time and third-time IVF recipients tended to have similar Beck Depression Inventory scores until pregnancy testing, when the average scores of third-time recipients increased by about five additional points.

Scores on the daily stress questionnaire echoed these findings. Patients who were receiving IVF reported significantly more stress than did oocyte donors, with the highest increase at the time of the pregnancy test among patients with previous IVF failures. However, both recipients and donors reported mild to moderate anxiety based on the State-Trait Anxiety Inventory, Dr. Costantini-Ferrando said.

All groups had low interleukin-6 levels when starting IVF, but donors had the highest levels of both ACTH and cortisol. Levels of these hormones dropped over time, and then rose at the time of pregnancy testing. That trend suggests that IVF was an acute “procedural” stressor for both recipients and donors, but that the magnitude of the stress response lessened somewhat over time with repeated exposure to the stressor, Dr. Costantini-Ferrando noted.

“The complexity of this interaction may explain why it is difficult to elucidate the true impact of stress on IVF outcome,” she said.

Ultimately, these findings support a shift in focus, she added. Instead of asking if stress decreases reproductive potential, clinicians and researchers can instead explore how to address psychological distress to maximize reproductive potential. “This is important to keep patients engaged in infertility treatment and facilitate the difficult process of decision making during treatment,” she added.

The paper won a prize from the ASRM Mental Health Professional Group.

The National Institutes of Health provided funding for the study. Dr. Costantini-Ferrando reported having no relevant financial disclosures.

Vitals

 

Key clinical point: Psychological distress does not appear to impact pregnancy rates after in vitro fertilization.

Major finding: Rates of pregnancy were 36% among first-time IVF recipients and 30% among third-time IVF recipients, and did not correlate with self-reported depressive symptoms, stress, anxiety, or serum levels of adrenocorticotropic hormone, cortisol, or interleukin-6.

Data source: A single-center prospective study of 186 IVF patients.

Disclosures: The National Institutes of Health provided funding for the study. Dr. Costantini-Ferrando reported having no financial disclosures.

Use of 2D bar coding with vaccines may be the future in pediatric practice

Article Type
Changed
Fri, 01/18/2019 - 16:17

 

ATLANTA – Since the first bar coded consumer product, a pack of gum, was scanned in June 1974, bar codes quickly came into widespread use but changed little until 2D bar codes arrived toward the end of the last century. Today, the increasing use of 2D bar code technology with vaccines offers practices the potential for greater accuracy and efficiency with vaccine administration and data entry – if they have the resources to take the plunge.

An overview of 2D bar code use with vaccines, presented at a conference sponsored by the Centers for Disease Control and Prevention, provided a glimpse into both the types of changes practices might see with adoption of the technology and the way some clinics have made the transition.

Courtesy National Center for Immunization and Respiratory Diseases


Ken Gerlach, MPH, of the Immunization Services Division at the CDC in Atlanta, outlined the history of bar code use in immunizations, starting with a November 1999 Institute of Medicine report that identified the contribution of human error to patient harm and led the Food and Drug Administration to begin requiring linear bar codes on pharmaceutical unit-of-use products to reduce errors.

Then, a meeting organized by the American Academy of Pediatrics in January 2009 with the FDA, CDC, vaccine manufacturers, and other stakeholders led to a bar code rule change by the FDA in August 2011 that allowed alternatives to the traditional linear bar codes on vaccine vials and syringes.

“They essentially indicated to the pharmaceutical companies that it’s okay to add 2D bar codes, and this is essentially the point where things began to take off,” Mr. Gerlach explained. Until then, there had been no 2D bar codes on vaccines, but today the majority of vaccine products have them, as do all Vaccine Information Statements. In addition to the standard information included on traditional bar codes – Global Trade Item Number (GTIN), lot and serial numbers, and the expiration date – 2D bar codes also can include most relevant patient information that would go into the EMR except the injection site and immunization route. But a practice cannot simply switch to scanning 2D bar codes without ensuring that its EMR system is configured to accept the scanned data.
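To make the data structure concrete: vaccine 2D bar codes are typically GS1 DataMatrix symbols whose payload is a string of GS1 application identifiers (AIs) followed by their values. The sketch below is a minimal, hypothetical Python decoder, assuming only the AIs commonly found on vaccine products (01 = GTIN, 17 = expiration date as YYMMDD, 10 = lot, 21 = serial); the example GTIN and lot number are made up for illustration.

```python
# Minimal sketch of a GS1 DataMatrix payload decoder for vaccine bar codes.
# Fixed-length AIs (01, 17) are read by length; variable-length AIs (10, 21)
# run until a group-separator character (ASCII 29) or the end of the string.
GS = "\x1d"  # FNC1/group separator as it appears in scanned data

FIXED_LEN = {"01": 14, "17": 6}  # AI -> field length
VARIABLE = {"10", "21"}          # variable-length AIs

def parse_gs1(payload: str) -> dict:
    """Return a dict mapping each AI to its value."""
    fields, i = {}, 0
    while i < len(payload):
        ai = payload[i:i + 2]
        i += 2
        if ai in FIXED_LEN:
            n = FIXED_LEN[ai]
            fields[ai] = payload[i:i + n]
            i += n
        elif ai in VARIABLE:
            end = payload.find(GS, i)
            end = len(payload) if end == -1 else end
            fields[ai] = payload[i:end]
            i = end + 1  # skip the separator
        else:
            raise ValueError(f"unrecognized AI {ai!r} at position {i - 2}")
    return fields

# Hypothetical scan: GTIN 00349281111111, expiry 2025-10-31, lot A12345
scan = "010034928111111117251031" + "10A12345"
print(parse_gs1(scan))
# → {'01': '00349281111111', '17': '251031', '10': 'A12345'}
```

This is exactly the mapping an EMR interface must perform before the scanned lot number and expiration date can populate the patient’s record – which is why, as noted above, the EMR must be configured for 2D scanning first.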

Mr. Gerlach described a three-part project by the CDC, running from 2011 through 2017, that assessed the impact of 2D bar coding on vaccination data quality and work flow, facilitated the adoption of 2D bar code scanning in health care practices, and then assessed the potential for expanding 2D bar code use in a large health care system. The first part of the project, which ran from 2011 to 2014, involved two vaccine manufacturers and 217 health care practices with more than 1.4 million de-identified vaccination records, 18.1% of which had been 2D bar coded.

Analysis of data quality from that pilot revealed an 8% increase in the correctness of lot numbers and 11% increase for expiration dates, with a time savings of 3.4 seconds per vaccine administration. Among the 116 staff users who completed surveys, 86% agreed that 2D bar coding improves accuracy and completeness, and 60% agreed it was easy to integrate the bar coding into their usual data recording process.

The pilot revealed challenges as well, however: not all individual units of vaccine were 2D bar coded, users did not always scan the bar codes consistently, and some bar codes were difficult to read, such as one that was brown and wouldn’t scan. Another obstacle was that the unit of use and the unit of sale carried different lot numbers for 10% of the vaccines. Further, because inventory management typically tracks the unit of sale, it does not always align well with scanning at the unit of use.
 

Clinicians’ beliefs and attitudes toward 2D bar coding

As more practices consider adopting the technology, buy-in will be important. At the conference, Sharon Humiston, MD, and Jill Hernandez, MPH, of Children’s Mercy Hospital in Kansas City, Mo., shared the findings of an online questionnaire about 2D bar coding and practices’ current systems for vaccine inventory and recording patient immunization information. The researchers distributed the questionnaire link to various AAP sections and committees in listservs and emails. Those eligible to complete the 15-minute survey were primary care personnel who used EMRs but not 2D bar code scanning for vaccines. They also needed to be key decision makers in the process of purchasing technology for the practice, and their practice needed to be enrolled in the Vaccines for Children program.

Among the 77 respondents who met all the inclusion criteria (61% of all who started the survey), 1 in 5 were private practices with one or two physicians, just over a third (36%) were private practices with more than two physicians, and a quarter were multispecialty group practices. Overall, respondents administered an average of 116 doses of DTaP and 50 doses of Tdap each month.

Protocols for immunization management varied considerably across the respondents. For recording vaccine information, 49% reported that an administrator pre-entered it into an EMR, but 43% reported that staff manually enter it into an EMR. About 55% of practices entered the information before vaccine administration, and 42% entered it afterward. Although 57% of respondents’ practices upload the vaccination information directly from the EMR to their state’s Immunization Information System (IIS), 30% must enter it both into the EMR and into the state IIS separately, and 11% don’t enter it into a state IIS.

More than half (56%) of the respondents were extremely interested in having a bar code scanner system, and 31% were moderately to strongly interested, rating a 6 to 9 on a scale of 1 to 10. If provided evidence that 2D bar codes reduced errors in vaccine documentation, 56% of respondents said it would greatly increase their interest, and 32% said it would somewhat increase it. Only 23% said their interest would greatly increase if the bar code technology allowed the vaccine information statement to be scanned into EMRs.

Nearly all the respondents agreed that 2D bar code scanning technology would improve efficiency and accuracy of entering vaccine information into medical records and tracking vaccine inventory. Further, 81% believed it would reduce medical malpractice liability, and 85% believed it would reduce risk of harm to patients. However, 23% thought bar code technology would disrupt office work flow, and a quarter believed the technology’s costs would exceed its benefits.

Despite the strong interest overall, respondents reported a number of barriers to adopting 2D bar code technology. The greatest barrier, reported by more than 70%, was the upfront cost of purchasing software for the EMR interface, followed by the cost of the bar code scanners. Other barriers, reported by 25%-45% of respondents, were the need for staff training, the need to service and maintain electronics for the technology, and the purchase of additional computers for scanner sites. If a bar code system cost less than $5,000, then 80% of the respondents would definitely or probably adopt such a system. Few would adopt it if the system cost $10,000 or more, but 42% probably would if it cost between $5,000 and $9,999. Even this small survey of self-selected volunteers, however, suggested strong interest in using 2D bar code technology for vaccines – although initial costs for a system presented a significant barrier to most practices.
 

 

 

One influenza vaccine clinic’s experience

Interest based on hypothetical questions is one thing. The process of actually implementing a 2D bar code scanning system into a health care center is another. In a separate presentation, Jane Glaser, MSN, RN, executive director of Campbell County Public Health in Gillette, Wyo., reviewed how such a system was implemented for mass influenza vaccination.

Campbell County, in the northeast corner of Wyoming, covers more than 4,800 square miles, has a population base of nearly 50,000 people, and also serves individuals from Montana, South Dakota, and North Dakota. Although the community works around the clock because of the oil, mining, and farming industries, the mass flu clinic is open 7 a.m. to 7 p.m., during which it provides an estimated 700 to 1,500 flu vaccines daily. The staff comprises 13 public health nurses, 5 administrative assistants, and 3-4 community volunteers.

After 20 years of using an IIS, the clinic’s leadership decided to begin using 2D bar code scanners in October 2011 after seeing them demonstrated at a state immunization conference. The goals in changing systems were to improve clinic flow, decrease registration time, and decrease overtime due to data entry. The new work flow went as follows: Those with Wyoming driver licenses or state ID cards have the linear bar code on their ID scanned into the immunization registry, which automatically populates the patient’s record. Then the staff member enters the vaccine information directly into the IIS registry in real time after the client receives the vaccine.

Ms. Glaser described a number of improvements that resulted from use of the bar code scanning system, starting with reduced registration time and improved clinic flow. The clinic also found that bar code scanning reduced manual entry errors and made it more efficient to assess vaccination status and needed vaccines. Entering data in real time at the point of care reduced time spent on data entry later, leading to a decrease in overtime and subsequent cost savings.

For providers and practices interested in learning more about 2D bar coding, the CDC offers a current list of 2D bar coded vaccines, data from the pilot program, training materials, and AAP guidance about 2D bar code use.


Article Source

EXPERT ANALYSIS FROM AAP 16


 

Key clinical point: 2D bar coding with vaccines offers benefits and challenges.

Major finding: 56% of pediatric practice personnel are very interested in 2D bar coding use with immunizations, but 70% named cost a major barrier.

Data source: A CDC study, an online questionnaire, and experience in a Wyoming flu clinic.

Disclosures: None of three presentations noted external funding, and all researchers reported no financial relationships with companies that profit from bar code scanning technology. Deloitte Consulting was involved in the three-part project conducted by the CDC.

FDA grants accelerated approval to olaratumab for soft tissue sarcoma

Article Type
Changed
Fri, 01/04/2019 - 13:25

 

The Food and Drug Administration has granted accelerated approval to olaratumab with doxorubicin for the treatment of adult patients with certain types of soft tissue sarcoma.

“This is the first new therapy approved by the FDA for the initial treatment of soft tissue sarcoma since doxorubicin’s approval more than 40 years ago,” Richard Pazdur, MD, director of the office of hematology and oncology products in the FDA Center for Drug Evaluation and Research and acting director of the FDA Oncology Center of Excellence, said in a statement.

Olaratumab, a platelet-derived growth factor (PDGF) receptor-alpha blocking antibody, is approved for use with doxorubicin for the treatment of patients with soft tissue sarcoma who cannot be cured with radiation or surgery and who have a type of soft tissue sarcoma for which an anthracycline is an appropriate treatment, according to the FDA announcement.

Approval was based on a statistically significant improvement in survival in a randomized trial involving 133 patients with more than 25 different subtypes of metastatic soft tissue sarcoma. Patients received olaratumab (Lartruvo) with doxorubicin or doxorubicin alone. Median survival was 26.5 months for patients who received both drugs, compared with 14.7 months for patients who received doxorubicin alone. Median progression-free survival was 8.2 months for patients who received both drugs and 4.4 months for patients who received doxorubicin alone.

The most common adverse reactions from olaratumab were nausea, fatigue, neutropenia, musculoskeletal pain, mucositis, alopecia, vomiting, diarrhea, decreased appetite, abdominal pain, neuropathy, and headache. There are serious risks of infusion-related reactions and embryo-fetal harm, the FDA warned.

Olaratumab was approved under the FDA’s accelerated approval program after receiving fast track designation, breakthrough therapy designation, and a priority review status. The drug also received an orphan drug designation. Drug maker Eli Lilly is currently conducting a larger study of olaratumab across multiple subtypes of soft tissue sarcoma.


Uninsured rate lowest in Massachusetts

Article Type
Changed
Thu, 03/28/2019 - 15:01

 

Massachusetts had the nation’s lowest uninsured rate in 2015, and Texas had the highest, according to the personal finance website WalletHub.

Massachusetts’ uninsured rate of 2.8% was followed by Vermont at 3.8%, Hawaii at 4%, Minnesota at 4.5%, and Iowa at 5%, WalletHub reported.

The other end of the scale offered more proof that everything is bigger in Texas: The state’s 17.1% of population without insurance was the country’s highest. Alaska ranked 49th with a rate of 14.9%, preceded by Oklahoma at 13.9%, Georgia at 13.6%, and Florida at 13.3%, according to the U.S. Census Bureau data used in the analysis.

Nevada, which was 44th overall in 2015, had the largest reduction (–10.3%) in its uninsured rate from 2010 to 2015. Oregon had the next-largest drop (–10.1%), and Massachusetts had the smallest decrease at –1.6%, meaning that no state saw an increase over the 5-year period, the WalletHub report showed.

A quick run through some subgroups shows that Vermont had the lowest percentage of uninsured children (1%) and Alaska the highest (10.6%). Massachusetts was lowest for whites (2.2%) and Hispanics (5.3%), while Mississippi was highest for both (10.9% and 37.6%, respectively). Hawaii had the lowest rate for blacks (3.8%), and Montana had the highest (17.4%), WalletHub said.


Picking at a Problem

Article Type
Changed
Thu, 04/12/2018 - 10:30

An 80-year-old woman bitterly complains of itching and discomfort on the left side of her face that began several weeks ago. Her primary care provider initially diagnosed contact dermatitis and prescribed a class 6 topical steroid cream. This helped a bit with the itching but had no effect on appearance.

When a friend suggested the itch might be the result of a bug bite, the patient went directly to the emergency department and was given a two-week taper of prednisone (40 mg/d for a week, then 20 mg/d for a week). This eased the redness, but the itching returned as soon as the course was finished. Finally, she was referred to dermatology.

The patient, who has been a widow for many years, was recently “forced out” of her home of more than 50 years and into an assisted living facility; her family seldom visits, so her friend accompanies her to your office today.

EXAMINATION
Extensive honey-colored crusting on an erythematous base is confined to the left side of the patient’s face. No nodes are palpable in the area. Her friend confirms that she has been picking at the skin.

What is the diagnosis?

 

 

DISCUSSION
This condition is impetigo—in this case, “non-bullous” impetigo, a rash that almost always begins with a breach in normal skin integrity. The break opens the skin to superficial invasion by staph and strep organisms, most of which emanate from the nasal passages. In this case, the patient had been picking at a seborrheic keratosis (an extremely common lesion in someone her age) on her face.

The inclination to scratch and pick—and the inability to manage nasal secretions—make children the most likely candidates for impetigo. It is especially common in those with atopic dermatitis, who not only have poor barrier function (which manifests as eczema) but also constant nasal drainage from seasonal allergies.

The differential includes herpes simplex or zoster, eczema, and contact dermatitis. But the location of the rash, the honey-colored crust on an erythematous base, and the history of skin breaches all point directly to impetigo. Lymph nodes are often palpable in the drainage area; when present, they corroborate the diagnosis.

Luckily, this type of impetigo is relatively easy to treat: The patient was advised to wash the area with soap and water and apply mupirocin ointment three times a day. She was also prescribed cephalexin (500 mg tid for a week), which cleared the condition aside from a faint bit of postinflammatory pinkness.

TAKE-HOME LEARNING POINTS
• Impetigo is a superficial bacterial infection, usually on or near the face, caused by staph and strep organisms that seed the area from the nasal passages.
• These organisms generally require a break in the skin to gain entrance, often caused by picking or scratching.
• Honey-colored crust on an erythematous base, with or without enlarged local nodes, helps to confirm the diagnosis.
• Symptoms of impetigo include itching and mild discomfort but not pain.
• Treatment can be as simple as cleaning with soap and water and applying topical mupirocin ointment. A short course of oral antibiotics may be needed to speed clearance.


Ozanimod has lasting effect on ulcerative colitis

Article Type
Changed
Fri, 01/18/2019 - 16:17

 

Ozanimod, an oral agent that selectively modulates sphingosine 1-phosphate (S1P) receptor subtypes 1 and 5, has a lasting effect on symptoms of ulcerative colitis, according to results from an open-label extension study.

The study extends the phase II TOUCHSTONE trial, in which patients with ulcerative colitis showed significant clinical improvement out to 32 weeks. The current study showed those improvements lasting out to at least 1 year, with about 80% of patients staying on the drug at the end of the study.

With other drug regimens, “the loss rates are at least 50% of the patients, so this is a remarkable level of durability over time,” William Sandborn, MD, chief of gastroenterology at the University of California at San Diego, said at the annual meeting of the American College of Gastroenterology. “With biologics, if you follow patients out for a year or 2, you see a fair amount of loss of response and some of that probably has to do with the formation of antidrug antibodies and immunogenicity,” Dr. Sandborn added.

Dr. William J. Sandborn


In the original TOUCHSTONE study, 197 patients were randomized to placebo, ozanimod 0.5 mg, or ozanimod 1.0 mg, and followed out to week 32. Twenty-one percent of those in the 1.0-mg group achieved clinical remission, compared with 26% in the 0.5-mg group and 6% of those receiving placebo. Clinical response rates were 51%, 35%, and 20%, respectively.

In the open-label phase, patients from all arms who did not respond to treatment after the induction phase, relapsed during the maintenance phase, or completed the maintenance phase (170 patients in total) received ozanimod at a dose of 1.0 mg. At the data cut-off, patients in the extension study had been taking ozanimod for at least 1 year.

At the start of the extension study, the partial Mayo Scores (pMS) for patients on placebo, ozanimod 0.5 mg, and ozanimod 1.0 mg were 4.6, 4.5, and 3.3, respectively. All groups showed improvement in pMS by week 44 (1.7, 1.7, and 1.9, respectively).

At week 44, 90.9% of patients had little or no active disease (physician global assessment 0 or 1), 98.4% had little or no blood in their stools, and 84.7% had no blood in their stools.

Adverse events with a frequency higher than 2% included ulcerative colitis flare (5.9%), upper respiratory tract infection (4.1%), anemia (3.5%), nasopharyngitis (3.5%), back pain (2.9%), arthralgia (2.4%), and headache (2.4%). The researchers noted transient elevations of alanine aminotransferase (ALT) or aspartate aminotransferase (AST) that reversed during ongoing treatment; 2.4% of patients had ALT and AST levels higher than three times the upper limit of normal, and all were asymptomatic.

Serious adverse events that occurred in two or more patients included anemia (1.2%) and ulcerative colitis flare (2.4%).

“This is a promising oral product with a similar mechanism of action to other lymphocyte trafficking agents,” said Stephen Hanauer, MD, medical director of the digestive health center at Northwestern University, Chicago, who attended the session.

Ozanimod and other lymphocyte trafficking agents may offer a slightly different profile than some of the other drug classes, such as the anti–tumor necrosis factor agents, according to Dr. Hanauer, because the agents don’t affect lymphocytes already in the tissues. On the other hand, once the drug has acted, its effect may linger. “The time to effect may be a little slower, but the long-term effect seems to be as good or better as other mechanisms of action.”

The drug’s real place could be in early disease, Dr. Hanauer said. “If this is truly an effective and safe agent, the real positioning should be earlier in the disease, before patients are exposed to steroids and other immune suppressants, or biologics that have an infection risk.”

Celgene funded the study. Dr. Sandborn has received funding from Receptos and Celgene and consulted for both companies. Dr. Hanauer has consulted with Receptos, Celgene, Pfizer, Jansen, and AbbVie.


 


Article Source

AT ACG 2016

Vitals

 

Key clinical point: Open-label study shows that short-term ozanimod gains are maintained.

Major finding: Ozanimod maintains efficacy in ulcerative colitis out to 1 year, with 90% of patients having little or no evidence of active disease.

Data source: Open-label extension study following a phase II clinical trial.

Disclosures: Celgene funded the study. Dr. Sandborn has received funding from Receptos and Celgene and consulted for both companies. Dr. Hanauer has consulted with Receptos, Celgene, Pfizer, Jansen, and AbbVie.

Experts: Fewer opioids, more treatment laws mean nothing without better access to care

Article Type
Changed
Fri, 01/18/2019 - 16:17

 

Pressure on physicians to prescribe fewer opioids could have unintended consequences in the absence of adequate access to treatment, according to experts.

“There is mixed evidence that, when medication-assisted treatment is lacking, there are higher rates of transition from prescription opioids to heroin,” Gary Tsai, MD, said during a presentation at the American Psychiatric Association’s Institute on Psychiatric Services.

Whitney McKnight
Dr. Gary Tsai


“As we constrict our prescribing, we want to make sure that there is ready access to these interventions, so that those who are already dependent on opioids can transition to something safer,” said Dr. Tsai, medical director and science officer of Substance Abuse Prevention and Control, a division of Los Angeles County’s public health department.

Medication-assisted treatment (MAT) uses methadone, buprenorphine, or naltrexone in combination with appropriate behavioral and other psychosocial therapies to help achieve opioid abstinence. Despite MAT’s well-established superiority to either pharmacotherapy or psychosocial interventions alone, the use of MAT has, in some cases, declined. According to the Substance Abuse and Mental Health Services Administration (SAMHSA), MAT was used in 35% of heroin-related treatment admissions in 2002, compared with 28% in 2010.

Reasons for MAT’s difficult path to acceptance are manifold, ranging from lack of certified facilities to administer the medications to misunderstanding about how the medications work, Dr. Tsai said.

A law passed earlier this year and the issuance of a final federal rule that increases the legal patient load that certified MAT providers can treat annually were designed to expand access to MAT. These, however, are only partial solutions, according to Margaret Chaplin, MD, a psychiatrist and program director of Community Mental Health Affiliates in New Britain, Conn.

“Can you imagine if endocrinologists were the only doctors who were certified to prescribe insulin and that each of them was only limited to prescribing to 100 patients?” Dr. Chaplin said in an interview. The final rule raised the number of patients a certified addiction specialist can treat each year from 100 to 275. This might expand access to care, but “it sends a message that either the people with [addiction] don’t deserve treatment or that they don’t have a legitimate illness,” said Dr. Chaplin, who also was a presenter at the meeting.

Viewing people with opioid addiction through a lens of moral failing only compounds the nation’s addiction crisis, Dr. Chaplin believes. “Not to say that a person with a substance use disorder doesn’t have a responsibility to take care of their illness, [but] our [leaders] haven’t been well educated on the scientific evidence that addiction is a brain disease.”

It is true that, until the Comprehensive Addiction and Recovery Act was signed into law over the summer, nurse practitioners and physician assistants could prescribe controlled substances such as acetaminophen/oxycodone but not the far less dangerous – and potentially life-saving – partial opioid agonist buprenorphine. Under the new law, those health care professionals now have the same buprenorphine prescribing rights as physicians.

New legislation does not guarantee access to treatment, however. “Funding for MAT programs varies throughout the states, and the availability of these medications on formularies often determines how readily accessible MAT interventions are,” said Dr. Tsai, who emphasized the role of collaboration in ensuring the laws take hold.

“Addiction specialists comprise a minority of the work force. To scale MAT up, we need to engage other prescribers from other systems, including those in primary care and mental health,” Dr. Tsai said. To wit, the three primary MAT facilities in Los Angeles County offer learning collaboratives with primary care clinicians who want to incorporate these services into their practice, even if they are not certified addiction specialists themselves. This helps increase referrals to the treatment facilities, he explained.

Overcoming resistance to offering MAT ultimately will depend on educating leaders about the costs of not doing so, Dr. Tsai and Dr. Chaplin said.

“Our system has been slow to adopt a disease model of addiction,” Dr. Chaplin said. “Buprenorphine and methadone are life-saving medical treatments that are regulated in ways that you don’t see for any other medical condition.”

SAMHSA currently is requesting comments through Nov. 1, 2016, on what should be required of MAT providers under the new law.

Neither Dr. Tsai nor Dr. Chaplin had any relevant disclosures.




EXPERT ANALYSIS FROM INSTITUTE ON PSYCHIATRIC SERVICES


SAVR for radiation-induced aortic stenosis has high late mortality

Article Type
Changed
Wed, 01/02/2019 - 09:41

 

ROME – Radiation-induced aortic stenosis is associated with markedly worse long-term outcome after surgical aortic valve replacement than when the operation is performed in patients without a history of radiotherapy, Milind Y. Desai, MD, reported at the annual congress of the European Society of Cardiology.

Moreover, the Society of Thoracic Surgeons (STS) score isn’t good at risk-stratifying patients with radiation-induced aortic stenosis who are under consideration for surgical aortic valve replacement (SAVR).

Bruce Jancin/Frontline Medical News
Dr. Milind Desai
“We probably need to develop a new score for these patients,” said Dr. Desai, a cardiologist at the Cleveland Clinic.

Radiation-induced heart disease is a late complication of thoracic radiotherapy. It’s particularly common in patients who got radiation for lymphomas or breast cancer. It can affect any cardiac structure, including the myocardium, pericardium, valves, coronary arteries, and the conduction system.

Aortic stenosis is the most common valvular manifestation, present in roughly 80% of patients with radiation-induced heart disease. At the Cleveland Clinic, the average time from radiotherapy to development of radiation-induced aortic stenosis (RIAS) is about 20 years. The condition is characterized by thickening of the junction between the base of the anterior mitral leaflet and aortic root, known as the aortomitral curtain, Dr. Desai explained.

He presented a retrospective observational cohort study involving 172 patients who underwent SAVR for RIAS and an equal number of SAVR patients with no such history. The groups were matched by age, sex, aortic valve area, and type and timing of SAVR. Of note, the group with RIAS had a mean preoperative STS score of 11, and the control group averaged a similar score of 10.

The key finding: During a mean follow-up of 6 years, the all-cause mortality rate was a hefty 48% in patients with RIAS, compared with just 7% in matched controls. Only about 5% of deaths in the group with RIAS were from recurrent malignancy. The low figure is not surprising given the average 20-year lag between radiotherapy and development of radiation-induced heart disease.

“In our experience, most of these patients develop a recurrent pleural effusion and nasty cardiopulmonary issues that result in their death,” according to Dr. Desai.

In a multivariate Cox proportional hazards analysis, a history of chest radiation therapy was by far the strongest predictor of all-cause mortality, conferring an 8.5-fold increase in risk.

The only other statistically significant predictor of mortality during follow-up in multivariate analysis was a high STS score, with an associated weak albeit statistically significant 1.15-fold increased risk. A total of 30 of 78 (39%) RIAS patients with an STS score below 4 died during follow-up, compared with none of 91 controls.

Thirty-four of 92 (37%) RIAS patients under age 65 died during follow-up, whereas none of 83 control SAVR patients did so.

Having coronary artery bypass surgery or other cardiac surgery at the time of SAVR was not associated with significantly increased risk of mortality compared with solo SAVR.

In-hospital outcomes were consistently worse after SAVR in the RIAS group. Half of the RIAS patients experienced in-hospital atrial fibrillation and 29% developed persistent atrial fibrillation, compared with 30% and 24% of controls. About 22% of RIAS patients were readmitted within 3 months after surgery, as were only 8% of controls. In-hospital mortality occurred in 2% of SAVR patients with RIAS; none of the matched controls did.

Dr. Desai reported having no financial interests relative to this study.
 




AT THE ESC CONGRESS 2016


 

Key clinical point: Mortality is high following surgical aortic valve replacement in patients with radiation-induced severe aortic stenosis.

Major finding: All-cause mortality occurred in 48% of 172 patients with radiation-induced severe aortic stenosis during a mean follow-up of 6 years after surgical aortic valve replacement, compared with just 7% of matched controls.

Data source: This was a retrospective observational study involving 172 closely matched pairs of surgical aortic valve replacement patients.

Disclosures: The presenter reported having no financial conflicts of interest regarding this study.