ASCO: Patients with advanced cancer should receive palliative care within 8 weeks of diagnosis

Patients with advanced cancer should receive dedicated palliative care services early in the disease course, concurrently with active treatment, according to the American Society of Clinical Oncology’s new guidelines on the integration of palliative care into standard oncology care.

Ideally, patients should be referred to interdisciplinary palliative care teams within 8 weeks of cancer diagnosis, and palliative care should be available in both inpatient and outpatient settings, ASCO recommended.

The guidelines, which updated and expanded the 2012 ASCO provisional clinical opinion, were developed by a multidisciplinary expert panel that systematically reviewed phase III randomized controlled trials, secondary analyses of those trials, and meta-analyses that were published between March 2010 and January 2016.

According to the panel, essential components of palliative care include:
 

• Rapport and relationship building with patient and family caregivers.

• Symptom, distress, and functional status management.

• Exploration of understanding and education about illness and prognosis.

• Clarification of treatment goals.

• Assessment and support of coping needs.

• Assistance with medical decision making.

• Coordination with other care providers.

• Provision of referrals to other care providers as indicated.

The panel makes the case that not only does palliative care improve care for patients and families, it also likely reduces the total cost of care, often substantially. However, “race, poverty and low socioeconomic and/or immigration status are determinants of barriers to palliative care,” wrote the expert panel, which was cochaired by Betty Ferrell, PhD, of the City of Hope Medical Center, Duarte, Calif., and Thomas Smith, MD, of the Sidney Kimmel Comprehensive Cancer Center in Baltimore.

While it was not “within the scope of this guideline to examine specific factors contributing to disparities,” the panel urged health care providers to be aware of the paucity of health disparities research on palliative care and to “strive to deliver the highest level of cancer care to these vulnerable populations.”

Read the full guidelines here.

FROM THE JOURNAL OF CLINICAL ONCOLOGY


Results puzzling for embolic protection during TAVR

The largest randomized clinical trial to assess the safety and efficacy of cerebral embolic protection systems during transcatheter aortic valve replacement yielded puzzling and somewhat contradictory results, according to a report presented at the Transcatheter Cardiovascular Therapeutics annual meeting and published simultaneously in the Journal of the American College of Cardiology.

In this industry-sponsored study involving 363 elderly patients (mean age, 83.4 years) with severe aortic stenosis, virtually every device trapped particulate debris as intended, the mean volume of new lesions in the protected areas of the brain was reduced by 42%, and the number and volume of new lesions correlated with neurocognitive outcomes at 30 days.

However, neither the reduction in lesion volume nor the improvement in neurocognitive function reached statistical significance.

In addition, “the sample size was clearly too low to assess clinical outcomes, and in retrospect, was also too low to evaluate follow-up MRI findings or neurocognitive outcomes.” Nevertheless, the trial “provides reassuring evidence of device safety,” said Samir R. Kapadia, MD, of the Cleveland Clinic (J Am Coll Cardiol. 2016 Nov 1. doi: 10.1016/j.jacc.2016.10.023).

In this prospective study, the investigators assessed patients at 17 medical centers in the United States and 2 in Germany. In addition to being elderly, the study patients were at high risk because of frequent comorbidities, including atrial fibrillation (31.7%) and prior stroke (5.8%).

In all, 121 patients were randomly assigned to undergo TAVR with a cerebral embolic protective device and 119 to TAVR without a protective device. New brain lesions were then assessed via MRI at 2-7 days post procedure, and neurocognitive function was assessed at 30 days.

The remaining 123 patients underwent TAVR but not MRI in a safety arm of the trial.
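
Together, the randomized and safety arms account for the full cohort (the arithmetic here is ours, not the authors’):

\[
121 + 119 + 123 = 363.
\]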

The protection devices were placed “without safety concerns” in most patients. The rate of major adverse events with the device was 7.3%, markedly less than the 18.3% prespecified performance goal for this outcome. Total procedure time was lengthened by only 13 minutes when the device was used, and total fluoroscopy time was increased by only 3 minutes. These findings demonstrate the overall safety of using the device, Dr. Kapadia said.

Debris including thrombus with tissue elements, artery wall particles, calcifications, valve tissue, and foreign materials was retrieved from the filters in 99% of patients.

The mean volume of new cerebral lesions in areas of the brain protected by the device was reduced by 42%, compared with that in patients who underwent TAVR without the protection device. However, this reduction was not statistically significant, so the primary efficacy endpoint of the study was not met.

Similarly, neurocognitive testing at 30 days showed that the volume of new lesions correlated with poorer outcomes. However, the difference in neurocognitive function between the intervention group and the control group did not reach statistical significance.

Several limitations likely contributed to this lack of statistical significance, Dr. Kapadia said.

First, the 5-day “window” for MRI assessment was too long. Both the number and the volume of new lesions rapidly changed over time, which led to marked variance in MRI findings depending on when the images were taken.

In addition, only one TAVR device was available at the time the trial was designed, so the study wasn’t stratified by type of valve device. But several new devices became available during the study, and the study investigators were permitted to use any of them. Both pre- and postimplantation techniques differ among these TAVR devices, but these differences could not be accounted for, given the study design.

Also, certain risk factors for stroke, especially certain findings on baseline MRI, were not understood when the trial was designed, and those factors also were not accounted for, Dr. Kapadia said.

Claret Medical funded the study. Dr. Kapadia reported having no relevant financial disclosures; his associates reported numerous ties to industry sources. The meeting was sponsored by the Cardiovascular Research Foundation.

Don’t abandon cerebral protection devices

From a logical standpoint, a device that collects cerebral embolic material in 99% of cases should prevent ischemic brain injury, yet the findings from this randomized trial don’t appear to support the routine use of such devices. But it would be inappropriate and unfair to close the book on cerebral protection after this chapter.

The authors acknowledge that an MRI “window” of 5 days creates too much heterogeneity in the data, that multiple TAVR devices requiring different implantation techniques further muddy the picture, and that in retrospect the sample size was inadequate and the study was underpowered. In addition, rigorous neurocognitive assessment can be challenging in elderly, recovering patients, and results can depend on the time of day and the patient’s alertness.

Despite the negative findings regarding both primary and secondary endpoints, the data do show the overall safety of embolic protection devices. We are dealing with a potential benefit that cannot be ignored as TAVR shifts to younger and lower-risk patients.
 

Azeem Latib, MD, is in the interventional cardiology unit at San Raffaele Scientific Institute in Milan. Matteo Pagnesi, MD, is in the interventional cardiology unit at EMO-GVM Centro Cuore Columbus in Milan. San Raffaele Scientific Institute has been involved in clinical studies of embolic protection devices made by Claret Medical, Innovative Cardiovascular Solutions, and Keystone Heart. Dr. Latib and Dr. Pagnesi reported having no other relevant financial disclosures. They made these remarks in an editorial accompanying Dr. Kapadia’s report (J Am Coll Cardiol. 2016 Nov 1. doi: 10.1016/j.jacc.2016.10.036).

Vitals

 

Key clinical point: The largest randomized clinical trial to assess the safety and efficacy of cerebral embolic protection systems during TAVR yielded puzzling and contradictory results.

Major finding: Debris including thrombus with tissue elements, artery wall particles, calcifications, valve tissue, and foreign materials was retrieved from the cerebral protection filters in 99% of patients.

Data source: A prospective, international, randomized trial involving 363 elderly patients undergoing TAVR for severe aortic stenosis.

Disclosures: Claret Medical funded the study. Dr. Kapadia reported having no relevant financial disclosures; his associates reported numerous ties to industry sources.

Hospitalizations for opioid poisoning tripled in preschool children


 

From 1997 to 2012, the annual number of hospitalizations for opioid poisoning rose 178% among children aged 1-19 years, according to data from over 13,000 discharge records.

In 2012, there were 2,918 hospitalizations for opioid poisoning among children aged 1-19, compared with 1,049 in 1997, reported Julie R. Gaither, PhD, MPH, RN, and her associates at Yale University in New Haven, Conn. (JAMA Pediatr. 2016 Oct 31. doi: 10.1001/jamapediatrics.2016.2154).

The greatest change occurred among the youngest children, as the number of those aged 1-4 years rose from 133 in 1997 to 421 in 2012 – an increase of 217%. For those aged 15-19 years, the annual number of hospitalizations went from 715 to 2,171 (204%) over that time period, which included a slight drop from 2009 to 2012, according to the investigators, who used data from 13,052 discharges in the Agency for Healthcare Research and Quality’s Kids’ Inpatient Database.
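
These percentage increases follow directly from the raw counts; as a quick arithmetic check (ours, not the investigators’):

\[
\frac{421 - 133}{133} \approx 2.17 \;(217\%), \qquad \frac{2{,}171 - 715}{715} \approx 2.04 \;(204\%).
\]

Equivalently, the 2012 count among 1- to 4-year-olds was \(421/133 \approx 3.2\) times the 1997 count – the tripling cited in the headline – and the overall rise was \((2{,}918 - 1{,}049)/1{,}049 \approx 1.78\), or 178%.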



The increase in hospitalizations for prescription opioid poisoning in children aged 10-14 years was 58% from 1997 to 2012 (rising from 171 to 272), while estimates for 5- to 9-year-olds did not meet the criteria for statistical reliability and were not included in the analysis, Dr. Gaither and her associates said.

The study was supported by grants from the National Institute on Drug Abuse. The investigators did not report any conflicts of interest.

FROM JAMA PEDIATRICS


ECMO patients need less sedation, pain meds than previously reported


 

Patients on extracorporeal membrane oxygenation (ECMO) received relatively low doses of sedatives and analgesics while at a light level of sedation in a single-center prospective study of 32 patients.

In addition, patients rarely required neuromuscular blockade, investigators reported online in the Journal of Critical Care.

This finding contrasts with current guidelines on the management of pain, agitation, and delirium in patients on ECMO. The guidelines are based upon previous research that indicated the need for significant increases in sedative and analgesic doses, as well as the need for neuromuscular blockade, wrote Jeremy R. DeGrado, PharmD, of the department of pharmacy at Brigham and Women’s Hospital, Boston, and his colleagues (J Crit Care. 2016 Aug 10;37:1-6. doi: 10.1016/j.jcrc.2016.07.020).

“Patients required significantly lower doses of opioids and sedatives than previously reported in the literature and did not demonstrate a need for increasing doses throughout the study period,” the investigators said. “Continuous infusions of opioids were utilized on most ECMO days, but continuous infusions of benzodiazepines were used on less than half of all ECMO days.”

Their 2-year, prospective, observational study assessed 32 adult intensive care unit patients on ECMO support for more than 48 hours. A total of 15 patients received VA (venoarterial) ECMO and 17 received VV (venovenous) ECMO. Patients received a median daily dose of benzodiazepines (midazolam equivalents) of 24 mg and a median daily dose of opioids (fentanyl equivalents) of 3,875 mcg.

The primary indication for VA ECMO was cardiogenic shock, while VV ECMO was mainly used as a bridge to lung transplant or in patients with severe acute respiratory distress syndrome. The researchers evaluated a total of 475 ECMO days: 110 VA ECMO and 365 VV ECMO.

On average, patients were sedated to Richmond Agitation Sedation Scale scores between 0 and −1. Across all 475 ECMO days, patients were treated with continuous infusions of opioids (on 85% of ECMO days), benzodiazepines (42%), propofol (20%), dexmedetomidine (7%), and neuromuscular blocking agents (13%).

Overall, patients who received VV ECMO had a higher median dose of opioids, and trended toward a lower dose of benzodiazepines, than those who received VA ECMO, Dr. DeGrado and his associates reported.

Patients in the VA arm, compared with those in the VV arm, more frequently received a continuous infusion of an opioid (96% vs. 82% of days) and of a benzodiazepine (58% vs. 37% of days); these differences were statistically significant.

Adjunctive therapies, including antipsychotics and clonidine, were administered frequently, according to the report.

“We did not observe an increase in dose requirement over time during ECMO support, possibly due to a multi-modal pharmacologic approach. Overall, patients were not deeply sedated and rarely required neuromuscular blockade. The hypothesis that patients on ECMO require high doses of sedatives and analgesics should be further investigated,” the researchers concluded.

The authors reported that they had no disclosures.

FROM JOURNAL OF CRITICAL CARE

Vitals

 

Key clinical point: Patients on ECMO required relatively low doses of sedatives and analgesics and rarely required neuromuscular blockade.

Major finding: Patients required lower doses of opioids and sedatives than previously reported and did not need increasing doses.

Data source: A single-institution, prospective study of 32 patients on extracorporeal membrane oxygenation.

Disclosures: Dr. DeGrado reported having no financial disclosures.

Selected liver-transplant patients thrive off immunosuppression



MONTREAL – Three-fifths of pediatric liver-transplant recipients who were doing well enough to attempt weaning from their immunosuppression regimen succeeded in getting off immunosuppression and staying off for more than a year. In the process, they also significantly improved their health-related quality of life.


“Health-related quality of life domains associated with social interactions, worry, and medications improved” in pediatric liver recipients who had undergone immunosuppression withdrawal, Saeed Mohammad, MD, said at the World Congress of Pediatric Gastroenterology, Hepatology and Nutrition.


Patients who succeeded in staying off immunosuppressant drugs for at least 2 years after they first began ratcheting down their regimen showed better quality of life scores compared with their scores at baseline, and also compared with the scores of other pediatric liver transplant patients who unsuccessfully tried coming off immunosuppression.


Not every pediatric liver transplant patient should attempt withdrawing immunosuppression, cautioned Dr. Mohammad, a pediatric gastroenterologist at Northwestern University in Chicago. “To be successful withdrawal of immunosuppression needs to be in selected patients; not every patient is a good candidate.”


The Immunosuppression Withdrawal for Stable Pediatric Liver Transplant Recipients (iWITH) study ran at 11 U.S. centers and one center in Toronto from October 2012 through June 2014. Pediatric liver transplant recipients were eligible to start a 9- to 10-month graduated withdrawal from their immunosuppression regimen if they met several criteria of stability, including no rejection episode during at least the prior 12 months, normal laboratory test results, no autoimmune disease, and no problems detected on liver biopsy. The prospective study enrolled 88 patients with an average age of 10 years. Patients underwent comprehensive examinations and laboratory testing at baseline and again several times during the subsequent 2 years, including assessment of several quality of life measures.


During follow-up, 35 of the 88 patients (40%) developed symptoms of rejection and had to go back on immunosuppression. Most of these patients developed rejection symptoms early during immunosuppression weaning, but a few failed later, including one patient who failed 22 months after starting immunosuppression withdrawal, Dr. Mohammad said. Researchers from the iWITH study first reported these results at the American Transplant Congress in June 2016.
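
For context, the headline “three-fifths” figure is simply the complement of this 40% failure rate (the arithmetic is ours, not the study’s):

\[
88 - 35 = 53, \qquad \frac{53}{88} \approx 0.60,
\]

that is, roughly three-fifths of the weaned cohort remained off immunosuppression.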


The quality of life findings reported by Dr. Mohammad came from assessments at baseline, after 12 months, and after 24 months, and included 30 of the patients who resumed immunosuppression and 48 patients who remained off immunosuppression for 2 years. All 78 of these patients had relatively robust quality of life profiles at baseline: their scores on both the physical and social subscales, as well as their total scores, were significantly superior to the average scores of a large number of primarily U.S. pediatric liver transplant patients in the SPLIT database. Dr. Mohammad described the patients who attempted immunosuppression discontinuation as the “crème de la crème” of pediatric liver transplant patients in terms of their clinical status.


Analysis of scores after 2 years, compared with baseline, showed statistically significant improvements among patients who stayed off immunosuppression in the domains of social function, treatment attitudes and compliance, communication, and worry. Patients who stayed off immunosuppression improved on several scores from baseline to 2 years, while patients who went back onto immunosuppression had, on average, a small deterioration in their scores.

Dr. Mohammad had no disclosures.

 

On Twitter @mitchelzoler

AT WCPGHAN 2016

Vitals

Key clinical point: Selected pediatric liver-transplant patients who successfully weaned off immunosuppression responded with significantly improved quality of life scores.


Major finding: Patient and parent treatment satisfaction improved by 6-7 points when patients stopped immunosuppression and fell by 2-3 points when they did not.


Data source: iWITH, a multicenter study with 88 enrolled patients.


Disclosures: Dr. Mohammad had no disclosures.
 

Resorbable scaffold appears safe, effective in diabetes patients


 

An everolimus-eluting resorbable scaffold appeared to be safe and effective for percutaneous coronary intervention (PCI) in patients with diabetes and noncomplex coronary lesions, according to a study presented at the Transcatheter Cardiovascular Therapeutics annual meeting and published simultaneously in the Journal of the American College of Cardiology: Cardiovascular Interventions.

Patients with diabetes constitute an important and increasingly prevalent subgroup of PCI patients, who are at high risk of adverse clinical and angiographic outcomes such as MI, stent thrombosis, restenosis, and death. This is thought to be due to diabetic patients’ greater level of vascular inflammation and tendency toward a prothrombotic state and more complex angiographic features, said Dean J. Kereiakes, MD, of the Christ Hospital Heart and Vascular Center, Lindner Research Center, Cincinnati.

Dr. Kereiakes and his associates performed the prespecified formal substudy, designed in conjunction with the U.S. Food and Drug Administration, to support a diabetic indication for the resorbable scaffold. It was funded by Abbott Vascular, maker of the device. The study involved 754 patients who participated in three clinical trials and one device registry assessing 1-year outcomes. Even though this represents the largest study to date of patients with diabetes, it “remained underpowered to precisely evaluate low-frequency events such as scaffold thrombosis,” the coauthors noted (JACC Cardiovasc Interv. 2016 Oct 31. doi: 10.1016/j.jcin.2016.10.019).

The substudy participants all received at least one resorbable scaffold in at least one target lesion. A total of 27.3% were insulin dependent and nearly 60% had HbA1c levels exceeding 7.0%. Notably, 18% of all the treated lesions in this analysis were less than 2.25 mm in diameter as assessed by quantitative coronary angiography, and approximately 60% had moderately to severely complex morphology.

The primary endpoint – the rate of target-lesion failure at 1-year follow-up – was 8.3%, which was well below the prespecified performance goal of 12.7%. This rate ranged from 4.4% to 10.9% across the different trials. A sensitivity analysis confirmed that the 1-year rate of target-lesion failure was significantly lower than the prespecified performance goal.

The rates of target-lesion failure, target-vessel MI, ischemia-driven target-lesion revascularization, and scaffold thrombosis were significantly higher in diabetic patients who required insulin than in those who did not. Older patient age, insulin dependency, and small target-vessel diameter all were independent predictors of target-lesion failure at 1 year.

The overall 1-year rate of scaffold thrombosis in this study was 2.3%, which is not surprising given the study population’s risk factors. For diabetic patients with appropriately sized vessels of greater than 2.25 mm diameter, the scaffold thrombosis rate was lower (1.3%).

In addition to being underpowered to assess rare adverse events, this study was limited in that it reported outcomes at 1 year, before resorption of the device was complete. It also reflects the first-time clinical experience with a resorbable scaffold for most of the participating investigators, “and one would expect that as with all new medical procedures, results will improve over time with increased operator experience,” the coauthors wrote.

Dr. Kereiakes reported being a consultant to Abbott Vascular, and his associates also reported ties to the company and to other industry sources.


 

Vitals

 

Key clinical point: An everolimus-eluting resorbable scaffold appeared to be safe and effective for PCI in patients with diabetes.

Major finding: The primary endpoint – the rate of target-lesion failure at 1-year follow-up – was 8.3%, well below the prespecified performance goal of 12.7%.

Data source: A prespecified formal substudy of 754 patients with diabetes who participated in three clinical trials and one device registry, assessing 1-year outcomes after PCI.

Disclosures: This pooled analysis, plus all the contributing trials and the device registry, were funded by Abbott Vascular, maker of the resorbable scaffold. Dr. Kereiakes reported being a consultant to Abbott Vascular, and his associates also reported ties to the company and to other industry sources.

Causes of recurrent pediatric pancreatitis start to emerge


 

Once children have a first bout of acute pancreatitis, a second, separate episode of acute pancreatitis most often occurs in patients with genetically triggered pancreatitis, those who are taller or weigh more than average, and patients with pancreatic necrosis, based on multicenter, prospective data collected from 83 patients.

This is the first reported study to prospectively follow pediatric cases of acute pancreatitis, and additional studies with more patients are needed to better identify the factors predisposing patients to recurrent episodes of acute pancreatitis and to quantify the amount of risk these factors pose, Katherine F. Sweeny, MD, said at the annual meeting of the Federation of the International Societies of Pediatric Gastroenterology, Hepatology, and Nutrition.

 



Mitchel L. Zoler/Frontline Medical News
Dr. Katherine F. Sweeny
Dr. Sweeny and her associates enrolled 85 pediatric patients who were diagnosed with an initial episode of acute pancreatitis at a center participating in the International Study Group of Pediatric Pancreatitis: In Search for a Cure (INSPPIRE) during the 37 months from March 2013 to April 2016. The average age of the patients was 14 years. They came from the 14 centers participating in INSPPIRE, including 10 U.S.-based locations. Nearly a third of the pancreatitis cases had an idiopathic cause; a toxin or drug was implicated in 18%, a virus or other systemic cause in 18%, a biliary or gallstone problem in 17%, and trauma in 9%. Six patients (7%) had a genetic cause (mutations in PRSS1, SPINK1, or CFTR), and one patient had a metabolic etiology.

The analysis focused on the 83 patients with at least 3 months of follow-up. During observation, 17 (20%) of the patients developed a second episode of acute pancreatitis that was distinguished from the initial episode by either at least 1 pain-free month or by complete normalization of amylase and lipase levels between the two episodes. Thirteen of the 17 recurrences occurred within 5 months of the first episode, with 11 of these occurring within the first 3 months after the first attack, a subgroup Dr. Sweeny called the “rapid progressors.”
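The recurrence definition is essentially a two-part rule, encoded below to make it concrete. The function and its parameter names are hypothetical, written only to illustrate the criterion.

def is_distinct_second_episode(pain_free_days: int,
                               amylase_normalized: bool,
                               lipase_normalized: bool) -> bool:
    """A second episode counts as separate if the patient had at least
    1 pain-free month between episodes, OR complete normalization of both
    amylase and lipase between the two episodes (parameter names are
    hypothetical)."""
    return pain_free_days >= 30 or (amylase_normalized and lipase_normalized)

# Example: 2 pain-free weeks but fully normalized enzymes still qualifies
print(is_distinct_second_episode(14, True, True))  # True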

Comparison of the 11 rapid progressors with the other 72 patients showed that the rapid progressors were significantly taller and weighed more. In addition, two of the 11 rapid progressors had pancreatic necrosis while none of the other patients had this complication.

The pancreatitis etiologies of the 11 rapid progressors also highlighted the potent influence a mutation can have on producing recurrent acute pancreatitis. Four of the 11 rapid progressors had a genetic mutation linked to pancreatitis susceptibility, and five of the six patients with a genetic cause for their index episode of pancreatitis developed a second acute episode during follow-up, said Dr. Sweeny, a pediatrician at Cincinnati Children’s Hospital Medical Center. The etiology with the next-highest recurrence rate was a toxin or drug, with about a 25% incidence of a second episode. All of the other pancreatitis etiologies had recurrence rates of 10% or less.

Collecting better information on the causes of recurrent pancreatitis and chronic pancreatitis is especially important because of the rising incidence of acute pediatric pancreatitis, currently about one case in every 10,000 children and adolescents. Prior to formation of the INSPPIRE consortium, studies of pediatric pancreatitis had largely been limited to single-center retrospective reviews. The limitations of these data have made it hard to predict which patients with a first episode of acute pancreatitis will progress to a second episode or beyond, Dr. Sweeny said.

Dr. Sweeny had no disclosures.

AT WCPGHAN 2016

Vitals

 

Key clinical point: Recurrent episodes of acute pancreatitis in children and adolescents were linked with above-average weight and height, pancreatic necrosis, and genetic mutations causing the pancreatitis.

Major finding: Overall, 17 of 83 patients (20%) had recurrent acute pancreatitis, but among six patients with a genetic cause, five had recurrences.

Data source: Eighty-three patients enrolled in INSPPIRE, an international consortium formed to prospectively study pediatric pancreatitis.

Disclosures: Dr. Sweeny had no disclosures.

VIDEO: Bioresorbable Absorb unexpectedly humbled by metallic DES

Disappointing results but BVS concept remains viable

 

The bioabsorbable vascular scaffold bubble suddenly burst with the first 3-year follow-up data from a randomized trial that unexpectedly showed that the Absorb device significantly underperformed compared with Xience, a widely used, second-generation metallic drug-eluting stent.

“As a pioneer of BVS [bioresorbable vascular scaffold] I’m disappointed,” Patrick W. Serruys, MD, said at the Transcatheter Cardiovascular Therapeutics annual meeting. “The performance of the comparator stent was spectacular.”

Mitchel L. Zoler/Frontline Medical News
Dr. Patrick W. Serruys
The ABSORB II Randomized Controlled Trial (ABSORB II) randomized 501 patients to treatment with the Absorb BVS or the Xience everolimus-eluting metallic stent, with two coprimary endpoints assessed at 3-year follow-up, which was completed by 468 of the enrolled patients. One primary outcome was in-device vasomotion in response to nitrate challenge, which averaged 0.047 mm with Absorb and 0.056 mm with Xience, representing a failure by Absorb to meet the prespecified test for superiority. The second primary endpoint was angiographic late luminal loss, which was 0.371 mm with Absorb and 0.250 mm with Xience, a result that both failed to prove noninferiority for Absorb and actually showed statistical superiority for Xience. Concurrently with the report, an article with the results appeared online (Lancet. 2016 Oct 30. doi: 10.1016/S0140-6736[16]32050-5).
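The two coprimary analyses follow a standard pattern, a superiority test on one endpoint and a noninferiority test on the other, sketched generically below. The Welch t-test and the confidence-interval construction are illustrative assumptions; the trial's actual noninferiority margin and analysis methods are not given in the article.

import numpy as np
from scipy import stats

def superiority_pvalue(absorb: np.ndarray, xience: np.ndarray) -> float:
    """Two-sided Welch t-test comparing vasomotion between the two arms."""
    return stats.ttest_ind(absorb, xience, equal_var=False).pvalue

def is_noninferior(absorb_ll: np.ndarray, xience_ll: np.ndarray,
                   margin: float) -> bool:
    """Noninferiority on late luminal loss (lower is better): the upper
    bound of an approximate 95% CI for the mean difference
    (Absorb - Xience) must fall below the prespecified margin."""
    diff = absorb_ll.mean() - xience_ll.mean()
    se = np.sqrt(absorb_ll.var(ddof=1) / len(absorb_ll)
                 + xience_ll.var(ddof=1) / len(xience_ll))
    return diff + 1.96 * se < margin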

Xience surpassed Absorb in several other secondary endpoints. For example, the in-device binary restenosis rate was 7.0% with Absorb and 0.7% with Xience; the in-segment binary restenosis rate was 8% with Absorb and 3% with Xience. Target-vessel MIs occurred in 7% of the Absorb patients and 1% of the Xience patients, while clinically indicated target-lesion revascularization occurred in 6% of the Absorb patients and 1% of the Xience patients.

Another notable finding was that definite or probable in-device thrombosis occurred in nine Absorb patients and in none of the Xience patients, a statistically significant difference. Six of the Absorb thrombotic events occurred more than 1 year after the device was placed, and in several instances these thromboses occurred more than 900 days after placement, when the BVS had largely resorbed.

“These thromboses are occurring at the late stages of BVS degradation,” Dr. Serruys noted. “The Absorb polymer is basically gone after 3 years, but it’s replaced by a proteoglycan, and some proteoglycans are quite thrombogenic,” a possible explanation for the “mysterious” very late thromboses, he said.

These “disappointing” results may be linked to inadequate lesion preparation, inappropriate sizing of the BVS for the lesion, and inconsistent postdilatation of the BVS, lapses in three steps that became the guiding mantra for BVS use starting a couple of years ago, said Giulio G. Stefanini, MD, an interventional cardiologist at Humanitas Research Hospital in Milan and a discussant for the report at the meeting, which was sponsored by the Cardiovascular Research Foundation.

Dr. Giulio G. Stefanini
The guiding principles of preparation, sizing, and postdilatation have become so ingrained recently that operators now commonly refer to these steps as “PSP,” but this approach was not used nearly as uniformly when the ABSORB II trial began in 2011, he noted during an interview. “The techniques used in ABSORB II probably do not reflect today’s practice.”

Dr. Stefanini said that even though the Absorb stent became available for routine use in Europe in 2012, the device has gained little traction since then in his own practice and throughout Italy. Currently it’s used for fewer than 5% of coronary interventions in Italy, he estimated. That’s largely because “we have failed to identify a population that benefits.” Other issues include the extra time needed to place a BVS and the need for longer treatment with dual antiplatelet therapy for patients who receive a BVS, compared with a modern metallic drug-eluting stent. The Absorb BVS received Food and Drug Administration approval for routine U.S. use in July 2016.

“It would be beautiful to have a fully bioresorbable stent. It’s a lovely concept, but we’re not there yet,” Dr. Stefanini observed.

ABSORB II was sponsored by Abbott Vascular, which markets the Absorb device. Dr. Serruys has received research support from Abbott Vascular and has been a consultant to several other device and drug companies. Dr. Stefanini has been a consultant to Boston Scientific, B.Braun, and Edwards.

Disappointing results but BVS concept remains viable

We were all disappointed by the ABSORB II 3-year results. It was really surprising that even the vasomotion endpoint was, if anything, a little better with Xience, which performed amazingly well in this trial. Both arms of the study did well out to 3 years, but the Xience patients did better.

The Absorb bioresorbable vascular scaffold (BVS) is an early-stage device, and based on these results I wouldn’t give up on the BVS concept. But we need to be very careful with Absorb and which patients we implant with it. We need to make sure we carefully use thorough lesion preparation, correct sizing, and postdilatation in every patient, and we need to carefully select the right patients.

The main issue with the Absorb BVS in this trial was scaffold thrombosis, so we need to use a BVS only in patients with the lowest thrombosis risk, which means younger patients without renal failure, calcified vessels, and a larger-diameter target coronary artery. Younger patients have the most to gain from receiving a BVS. Younger patients who need a coronary intervention often collect several stents over the balance of their life, and it’s in these patients where you’d prefer that the stents eventually disappear.

Paul S. Teirstein, MD, is chief of cardiology and director of interventional cardiology at the Scripps Clinic in La Jolla, Calif. He has received research support from and has been a consultant to Abbott Vascular, Boston Scientific, and Medtronic. He made these comments in an interview.

Vitals

 

Key clinical point: The Absorb bioresorbable vascular scaffold showed inferior performance to the Xience everolimus-eluting metallic stent for coronary procedures in a multicenter randomized trial of 468 patients followed for 3 years.

Major finding: In-stent or in-scaffold late luminal loss averaged 0.25 mm with Xience and 0.37 mm with Absorb, a statistically significant difference.

Data source: ABSORB II, a multicenter, randomized trial that enrolled 501 patients.

Disclosures: ABSORB II was sponsored by Abbott Vascular, which markets the Absorb device. Dr. Serruys has received research support from Abbott Vascular and has been a consultant to several other device and drug companies. Dr. Stefanini has been a consultant to Boston Scientific, B.Braun, and Edwards.

Absolute humidity most important environmental factor in global influenza


 

Absolute humidity and temperature are the most important environmental drivers of global influenza, despite differences in outbreak patterns between tropical and temperate countries, according to a new analysis by U.S.-based researchers.

Using convergent cross-mapping and an empirical dynamic modeling approach on data collected by the World Health Organization, investigators led by George Sugihara, PhD, of the Scripps Institution of Oceanography at the University of California, San Diego, confirmed a hypothesized U-shaped relationship between influenza outbreaks and absolute humidity. At low latitudes in the tropics, absolute humidity has a positive effect, increasing the likelihood of influenza as humidity rises; at higher latitudes in temperate countries, absolute humidity has a negative effect, making influenza more likely when absolute humidity is low.

While absolute humidity was the most important factor in the likelihood of influenza outbreaks, the U-shaped relationship was dictated by average temperature. At average temperatures below 70 °F, the relationship between absolute humidity and influenza remained negative; between 75 °F and 85 °F, the effect of absolute humidity was positive. Above 85 °F, aerosol transmission of influenza is blocked, the investigators noted.
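Put schematically, the U shape amounts to a sign change in the humidity effect across temperature bands. The short Python sketch below restates the thresholds quoted above in qualitative form; it is not the investigators' empirical dynamic model.

def absolute_humidity_effect(avg_temp_f: float) -> str:
    """Qualitative restatement of the temperature-dependent effect of
    absolute humidity on influenza, per the thresholds in the article."""
    if avg_temp_f < 70:
        return "negative: influenza more likely when absolute humidity is low"
    if 75 <= avg_temp_f <= 85:
        return "positive: influenza more likely as absolute humidity rises"
    if avg_temp_f > 85:
        return "blocked: aerosol transmission is suppressed"
    return "transitional (70-75 F): not characterized in the article"

print(absolute_humidity_effect(40))   # temperate winter
print(absolute_humidity_effect(80))   # tropics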

“Augmented with further laboratory testing, these population-level results could help set the stage for public health initiatives such as placing humidifiers in schools and hospitals during cold, dry, temperate winter, and in the tropics, perhaps using dehumidifiers or air conditioners set above 75 °F to dry air in public buildings,” Dr. Sugihara and his colleagues wrote.

Find the full study in Proceedings of the National Academy of Sciences of the United States of America (doi: 10.1073/pnas.1607747113).
 


Treated bacteremia that clears, then recurs, termed ‘skip phenomenon’


 

NEW ORLEANS – When Mayo Clinic physicians noticed that some patients on appropriate antibiotic treatment for Staphylococcus aureus bacteremia cleared the infection, only to have it recur a few days later, Justin A. Fiala, MD, and his colleagues grew curious.

Dr. Fiala, an infectious diseases internist at Mayo Clinic, Rochester, Minn., was intrigued by the possibility of fluctuating blood culture positivity in this subset of bacteremia patients.

“We wanted first to see whether or not this is a real entity and determine the prevalence of this ‘skip pattern,’” Dr. Fiala said at IDWeek 2016, the annual combined meeting of the Infectious Diseases Society of America, the Society for Healthcare Epidemiology of America, the HIV Medicine Association, and the Pediatric Infectious Diseases Society. He said identifying predictors and finding any differences in clinical outcomes, compared with control S. aureus bacteremia (SAB) patients, were additional aims.

Dr. Fiala and his colleagues assessed a cohort of 726 hospitalized adults with SAB at Mayo Clinic between July 2006 and June 2011. Patients with one or more negative blood cultures followed by a positive culture were identified within this group and compared with 2 to 4 control patients matched for age, sex, and duration of bacteremia.
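Matching of this kind is straightforward to express in code. The sketch below is a generic illustration of nested case-control matching on age, sex, and bacteremia duration; the calipers (5 years of age, 1 day of duration) and the record keys are assumptions, since the article does not specify them.

import random

def match_controls(case: dict, candidates: list, n_max: int = 4) -> list:
    """Match a skip-phenomenon case to up to n_max controls on sex, age,
    and duration of bacteremia. Calipers and dict keys are illustrative
    assumptions, not the study's stated parameters."""
    eligible = [c for c in candidates
                if c["sex"] == case["sex"]
                and abs(c["age"] - case["age"]) <= 5
                and abs(c["bacteremia_days"] - case["bacteremia_days"]) <= 1]
    random.shuffle(eligible)   # avoid systematic selection among eligible ties
    return eligible[:n_max]    # the study matched 2 to 4 controls per case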

The investigators found that 29 (4%) of the 726 patients had this ‘skip pattern’ of infection, clearance, and reinfection. Patients with the phenomenon were 90% male and tended to be older than controls, with a mean age of 69 years. Their index bacteremia lasted about 2 days longer than that of controls. The study also revealed a significant difference in the mean number of central venous catheters: 2.7 in the skip phenomenon group versus 1.7 in controls.

©CDC/Janice Haney Carr
This scanning electron micrograph (SEM) shows a strain of Staphylococcus aureus bacteria taken from a vancomycin intermediate resistant culture.


Given the predominance of the skip phenomenon in older, immunosuppressed males, “the takeaway … is that serial negative blood cultures may be warranted in these patient groups,” Dr. Fiala said.

The groups did not differ significantly by presence of implants or foreign bodies or by whether SAB was nosocomial or acquired in the community. “We thought it was interesting that 90% had immune suppression, although it was not statistically significant,” Dr. Fiala said.

With no prior reports in the medical literature, the researchers named this clinical entity “skip phenomenon.” Dr. Fiala noted that published studies have assessed recurrence of SAB after completion of antibiotics, but not specifically during treatment.

“We think this is a topic that is quite clinically prevalent and applicable,” Dr. Fiala said. He pointed out that SAB is common, accounting for about 20% of all nosocomial bacteremia cases. SAB is also highly virulent, with a mortality rate estimated between 20% and 35%. Although the study did not reveal significant mortality differences in the subgroup with skip phenomenon, “we can say there is increased morbidity.”

The most recent IDSA guidelines state that a single set of negative blood cultures is sufficient to demonstrate clearance of SAB, Dr. Fiala said. “Could this be falsely reassuring if Staphylococcus aureus does have a tendency to exhibit this fluctuating pattern?”

The retrospective design of the study and the relatively small number of patients with the skip phenomenon were limitations, the investigators acknowledged. Dr. Fiala had no relevant disclosures.


 

Vitals

 

Key clinical point: Serial negative blood cultures may be warranted in a subset of patients with S. aureus bacteremia to confirm disease clearance.

Major finding: About 4% of S. aureus bacteremia cases cleared and then recurred during treatment, suggesting that a single negative blood culture – the threshold in recent IDSA guidelines – may falsely indicate clearance.

Data source: Nested case-control study of 726 adult inpatients at Mayo Clinic between July 2006 and June 2011 with ≥3 days of S. aureus bacteremia.

Disclosures: Dr. Fiala had no relevant disclosures.