Part 2: The ABCs of managing COPD exacerbations

How to use corticosteroids and new therapies.

Do you know the ABCs of medication management for chronic obstructive pulmonary disease exacerbations?

In the second episode of a two-part interview, Robert A. Wise, MD, outlines the evidence and best practices for treating patients with corticosteroids, and he discusses potential new approaches to preventing exacerbations.

Dr. Wise is a professor of medicine at Johns Hopkins University, Baltimore. He is the coauthor of a review of medication regimens to manage COPD exacerbations (Respir Care. 2018 Jun;63[6]:773-82).


Part 1: The ABCs of managing COPD exacerbations

How to use antibiotics and bronchodilators.

Do you know the ABCs of medication management for chronic obstructive pulmonary disease exacerbations?

Understanding how to effectively use the ABCs – antibiotics, bronchodilators, and corticosteroids – in COPD exacerbations can reduce morbidity and improve patient outcomes.

In the first episode of a two-part interview, Robert A. Wise, MD, outlines the evidence and best practices for treating patients with antibiotics and bronchodilators.

Dr. Wise is a professor of medicine at Johns Hopkins University, Baltimore. He is the coauthor of a review of medication regimens to manage COPD exacerbations (Respir Care. 2018 Jun;63[6]:773-82).


Ubrogepant may relieve migraine pain at 2 hours

Ubrogepant, an oral calcitonin gene–related peptide (CGRP)–receptor antagonist, may relieve patients’ migraine pain and their most bothersome associated symptom, such as photophobia, phonophobia, or nausea, at 2 hours after acute treatment, according to phase 3 trial results published Nov. 19 in JAMA.

“Among adults with migraine, acute treatment with ubrogepant, compared with placebo, led to significantly greater rates of pain freedom at 2 hours with the 50-mg and 25-mg doses, and absence of the most bothersome migraine-associated symptom at 2 hours only with the 50-mg dose,” wrote first author Richard B. Lipton, MD, director of the Montefiore Headache Center at Albert Einstein College of Medicine, New York, and his colleagues. “Further research is needed to assess the effectiveness of ubrogepant against other acute treatments for migraine and to evaluate the long-term safety of ubrogepant among unselected patient populations.”

A researcher who commented on the results said that the drug appears “modestly better than placebo” and called for a trial comparing ubrogepant, aspirin, and oral sumatriptan.

The Food and Drug Administration is reviewing an application for ubrogepant. Allergan, the company developing the drug, has said it expects a regulatory decision in December.

ACHIEVE II

To evaluate the efficacy and tolerability of ubrogepant versus placebo for the acute treatment of a migraine attack, investigators conducted ACHIEVE II, a randomized, double-blind, placebo-controlled, single-attack clinical trial. The study was conducted at 99 primary care and research clinics during 2016-2018.

The trial included adults with migraine with or without aura who experienced two to eight migraine attacks per month. Participants had a mean age of 41.5 years, and 90% were female. The safety analysis included data from 1,465 participants, and the efficacy analysis included data from 1,355 participants. The primary efficacy outcomes were pain freedom and the absence of participants’ most bothersome migraine-associated symptom at 2 hours after taking the medication. Patients received ubrogepant 50 mg, ubrogepant 25 mg, or placebo to treat a migraine attack of moderate or severe pain intensity.



At 2 hours, pain freedom was reported by 101 of 464 participants in the ubrogepant 50-mg group (21.8%), 90 of 435 in the ubrogepant 25-mg group (20.7%), and 65 of 456 in the placebo group (14.3%). Absence of the most bothersome symptom was reported by 180 of 463 participants in the ubrogepant 50-mg group (38.9%), 148 of 434 in the ubrogepant 25-mg group (34.1%), and 125 of 456 in the placebo group (27.4%).

The most common adverse events within 48 hours were nausea and dizziness. Nausea occurred in 2.0% of the 50-mg group, 2.5% of the 25-mg group, and 2.0% of the placebo group. Dizziness occurred in 1.4% of the 50-mg group, 2.1% of the 25-mg group, and 1.6% of the placebo group.

At conferences, researchers have presented results from the phase 3 ACHIEVE I trial as well as an analysis that suggests ubrogepant may be effective in patients for whom triptans have been ineffective. In addition, studies have supported the safety of “gepants” after earlier concerns about potential liver toxicity. Physicians have called the safety data reassuring.

The ACHIEVE II trial was sponsored by Allergan. Several authors are Allergan employees. Dr. Lipton is a consultant or advisory board member for, or has received honoraria from, Allergan and other companies.

Number needed to treat

“The study was large, appears to have been well conducted, is clearly reported, and used appropriate outcome measures,” said Elizabeth Loder, MD, commenting on the trial.

A year ago, Dr. Loder, chief of the division of headache at Brigham and Women’s Hospital and professor of neurology at Harvard Medical School in Boston, coauthored a paper with Peer Tfelt-Hansen, MD, DMSc, of the University of Copenhagen, that said the phase 3 trials of gepants so far have found the drugs to have small effect sizes and low efficacy (Headache. 2019 Jan;59[1]:113-7. doi: 10.1111/head.13444).

Their publication included preliminary figures from ACHIEVE II, which are consistent with those published in JAMA. “The effect size for both doses of ubrogepant is small and of debatable clinical significance,” Dr. Loder said. “The therapeutic gain over placebo is 7.5% for the 50-mg dose and 6.4% for the 25-mg dose for the outcome of pain freedom at 2 hours. That corresponds to a number needed to treat of 13 and 15.6 people, respectively, in order to have one person achieve pain freedom at 2 hours that is attributable to the active treatment.”

For a secondary outcome of pain relief at 2 hours, defined as reduction of headache pain severity from moderate or severe to mild or none, the therapeutic gain versus placebo is 14.5% for the 50-mg dose and 12.3% for the 25-mg dose. “That corresponds to a number needed to treat of 6.8 and 8.1 people, respectively, to have one person achieve pain relief at 2 hours attributable to the drug,” Dr. Loder said.
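
Dr. Loder's figures follow from the standard relationship NNT = 1/(absolute risk difference). As a quick arithmetic check (a sketch only; the rates come from the trial counts reported above), this can be computed directly:

```python
def nnt(active_rate, control_rate):
    """Number needed to treat: reciprocal of the absolute risk difference."""
    return 1 / (active_rate - control_rate)

# Pain freedom at 2 hours, from the counts reported above
print(round(nnt(101 / 464, 65 / 456), 1))  # 50-mg dose vs. placebo -> 13.3
print(round(nnt(90 / 435, 65 / 456), 1))   # 25-mg dose vs. placebo -> 15.5
```

Using the rounded therapeutic gains instead (7.5% and 6.4%) gives 1/0.075 ≈ 13.3 and 1/0.064 ≈ 15.6, which is where the quoted figures come from; the small discrepancies reflect rounding.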



“Although there are no head-to-head studies comparing ubrogepant to triptans, for reference the [number needed to treat] for a 100-mg oral dose of sumatriptan is on the order of 3.5 for pain relief at 2 hours, meaning that one needs to treat just 3.5 people with sumatriptan in order to have one person achieve pain relief at 2 hours attributable to the drug,” she said (Cochrane Database Syst Rev. 2014;5:CD009108. doi: 10.1002/14651858.CD009108.pub2).

“The bottom line is that in the ACHIEVE II study, ubrogepant appears, on average, to be modestly better than placebo to treat migraine. It does not appear to be in the same league as sumatriptan. Instead, as Dr. Tfelt-Hansen and I said in our article, the results look comparable to those likely to be achieved with inexpensive nonprescription medications such as NSAIDs.”

Dr. Loder called for a trial comparing ubrogepant and other therapies. “I challenge the authors and the company to conduct a large, placebo-controlled trial comparing ubrogepant to 100 mg of oral sumatriptan and to 650 mg of aspirin,” Dr. Loder said.

Dr. Loder has no financial connections with any pharmaceutical or device companies and is paid for her work as the head of research for the British Medical Journal.

SOURCE: Lipton RB et al. JAMA. 2019;322(19):1887-98. doi: 10.1001/jama.2019.16711.

Issue: Neurology Reviews 28(1)
Publish date: November 19, 2019

Novel antibody looks promising in lupus nephritis


A novel antibody, obinutuzumab, enhances renal responses in patients with lupus nephritis through more complete B-cell depletion than standard-of-care immunosuppression, and it is well tolerated, according to results from the phase 2 NOBILITY trial.

“We know from our previous trials with anti–B-cell antibodies that results were mixed, and we felt that these variable results were possibly due to variability in B-cell depletion with a type 1 anti-CD20 antibody such as rituximab,” Brad Rovin, MD, director of the division of nephrology at Ohio State University, Columbus, told a press briefing at Kidney Week 2019: American Society of Nephrology annual meeting.

“So we hypothesized that if we could deplete B cells more efficiently and completely, we would achieve better results. At week 52, 35% of patients in the obinutuzumab-treated group achieved a complete renal response, compared to 23% in the standard-of-care arm.”

And by week 76, the difference between obinutuzumab and the standard of care was actually larger at 40% vs. 18%, respectively, “and this was statistically significant at a P value of .01,” added Dr. Rovin, who presented the full findings of the study at the conference.

Obinutuzumab, a highly engineered anti-CD20 antibody, is already approved under the brand name Gazyva for use in certain leukemias and lymphomas. The NOBILITY study was funded by Genentech-Roche, and Dr. Rovin reported being a consultant for the company.

Asked by Medscape Medical News to comment on the study, Duvuru Geetha, MBBS, noted that with standard-of-care mycophenolate mofetil (MMF) plus corticosteroids, “the remission rates we achieve [for lupus nephritis] are still not great,” ranging from 30% to 50%, depending on the patient population.

“This is why there is a need for alternative agents,” added Dr. Geetha, who is an associate professor of medicine, Johns Hopkins University, Baltimore.

With obinutuzumab, “the data look very promising because there is a much more profound and sustained effect on B-cell depletion and the renal response rate is much higher [than with MMF and corticosteroids],” she noted.

Dr. Geetha added, however, that she presumes patients were all premedicated with prophylactic agents to prevent infectious events, as they are when treated with rituximab.

“I think what is definitely different about this drug is that it induces direct cell death more efficiently than rituximab and that is probably what’s accounting for the higher efficacy seen with it,” said Dr. Geetha, who disclosed having received honoraria from Genentech a number of years ago.

“So yes, I believe the results are clinically meaningful,” she concluded.

NOBILITY study design

The NOBILITY trial randomized 125 patients with Class III or IV lupus nephritis to either obinutuzumab plus MMF and corticosteroids, or to MMF plus corticosteroids alone, for a treatment interval of 104 weeks.

Patients in the obinutuzumab group received two infusions of the highly engineered anti-CD20 antibody at week 0 and week 2 and another two infusions at 6 months.

“The primary endpoint was complete renal response at week 52,” the authors wrote, “while key secondary endpoints included overall renal response and modified complete renal response.”

Both at week 52 and week 76, more patients in the obinutuzumab group achieved an overall renal response as well as a modified complete renal response, compared with those treated with immunosuppression alone.

“If you look at the complete renal response over time, you can see that the curves separate after about 6 months but the placebo group starts to decline as you go further out, whereas the obinutuzumab group continues to separate, so my prediction is that we are going to see this trend continue because of the mechanism of action of obinutuzumab,” Dr. Rovin explained.

Phase 3 trials to start early 2020

All of the serologies relevant to lupus and lupus nephritis “including C3 and C4 improved while anti-double-stranded DNA levels declined, as did the urine protein-to-creatinine ratio, although the decline was more rapid and more profound in the obinutuzumab-treated patients,” Dr. Rovin said.

Importantly as well, despite the profound B-cell depletion produced by obinutuzumab, “the adverse event profile of this drug was very similar to the placebo group,” he stressed.

As expected, rates of infusion reactions were slightly higher in the experimental group than the immunosuppression alone group, but rates of serious adverse events were the same between groups, as were adverse infectious events, he noted.

Investigators have now initiated a global phase 3 trial, scheduled to start in early 2020, to evaluate the same treatment protocol in a larger group of patients.


Kidney Week 2019. Abstract #FR-OR136. Presented Nov. 8, 2019.

This story first appeared on Medscape.com.


As expected, rates of infusion reactions were slightly higher in the experimental group than the immunosuppression alone group, but rates of serious adverse events were the same between groups, as were adverse infectious events, he noted.

Investigators have now initiated a global phase 3 trial, scheduled to start in early 2020, to evaluate the same treatment protocol in a larger group of patients.


Kidney Week 2019. Abstract #FR-OR136. Presented Nov. 8, 2019.
 

This story first appeared on Medscape.com.

 

A novel antibody, obinutuzumab, enhances renal responses in patients with lupus nephritis through more complete B-cell depletion, compared with standard immunotherapy, and is well tolerated, according to results from the phase 2 NOBILITY trial.

“We know from our previous trials with anti–B-cell antibodies that results were mixed, and we felt that these variable results were possibly due to variability in B-cell depletion with a type 1 anti-CD20 antibody such as rituximab,” Brad Rovin, MD, director of the division of nephrology at Ohio State University in Columbus, told a press briefing at Kidney Week 2019: American Society of Nephrology annual meeting.

“So we hypothesized that if we could deplete B cells more efficiently and completely, we would achieve better results. At week 52, 35% of patients in the obinutuzumab-treated group achieved a complete renal response, compared to 23% in the standard-of-care arm.”

And by week 76, the difference between obinutuzumab and the standard of care was actually larger at 40% vs. 18%, respectively, “and this was statistically significant at a P value of .01,” added Dr. Rovin, who presented the full findings of the study at the conference.

Obinutuzumab, a highly engineered anti-CD20 antibody, is already approved under the brand name Gazyva for use in certain leukemias and lymphomas. The NOBILITY study was funded by Genentech-Roche, and Dr. Rovin reported being a consultant for the company.

Asked by Medscape Medical News to comment on the study, Duvuru Geetha, MBBS, noted that with standard-of-care mycophenolate mofetil (MMF) plus corticosteroids, “the remission rates we achieve [for lupus nephritis] are still not great,” ranging from 30% to 50%, depending on the patient population.

“This is why there is a need for alternative agents,” added Dr. Geetha, who is an associate professor of medicine, Johns Hopkins University, Baltimore.

With obinutuzumab, “the data look very promising because there is a much more profound and sustained effect on B-cell depletion and the renal response rate is much higher [than with MMF and corticosteroids],” she noted.

Dr. Geetha added, however, that she presumes patients were all premedicated with prophylactic agents to prevent infectious events, as they are when treated with rituximab.

“I think what is definitely different about this drug is that it induces direct cell death more efficiently than rituximab and that is probably what’s accounting for the higher efficacy seen with it,” said Dr. Geetha, who disclosed having received honoraria from Genentech a number of years ago.

“So yes, I believe the results are clinically meaningful,” she concluded.

NOBILITY study design

The NOBILITY trial randomized 125 patients with Class III or IV lupus nephritis to either obinutuzumab plus MMF and corticosteroids, or to MMF plus corticosteroids alone, for a treatment interval of 104 weeks.

Patients in the obinutuzumab group received two infusions of the highly engineered anti-CD20 antibody at week 0 and week 2 and another two infusions at 6 months.

“The primary endpoint was complete renal response at week 52,” the authors wrote, “while key secondary endpoints included overall renal response and modified complete renal response.”

At both week 52 and week 76, more patients in the obinutuzumab group achieved an overall renal response as well as a modified complete renal response, compared with those treated with immunosuppression alone.

“If you look at the complete renal response over time, you can see that the curves separate after about 6 months but the placebo group starts to decline as you go further out, whereas the obinutuzumab group continues to separate, so my prediction is that we are going to see this trend continue because of the mechanism of action of obinutuzumab,” Dr. Rovin explained.

Phase 3 trials to start early 2020

All of the serologies relevant to lupus and lupus nephritis “including C3 and C4 improved while anti–double-stranded DNA levels declined, as did the urine protein-to-creatinine ratio, although the decline was more rapid and more profound in the obinutuzumab-treated patients,” Dr. Rovin said.

Importantly as well, despite the profound B-cell depletion produced by obinutuzumab, “the adverse event profile of this drug was very similar to the placebo group,” he stressed.

As expected, rates of infusion reactions were slightly higher in the experimental group than the immunosuppression alone group, but rates of serious adverse events were the same between groups, as were adverse infectious events, he noted.

Investigators have now initiated a global phase 3 trial, scheduled to start in early 2020, to evaluate the same treatment protocol in a larger group of patients.


Kidney Week 2019. Abstract #FR-OR136. Presented Nov. 8, 2019.
 

This story first appeared on Medscape.com.


What’s new and different with ESVS guidelines for aortoabdominal aortic and iliac aneurysms?

Article Type
Changed
Tue, 11/19/2019 - 16:14

On Thursday afternoon, Karin Elisabeth Schmidt, MD, of the Center of Cardiovascular Surgery, Hospital Floridsdorf, Vienna, will discuss the guidelines of the European Society for Vascular Surgery (ESVS) for the management of abdominal and iliac aortic aneurysms, which were published in January 2019. “Since the last guideline, this field has experienced rapid progress in technological devices, significantly impacting our clinical practice as well as the care of affected patients,” according to Dr. Schmidt and her colleagues.

They analyzed the differing recommendations of the European, British, and American guidelines for the treatment of abdominal aortic aneurysms. The publications used for this literature review included the current and previous ESVS guidelines published in the European Journal of Vascular and Endovascular Surgery, the guideline published by the Society for Vascular Surgery (SVS) in January 2018, and the draft guideline issued by the National Institute for Health and Care Excellence (NICE) in May 2018.

There is consensus that endovascular treatment is preferred for a ruptured aortic aneurysm when anatomically possible, according to Dr. Schmidt. She will discuss how, for the majority of elective cases, endovascular care is favored in the SVS and ESVS guidelines, in contrast to the NICE draft.

There are generally still more ambiguities than clear recommendations, especially regarding the preferred procedures for complex aortic pathologies, population screening, and follow-up after open and endovascular aortic intervention.

She recommended a critical analysis of the U.S. and European guidelines, as they partly cover different aspects.

The final version of the guideline for the United Kingdom is eagerly awaited, according to Dr. Schmidt and her colleagues, as the draft currently prefers open surgical care in the elective setting. Many research opportunities exist, including the search for biomarkers, coupled with functional imaging, to better assess the progression of small aortic aneurysms, and pharmacologic approaches to influencing aneurysm growth. In addition, global platforms for data collection should be established, in particular for newer (low-profile) devices and their long-term performance, with jointly defined endpoints.

Dr. Schmidt will discuss how techniques such as artificial intelligence and machine learning will be used in the future to monitor large amounts of data, find patterns, and thus gain new insights.


Dealing with complications associated with central venous access catheters


On Thursday morning, John T. Loree, a medical student at SUNY Upstate Medical School, Syracuse, will present a study that he and his colleagues performed to assess the risks and complications associated with the use of central venous access (CVA) catheters over the long term. They attempted to identify high-risk subgroups based upon patient characteristics and line type. The research is warranted so that modified follow-up regimens can be implemented to reduce risk and improve patient outcomes. In his presentation, Mr. Loree will discuss selected therapies for specific complications.

The researchers performed a PubMed database search, which located 21 papers published between 2012 and 2018. In this sample, 6,781 catheters were placed in 6,183 patients, with a total dwell time of 2,538,323 days. Patient characteristics varied from children to adults. Various line types were used (peripherally inserted central catheter [PICC], central line, mediport, tunneled central venous catheter). Indications for catheterization included chemotherapy, dialysis, total parenteral nutrition (TPN), and other medication infusions.

Mr. Loree will discuss the primary outcomes – overall complication rate and the infectious and mechanical complication rates per 1,000 catheter-days.

He and his colleagues found that port purpose was significantly predictive of infection rate, while port type was selectively predictive of overall and mechanical complication rate. Subgroup analysis demonstrated significantly increased overall complication rates in peripherally inserted catheters and patients receiving medications, and increased mechanical complication rates with central lines.

Shorter dwell time was significantly associated with an increased infection rate and overall complication rate.

Mr. Loree will discuss how the complication rates associated with long-term use of CVA catheters were associated with factors easily identifiable at the initial patient visit.

Their data will show how, overall, PICC lines used for TPN/medication administration were associated with the highest complication rate, while mediports used for chemotherapy were associated with the lowest complication rate. Based on these patient characteristics, stricter follow-up to monitor for complications can be used in select patients to improve patient outcomes, according to Mr. Loree.


Satisfaction high among psoriasis patients on apremilast


Psoriasis patients with a new prescription for oral apremilast were significantly less likely to switch to a different therapy within the next 12 months than were patients on their first tumor necrosis factor (TNF) inhibitor, in a large retrospective national propensity score-matched study.

Bruce Jancin/MDedge News
Dr. David L. Kaplan

“This was surprising to us,” David L. Kaplan, MD, admitted in presenting the study findings at the annual congress of the European Academy of Dermatology and Venereology.

The surprise came because apremilast, a phosphodiesterase 4 (PDE4) inhibitor, is less potent than the injectable biologics at driving down Psoriasis Area and Severity Index (PASI) scores.

“This is real-world data. And this is what patients are saying at 1 year: that they’re actually happier [with apremilast] and they’re not interested in changing,” said Dr. Kaplan, a dermatologist at the University of Kansas and in private practice in Overland Park, Kan.

He and his coinvestigators tapped the IBM Watson MarketScan health insurance claims database for 2015-2016 and identified 1,645 biologic-naive adults with psoriasis who started on apremilast therapy and an equal number of biologic-naive psoriasis patients who initiated treatment with a biologic, of whom 1,207 started on a TNF inhibitor and 438 began on an interleukin inhibitor, which was ustekinumab in 81% of cases. The TNF inhibitor cohort was split 80/20 between adalimumab and etanercept. The three groups – new users of apremilast, a TNF inhibitor, or an interleukin inhibitor – were propensity-matched based upon age, prior usage of systemic psoriasis therapies, Charlson Comorbidity Index scores, and other potential confounders.

The primary endpoint was the switch rate to a different psoriasis treatment within 12 months. The switch rate was significantly lower in patients who had started on apremilast than in those on a TNF inhibitor (14% vs. 25%), while the 11% switch rate among patients on an interleukin inhibitor was not significantly different from the rate in the apremilast group.

“I think this data kind of gives us pause,” the dermatologist said. “As a clinician myself, when patients come back in the first question I always ask is, ‘How’re you doing? Are you happy?’ And at the end of the day, the data in terms of switch rates shows where patients are at. And that doesn’t really follow what we see with PASI scores.”

A secondary endpoint was the switch rate through 24 months. The same pattern held true: 24.9% in the apremilast starters, which was similar to the 22.9% in patients initiated on an interleukin inhibitor, and significantly less than the 39.1% rate in the TNF inhibitor group.



Among patients who switched medications within the first 12 months, the mean number of days to the switch was similar across all three groups.

The study had several limitations. Propensity score–matching is not a cure-all that can eradicate all potential biases. And the claims database didn’t include information on why patients switched, nor what their PASI scores were. “This is real-world data, and clinicians don’t do PASI scores in the real world,” he noted.

Audience member Andrew Blauvelt, MD, a dermatologist and president of the Oregon Medical Research Center, Portland, rose to challenge Dr. Kaplan’s conclusion that patients on apremilast were happier with their care.

“How can you rule out that it’s just practices that don’t use biologics, and they’re keeping patients on apremilast regardless of whether they’re better or happy because they’re not using biologics?” inquired Dr. Blauvelt.

Dr. Kaplan conceded that might well be a partial explanation for the results.

“Reluctance to use biologics is out there,” he agreed.

Dr. Kaplan reported serving as a consultant and paid speaker for Celgene, the study sponsor, as well as several other pharmaceutical companies.

SOURCE: Kaplan DL. EADV Abstract FC04.04.

Meeting/Event
Publications
Topics
Sections
Meeting/Event
Meeting/Event

Psoriasis patients with a new prescription for oral apremilast were significantly less likely to switch to a different therapy within the next 12 months than were patients on their first tumor necrosis factor (TNF) inhibitor, in a large retrospective national propensity score-matched study.

Bruce Jancin/MDedge News
Dr. David L. Kaplan

“This was surprising to us,” David L. Kaplan, MD, admitted in presenting the study findings at the annual congress of the European Academy of Dermatology and Venereology.

The surprise came because apremilast, a phosphodiesterase 4 (PDE4)-inhibitor, is less potent than the injectable biologics at driving down Psoriasis Area and Severity Index (PASI) scores.

“This is real-world data. And this is what patients are saying at 1 year: that they’re actually happier [with apremilast] and they’re not interested in changing,” said Dr. Kaplan, a dermatologist at the University of Kansas and in private practice in Overland Park, Kan.

He and his coinvestigators tapped the IBM Watson MarketScan health insurance claims database for 2015-2016 and identified 1,645 biologic-naive adults with psoriasis who started on apremilast therapy and an equal number of biologic-naive psoriasis patients who initiated treatment with a biologic, of whom 1,207 started on an TNF inhibitor and 438 began on an interleukin inhibitor, which was ustekinumab in 81% of cases. The TNF inhibitor cohort was split 80/20 between adalimumab and etanercept. The three groups – new users of apremilast, a TNF inhibitor, or an interleukin inhibitor – were propensity-matched based upon age, prior usage of systemic psoriasis therapies, Charlson Comorbidity Index scores, and other potential confounders.

The primary endpoint was the switch rate to a different psoriasis treatment within 12 months. The switch rate was significantly lower in patients who had started on apremilast than in those on a TNF inhibitor by a margin of 14% to 25%, while the 11% switch rate among patients on an interleukin inhibitor was not significantly different from the rate in the apremilast group.

“I think this data kind of gives us pause,” the dermatologist said. “As a clinician myself, when patients come back in the first question I always ask is, ‘How’re you doing? Are you happy?’ And at the end of the day, the data in terms of switch rates shows where patients are at. And that doesn’t really follow what we see with PASI scores.”

A secondary endpoint was the switch rate through 24 months. The same pattern held true: 24.9% in the apremilast starters, which was similar to the 22.9% in patients initiated on an interleukin inhibitor, and significantly less than the 39.1% rate in the TNF inhibitor group.



Among patients who switched medications within the first 12 months, the mean number of days to the switch was similar across all three groups.

The study had several limitations. Propensity score–matching is not a cure-all that can eradicate all potential biases. And the claims database didn’t include information on why patients switched, nor what their PASI scores were. “This is real-world data, and clinicians don’t do PASI scores in the real world,” he noted.

Audience member Andrew Blauvelt, MD, a dermatologist and president of the Oregon Medical Research Center, Portland, rose to challenge Dr. Kaplan’s conclusion that patients on apremilast were happier with their care.

“How can you rule out that it’s just practices that don’t use biologics, and they’re keeping patients on apremilast regardless of whether they’re better or happy because they’re not using biologics?” inquired Dr. Blauvelt.

Dr. Kaplan conceded that might well be a partial explanation for the results.

“Reluctance to use biologics is out there,” he agreed.

Dr. Kaplan reported serving as a consultant and paid speaker for Celgene, the study sponsor, as well as several other pharmaceutical companies.

SOURCE: Kaplan DL. EADV Abstract FC04.04.

Psoriasis patients with a new prescription for oral apremilast were significantly less likely to switch to a different therapy within the next 12 months than were patients on their first tumor necrosis factor (TNF) inhibitor, in a large retrospective national propensity score-matched study.

Bruce Jancin/MDedge News
Dr. David L. Kaplan

“This was surprising to us,” David L. Kaplan, MD, admitted in presenting the study findings at the annual congress of the European Academy of Dermatology and Venereology.

The surprise came because apremilast, a phosphodiesterase 4 (PDE4)-inhibitor, is less potent than the injectable biologics at driving down Psoriasis Area and Severity Index (PASI) scores.

“This is real-world data. And this is what patients are saying at 1 year: that they’re actually happier [with apremilast] and they’re not interested in changing,” said Dr. Kaplan, a dermatologist at the University of Kansas and in private practice in Overland Park, Kan.

He and his coinvestigators tapped the IBM Watson MarketScan health insurance claims database for 2015-2016 and identified 1,645 biologic-naive adults with psoriasis who started on apremilast therapy and an equal number of biologic-naive psoriasis patients who initiated treatment with a biologic: 1,207 started on a TNF inhibitor and 438 began on an interleukin inhibitor (ustekinumab in 81% of cases). The TNF inhibitor cohort was split 80/20 between adalimumab and etanercept. The three groups – new users of apremilast, a TNF inhibitor, or an interleukin inhibitor – were propensity-matched based upon age, prior use of systemic psoriasis therapies, Charlson Comorbidity Index scores, and other potential confounders.
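
For readers unfamiliar with the technique, propensity matching pairs each treated patient with a control whose estimated probability of receiving treatment is closest. The sketch below is purely illustrative – hypothetical patient IDs and scores, greedy 1:1 matching with a caliper – and is not the investigators' code; in practice the scores come from a model of treatment assignment on the confounders listed above.

```python
# Illustrative sketch of greedy 1:1 nearest-neighbor propensity-score
# matching with a caliper. All IDs and scores below are hypothetical.

def match_by_propensity(treated, controls, caliper=0.05):
    """Pair each treated unit with the closest unused control score.

    treated, controls: lists of (unit_id, propensity_score) tuples.
    Returns (treated_id, control_id) pairs; treated units with no
    control within the caliper are left unmatched.
    """
    available = dict(controls)  # control_id -> propensity score
    pairs = []
    for unit_id, score in treated:
        if not available:
            break
        # closest remaining control by absolute score distance
        best = min(available, key=lambda c: abs(available[c] - score))
        if abs(available[best] - score) <= caliper:
            pairs.append((unit_id, best))
            del available[best]  # each control is used at most once
    return pairs

# Toy example with made-up scores:
treated = [("t1", 0.30), ("t2", 0.52)]
controls = [("c1", 0.31), ("c2", 0.50), ("c3", 0.90)]
print(match_by_propensity(treated, controls))  # → [('t1', 'c1'), ('t2', 'c2')]
```

Note that "c3" goes unused: with no treated patient within the caliper, it is simply dropped, which is how matching trims incomparable patients from a claims cohort.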

The primary endpoint was the switch rate to a different psoriasis treatment within 12 months. That rate was significantly lower in patients who had started on apremilast than in those started on a TNF inhibitor (14% vs. 25%), while the 11% switch rate among patients on an interleukin inhibitor was not significantly different from the rate in the apremilast group.
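
As a rough check on why a 14% vs. 25% difference is significant at these cohort sizes, a two-proportion z-test can be sketched. The counts below are hypothetical, back-calculated from the reported percentages and cohort sizes; the abstract's actual test statistic is not given.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test using a pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts reconstructed from ~14% of 1,645 apremilast
# starters vs. ~25% of 1,207 TNF inhibitor starters:
z, p = two_proportion_z(230, 1645, 302, 1207)
print(round(z, 1), p < 0.001)
```

At these sample sizes the z statistic is far out in the tail, so even an 11-point absolute difference in switch rates is overwhelmingly significant.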

“I think this data kind of gives us pause,” the dermatologist said. “As a clinician myself, when patients come back in, the first question I always ask is, ‘How’re you doing? Are you happy?’ And at the end of the day, the data in terms of switch rates shows where patients are at. And that doesn’t really follow what we see with PASI scores.”

A secondary endpoint was the switch rate through 24 months. The same pattern held true: 24.9% in the apremilast starters, which was similar to the 22.9% in patients initiated on an interleukin inhibitor, and significantly less than the 39.1% rate in the TNF inhibitor group.



Among patients who switched medications within the first 12 months, the mean number of days to the switch was similar across all three groups.

The study had several limitations. Propensity score–matching is not a cure-all that can eradicate all potential biases. And the claims database didn’t include information on why patients switched, nor what their PASI scores were. “This is real-world data, and clinicians don’t do PASI scores in the real world,” he noted.

Audience member Andrew Blauvelt, MD, a dermatologist and president of the Oregon Medical Research Center, Portland, rose to challenge Dr. Kaplan’s conclusion that patients on apremilast were happier with their care.

“How can you rule out that it’s just practices that don’t use biologics, and they’re keeping patients on apremilast regardless of whether they’re better or happy because they’re not using biologics?” inquired Dr. Blauvelt.

Dr. Kaplan conceded that might well be a partial explanation for the results.

“Reluctance to use biologics is out there,” he agreed.

Dr. Kaplan reported serving as a consultant and paid speaker for Celgene, the study sponsor, as well as several other pharmaceutical companies.

SOURCE: Kaplan DL. EADV Abstract FC04.04.


REPORTING FROM EADV 2019


Early lenalidomide may delay progression of smoldering myeloma


Early treatment with lenalidomide may delay disease progression and prevent end-organ damage in patients with high-risk smoldering multiple myeloma (SMM), according to findings from a phase 3 trial.


While observation is the current standard of care in SMM, early therapy may represent a new standard for patients with high-risk disease, explained Sagar Lonial, MD, of Winship Cancer Institute, Emory University, Atlanta, and colleagues. Their findings were published in the Journal of Clinical Oncology.

The randomized, open-label, phase 3 study included 182 patients with intermediate- or high-risk SMM. Study patients were randomly allocated to receive either oral lenalidomide at 25 mg daily on days 1-21 of a 28-day cycle or observation.

Study subjects were stratified based on time since SMM diagnosis (1 year or less vs. more than 1 year), and all patients in the lenalidomide arm received aspirin at 325 mg on days 1-28. Both interventions were maintained until unacceptable toxicity, disease progression, or withdrawal for other reasons.

The primary outcome was progression-free survival (PFS), measured from baseline to the development of symptomatic multiple myeloma (MM). The criteria for progression included evidence of end-organ damage in relation to MM and biochemical disease progression.

The researchers found that at 1 year PFS was 98% in the lenalidomide group and 89% in the observation group. At 2 years, PFS was 93% in the lenalidomide group and 76% in the observation group. PFS was 91% in the lenalidomide group and 66% in the observation group at 3 years (hazard ratio, 0.28; P = .002).
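
Progression-free survival at fixed time points like these is typically read off a Kaplan-Meier curve. As a minimal illustration of the estimator – not the trial's actual analysis – on toy follow-up data:

```python
# Minimal Kaplan-Meier estimator on toy (time, event) data.
# events[i] is 1 for progression/death at times[i], 0 for censoring.

def kaplan_meier(times, events, t):
    """Kaplan-Meier survival estimate S(t)."""
    s = 1.0
    for u in sorted(set(times)):
        if u > t:
            break
        d = sum(1 for ti, ei in zip(times, events) if ti == u and ei == 1)
        n = sum(1 for ti in times if ti >= u)  # at risk just before u
        if d:
            s *= 1 - d / n  # multiply in the conditional survival at u
    return s

# Toy cohort: progression at years 1 and 3, censoring at years 2 and 4.
print(kaplan_meier([1, 2, 3, 4], [1, 0, 1, 0], 3))  # → 0.375
```

The censored patients still count in the at-risk denominator until they drop out, which is why the estimate differs from a naive proportion.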



Among lenalidomide-treated patients, grade 3 or 4 hematologic and nonhematologic adverse events occurred in 36 patients (41%). Nonhematologic adverse events occurred in 25 patients (28%).

Frequent adverse events among lenalidomide-treated patients included grade 4 decreased neutrophil count (4.5%), as well as grade 3 infections (20.5%), hypertension (9.1%), fatigue (6.8%), skin problems (5.7%), dyspnea (5.7%), and hypokalemia (3.4%). “In most cases, [adverse events] could be managed with dose modifications,” they wrote.

To reduce long-term toxicity, the researchers recommended a 2-year duration of therapy for patients at highest risk.

“Our results support the use of early intervention in patients with high-risk SMM – as defined by the 20/2/20 criteria where our magnitude of benefit was the greatest – rather than continued observation,” the researchers wrote.

The trial was funded by the National Cancer Institute. The authors reported financial affiliations with AbbVie, Aduro Biotech, Amgen, Bristol-Myers Squibb, Celgene, Juno Therapeutics, Kite Pharma, Sanofi, Takeda, and several other companies.

SOURCE: Lonial S et al. J Clin Oncol. 2019 Oct 25. doi: 10.1200/JCO.19.01740.




FROM JOURNAL OF CLINICAL ONCOLOGY


Umbilical cord management matters less for mothers than for infants


Immediate umbilical cord milking or delayed clamping of the umbilical cord had no significant impact on maternal outcomes, but infants were significantly more likely to experience severe intraventricular hemorrhage with umbilical cord milking, according to results of two studies published in JAMA.


“While the evidence for neonatal benefit with delayed cord clamping at term is strong, data related to maternal outcomes, particularly after cesarean delivery, are largely lacking,” wrote Stephanie E. Purisch, MD, of Columbia University Irving Medical Center, New York, and colleagues.

In a randomized trial of 113 women who underwent cesarean deliveries of singleton infants, the researchers hypothesized that maternal blood loss would be greater with delayed cord clamping (JAMA. 2019 Nov 19. doi: 10.1001/jama.2019.15995).

However, maternal blood loss, based on mean hemoglobin levels 1 day after delivery, was not significantly different between the delayed group (10.1 g/dL) and the immediate group (9.8 g/dL). The median time to cord clamping was 63 seconds in the delayed group and 6 seconds in the immediate group.

In addition, no significant differences occurred in 15 of 19 prespecified secondary outcomes. However, mean neonatal hemoglobin levels were significantly higher with delayed clamping, compared with levels associated with immediate clamping among 90 neonates for whom data were available (18.1 g/dL vs. 16.4 g/dL; P less than .001).

The results were limited by factors including lack of generalizability to other situations such as emergency or preterm deliveries and by the lack of a definition of a “clinically important postoperative hemoglobin change,” the researchers noted. However, the results show no significant impact of umbilical cord management on maternal hemoglobin in the study population.

In another study published in JAMA, Anup Katheria, MD, of Sharp Mary Birch Hospital for Women & Newborns, San Diego, and colleagues found no significant difference in rates of a composite outcome of death or severe intraventricular hemorrhage among infants randomized to umbilical cord milking (12%) vs. delayed umbilical cord clamping (8%). However, immediate umbilical cord milking was significantly associated with a higher rate of intraventricular hemorrhage alone, compared with delayed clamping (8% vs. 3%), and this signal of risk prompted the researchers to terminate the study earlier than intended.

The researchers randomized 474 infants born at less than 32 weeks’ gestation to umbilical cord milking or delayed umbilical cord clamping (JAMA. 2019 Nov 19. doi: 10.1001/jama.2019.16004). The study was conducted at six sites in the United States and one site each in Ireland, Germany, and Canada between June 2017 and September 2018. “Because of the importance of long-term neurodevelopment, all surviving infants will be followed up to determine developmental outcomes at 22 to 26 months’ corrected gestational age,” they said.

The study was terminated early, which prevents definitive conclusions, the researchers noted, but a new study has been approved to compare umbilical cord milking with delayed umbilical cord clamping in infants of 30-32 weeks’ gestational age, they said.

“Although the safety of placental transfusion for the mother seems well established, it remains unclear which method of providing placental transfusion is best for the infant: delayed clamping and cutting the cord or milking the intact cord. The latter provides a transfusion more rapidly, which may facilitate initiation of resuscitation when needed,” Heike Rabe, MD, of the University of Sussex, Brighton, and Ola Andersson, PhD, of Lund (Sweden) University, wrote in an editorial accompanying the two studies (JAMA. 2019 Nov 19;322:1864-5. doi: 10.1001/jama.2019.16003).

The 8% incidence of severe intraventricular hemorrhage in the umbilical milking group in the study by Katheria and colleagues was higher than the 5.2% in a recent Cochrane review, but the 3% incidence of severe intraventricular hemorrhage in the delayed group was lower than the 4.5% in the Cochrane review, they said.

“Umbilical cord milking has been used in many hospitals without an increase in intraventricular hemorrhage being observed,” they noted.

“The study by Purisch et al. demonstrated the safety of delayed cord clamping for mothers delivering by cesarean at term,” the editorialists wrote. Studies are underway to identify the best techniques for cord clamping, they said.

“In the meantime, clinicians should follow the World Health Organization recommendation to delay cord clamping and cutting for 1 to 3 minutes for term infants and for at least 60 seconds for preterm infants to prevent iron deficiency and potentially enable more premature infants to survive,” they concluded.

Dr. Purisch received funding from the Maternal-Fetal Medicine Fellow Research Fund for the first study. Coauthor Cynthia Gyamfi-Bannerman, MD, reported receiving grants from the Eunice Kennedy Shriver National Institute of Child Health and Human Development and the Society for Maternal-Fetal Medicine/AMAG Pharmaceuticals, and personal fees from Sera Prognostics outside the submitted work. The second study was supported by NICHD in a grant to Dr. Katheria, who had no financial conflicts to disclose. Coauthor Gary Cutter, PhD, had numerous ties to pharmaceutical companies. The editorialists had no financial conflicts to disclose.

SOURCES: Purisch SE et al. JAMA. 2019 Nov 19. doi: 10.1001/jama.2019.15995; Katheria A et al. JAMA. 2019 Nov 19. doi: 10.1001/jama.2019.16004; Rabe H and Andersson O. JAMA. 2019 Nov 19; 322:1864-5.




FROM JAMA


Heavy metals linked with autoimmune liver disease


Exposure to heavy metals from natural and man-made sources may contribute to development of autoimmune liver disease, according to a recent U.K. study involving more than 3,500 patients.


Coal mines were particularly implicated, as they accounted for 39% of the risk of developing primary biliary cholangitis (PBC), reported lead author Jessica Dyson, MBBS, of Newcastle (England) University, and colleagues.

“We know that the etiology of autoimmune liver disease remains unclear, but we’re increasingly coming to understand that it’s likely to be a complex interplay between genetic and environmental factors,” Dr. Dyson said during a presentation at the annual meeting of the American Association for the Study of Liver Diseases. Showing a map of England, she pointed out how three autoimmune liver diseases – PBC, primary sclerosing cholangitis (PSC), and autoimmune hepatitis (AIH) – each have unique clusters of distribution. “This implies that environmental exposure may have a role in disease pathogenesis.”

To investigate this possibility, Dr. Dyson and colleagues used structural equation modeling to look for associations between the above three autoimmune liver diseases, socioeconomic status, and environmental factors. Specific environmental factors included soil concentrations of heavy metals (cadmium, arsenic, lead, manganese, and iron), coal mines, lead mines, quarries, urban areas, traffic, stream pH, and landfills.

The study was conducted in the northeast of England, where migration rates are low, Dr. Dyson said. From this region, the investigators identified patients with PBC (n = 2,150), AIH (n = 963), and PSC (n = 472). Conceptual models were used to examine relationships between covariates and prevalence of disease, with good models defined by a root-mean-square error of approximation less than 0.05 and covariate significance at the 95% level. After adjusting for population density, comparative fit was used to measure variation within each model.
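
The root-mean-square-error fit criterion mentioned here (RMSEA in structural-equation-modeling terminology) has a simple closed form. Assuming the standard definition, a small helper shows why values near zero indicate close fit; the inputs below are made up, not taken from the study.

```python
import math

def rmsea(chi2, df, n):
    """Root-mean-square error of approximation for a fitted SEM.

    chi2: model chi-square statistic; df: model degrees of freedom;
    n: sample size. Values below ~0.05 are conventionally read as
    close fit, matching the threshold the investigators used.
    """
    # Excess chi-square beyond its expectation (df) signals misfit;
    # the max() keeps the estimate at zero for better-than-expected fit.
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# Hypothetical models: one fitting exactly as expected, one with
# chi-square double its degrees of freedom.
print(rmsea(10, 10, 100))   # → 0.0
print(rmsea(20, 10, 101))   # → 0.1, above the 0.05 close-fit cutoff
```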

The best model for PBC revealed the aforementioned link with coal mines, proximity to which accounted for 39% of the pathogenesis of PBC. High soil levels of cadmium interacted with coal mines, and cadmium itself directly contributed 22% of the risk of PBC; however, Dr. Dyson noted that, while many cadmium-rich areas had high rates of PBC, not all did.

“This demonstrates the complexity of causality of disease, and we certainly can’t say that cadmium, in its own right, is a direct cause and effect,” Dr. Dyson said. “But I think [cadmium] certainly potentially is one of the factors at play.”

For AIH, coal mines contributed less (6%), although cadmium still accounted for 22% of variation of disease, as did alkaline areas. Finally, a significant link was found between PSC and regions with high arsenic levels.

“To conclude, our data suggest that heavy metals may be risk factors for autoimmune liver disease,” Dr. Dyson said. “There are a number of exposure routes that may be pertinent to patients, from heavy metals occurring via natural sources, and also via virtue of human activity, such as burning of fossil fuels, heavy-metal production, and pesticides.” Dr. Dyson emphasized this latter route, as some rural areas, where pesticide use is common, had high prevalence rates of autoimmune liver disease.

Dr. Dyson went on to put her findings in context. “Heavy metals are a well-recognized cause of immune dysregulation and epithelial injury and are actually actively transported into the bile, and that may be particularly relevant in terms of cholangiopathies. And this leads us to the possibility of interventions to reduce toxic exposure that may modify risk of disease.”

Looking to the future, Dr. Dyson described plans to build on this research with measurements of heavy metals in tissues, serum, and urine.

The investigators reported no relevant disclosures.

SOURCE: Dyson J et al. The Liver Meeting 2019, Abstract 48.


– Exposure to heavy metals from natural and man-made sources may contribute to development of autoimmune liver disease, according to a recent U.K. study involving more than 3,500 patients.


Coal mines were particularly implicated, as they accounted for 39% of the risk of developing primary biliary cholangitis (PBC), reported lead author Jessica Dyson, MBBS, of Newcastle (England) University, and colleagues.

“We know that the etiology of autoimmune liver disease remains unclear, but we’re increasingly coming to understand that it’s likely to be a complex interplay between genetic and environmental factors,” Dr. Dyson said during a presentation at the annual meeting of the American Association for the Study of Liver Diseases. Showing a map of England, she pointed out how three autoimmune liver diseases – PBC, primary sclerosing cholangitis (PSC), and autoimmune hepatitis (AIH) – each have unique clusters of distribution. “This implies that environmental exposure may have a role in disease pathogenesis.”

To investigate this possibility, Dr. Dyson and colleagues used structural equation modeling to look for associations between the above three autoimmune liver diseases, socioeconomic status, and environmental factors. Specific environmental factors included soil concentrations of heavy metals (cadmium, arsenic, lead, manganese, and iron), coal mines, lead mines, quarries, urban areas, traffic, stream pH, and landfills.

The study was conducted in the northeast of England, where migration rates are low, Dr. Dyson said. From this region, the investigators identified patients with PBC (n = 2,150), AIH (n = 963), and PSC (n = 472). Conceptual models were used to examine relationships between covariates and disease prevalence; a model was considered good if its root-mean-square error of approximation (RMSEA) was less than 0.05 and its covariates were significant at the 95% level. After adjusting for population density, comparative fit was used to measure variation within each model.
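The RMSEA fit criterion can be illustrated with a short calculation. The sketch below is not taken from the study's code; it simply computes the standard RMSEA formula from a model's chi-square statistic, its degrees of freedom, and the sample size, with values below 0.05 conventionally read as good fit. The fit statistics shown are hypothetical.

```python
import math

def rmsea(chi_square: float, df: int, n: int) -> float:
    """Root-mean-square error of approximation for a fitted model.

    Uses the standard formula sqrt(max(chi2 - df, 0) / (df * (n - 1))),
    where chi2 is the model chi-square, df its degrees of freedom,
    and n the sample size. Values below 0.05 suggest good fit.
    """
    return math.sqrt(max(chi_square - df, 0.0) / (df * (n - 1)))

# Hypothetical fit statistics, for illustration only:
good = rmsea(chi_square=12.0, df=10, n=500)  # chi-square near df: close fit
poor = rmsea(chi_square=60.0, df=10, n=200)  # chi-square far above df: poor fit
print(f"good model RMSEA = {good:.3f}, poor model RMSEA = {poor:.3f}")
```

Because the numerator is the excess of chi-square over its degrees of freedom, a model whose chi-square is at or below its degrees of freedom scores an RMSEA of zero.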

The best model for PBC revealed the aforementioned link with coal mines, proximity to which accounted for 39% of the risk of PBC. High levels of cadmium in soil interacted with coal mines and directly contributed 22% of the risk of PBC; however, Dr. Dyson noted that, while many cadmium-rich areas had high rates of PBC, not all did.

“This demonstrates the complexity of causality of disease, and we certainly can’t say that cadmium, in its own right, is a direct cause and effect,” Dr. Dyson said. “But I think [cadmium] certainly potentially is one of the factors at play.”

For AIH, coal mines contributed less (6%), although cadmium still accounted for 22% of the variation in disease, as did alkaline areas. Finally, a significant link was found between PSC and regions with high arsenic levels.

“To conclude, our data suggest that heavy metals may be risk factors for autoimmune liver disease,” Dr. Dyson said. “There are a number of exposure routes that may be pertinent to patients, from heavy metals occurring via natural sources, and also by virtue of human activity, such as burning of fossil fuels, heavy-metal production, and pesticides.” Dr. Dyson emphasized this latter route, as some rural areas, where pesticide use is common, had high prevalence rates of autoimmune liver disease.

Dr. Dyson went on to put her findings in context. “Heavy metals are a well-recognized cause of immune dysregulation and epithelial injury and are actually actively transported into the bile, and that may be particularly relevant in terms of cholangiopathies. And this leads us to the possibility of interventions to reduce toxic exposure that may modify risk of disease.”

Looking to the future, Dr. Dyson described plans to build on this research with measurements of heavy metals in tissues, serum, and urine.

The investigators reported no relevant disclosures.

SOURCE: Dyson J et al. The Liver Meeting 2019, Abstract 48.

REPORTING FROM THE LIVER MEETING 2019
