
Biosimilar equal to natalizumab for relapsing remitting MS

Article Type
Changed
Mon, 02/27/2023 - 15:10

An agent biologically similar to the humanized monoclonal antibody natalizumab is as effective and safe as the original reference drug for relapsing remitting multiple sclerosis (RRMS) – and has a similar level of immunogenicity, new research shows.

The investigators noted that these phase 3 trial findings are the final stage in the regulatory approval process.

“There will be a biosimilar that with respect to all parameters – efficacy, side effects, immunogenicity – doesn’t differ from the original drug and will probably be an option to consider to reduce treatment costs in MS,” said lead investigator Bernhard Hemmer, MD, a professor in the department of neurology, Technical University of Munich (Germany).

The findings were published online in JAMA Neurology.
 

Potential cost savings

Disease-modifying therapies (DMTs), particularly targeted biologics, have revolutionized the treatment of MS, including RRMS. Natalizumab, which was the first targeted biologic therapy approved for RRMS, is very effective and widely used, Dr. Hemmer said.

However, this and other DMTs are costly. Biosimilars, which are medicines clinically similar to an already marketed reference biologic medicine, can address this issue. In the areas of rheumatology and oncology, biosimilars have already demonstrated significant cost savings and improved treatment access.

The biosimilar natalizumab (biosim-NTZ), developed by Polpharma Biologics, is the first biosimilar monoclonal antibody therapy to be developed for MS.

Health authorities such as the Food and Drug Administration require comparative phase 3 studies to confirm there are no clinically relevant differences between a proposed biosimilar and its reference medicine.

The new multicenter, phase 3, double-blind, randomized trial – known as Antelope – included 264 adult patients with RRMS at 48 centers in seven Eastern European countries. Most study participants were women (61.4%), and their mean age was 36.7 years.

All study participants were randomly assigned to receive intravenous infusions every 4 weeks of 300 mg of biosim-NTZ or reference natalizumab (ref-NTZ) for a total of 12 infusions.

At week 24, 30 patients were switched from ref-NTZ to biosim-NTZ for the remainder of their infusions. Including such a population is required by regulatory agencies to ensure switching patients from a drug they’ve been taking to a new biosimilar does not introduce any concerns, said Dr. Hemmer.
 

Comparable efficacy, safety profile

The primary efficacy endpoint was the cumulative number of new active brain lesions on MRI.

At baseline, 48.1% of the biosimilar group and 45.9% of the reference drug group had at least one gadolinium-enhancing lesion. In addition, 96.9% of the biosimilar group had more than 15 T2 lesions, compared with 96.2% of the reference group.

At week 24, the mean difference between biosim-NTZ and ref-NTZ in the cumulative number of new active lesions was 0.17 (least squares means, 0.34 vs. 0.45), with a 95% confidence interval of –0.61 to 0.94 and a point estimate within the prespecified margins of ±2.1.

The annualized relapse rate for biosim-NTZ and ref-NTZ was similar at 24 weeks (0.21 vs. 0.15), as well as at 48 weeks (0.17 vs. 0.13). For Expanded Disability Status Scale scores, which were similar between treatment groups at baseline (mean, 3.4 vs. 3.2), change at 24 and 48 weeks was minimal and similar in both groups.

The safety profile was as expected for patients with RRMS receiving natalizumab. There were few adverse events of special interest, with similar proportions across all treatment groups.

The overall adverse-event profile for patients who switched from ref-NTZ to biosim-NTZ was similar to patients continuing ref-NTZ treatment and did not indicate any new or increased risks associated with switching.

Rates of treatment-emergent adverse events (TEAEs) were similar, at 64.9% for biosim-NTZ, 68.9% for ref-NTZ, and 73.3% for the switch group. The most frequently reported TEAEs across all treatment groups were nervous system disorders and infections and infestations.

Progressive multifocal leukoencephalopathy (PML), a rare and potentially fatal demyelinating disease of the central nervous system, is associated with some DMTs – notably ref-NTZ. It is caused by infection with the John Cunningham virus (JCV), also referred to as human polyomavirus, the researchers noted.

As per the study protocol, no participant had a JCV-positive index of more than 1.5 at baseline. Proportions of patients positive for anti-JCV antibodies were similarly distributed between treatment groups throughout the study.
Similar immunogenicity

There was strong concordance regarding positivity for treatment-emergent antidrug antibodies between the biosim-NTZ and ref-NTZ groups (79.4% and 74.0%). This was also the case for antinatalizumab-neutralizing antibodies (69.0% and 66.2%).

“There was nothing that indicated immunogenicity is different” between the two agents, said Dr. Hemmer.

While this might change “when you look at longer time periods,” antibodies to natalizumab usually develop “very early on,” he added.

Dr. Hemmer noted that this comparison of the proposed biosimilar with the reference drug had no real surprises.

“If the immunogenicity is the same, the mode of action is the same, and the dose is the same, you would expect to have a similar clinical effect and also a similar side-effect profile, which is indeed the case,” he said.

Dr. Hemmer added that he has no insight as to when the drug might be approved but believes developers expect that to occur sometime this year.
 

Welcome results

Commenting on the findings, Torge Rempe, MD, assistant professor in the department of neurology, University of Florida, Gainesville, and the William T. and Janice M. Neely Professor for Research in MS, said he welcomes the new results showing the biosimilar matched the reference medication.

“The authors report no significant difference in their primary endpoint of cumulative number of active lesions as well as their secondary clinical endpoints of annualized relapse rate and changes from baseline Expanded Disability Status Scale scores,” said Dr. Rempe, who was not involved with the research.

The study also showed the reported adverse events were similar between the biosimilar and reference natalizumab, he noted.

However, although no cases of PML were uncovered during the study period, further research is needed to determine long-term safety in this area, Dr. Rempe said.

Finally, he agreed that the development of biosimilars such as this one addresses the issue of high annual costs for DMTs, an area of concern in the field of MS.

The study was funded by Polpharma Biologics. Dr. Hemmer has reported receiving personal fees from Polpharma and Sandoz during the conduct of the study and personal fees from Novartis, Biocom, and TG Therapeutics outside the submitted work. He has also received a patent for genetic determinants of antibodies against interferon-beta and a patent for KIR4.1 antibodies in MS; served on scientific advisory boards for Novartis; served as a data monitoring and safety committee member for AllergyCare, Polpharma Biologics, Sandoz, and TG Therapeutics; and received speaker honoraria from Desitin, grants from Regeneron for MS research, and funding from the MultipleMS EU consortium, the CLINSPECT-M consortium, and the German Research Foundation. Dr. Rempe has reported no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Issue
Neurology Reviews - 31(3)

FROM JAMA NEUROLOGY


Minorities with epilepsy blocked from receiving ‘highest quality of care’

Article Type
Changed
Mon, 02/27/2023 - 15:16

People of color with epilepsy, including Black, Hispanic, and Native Hawaiian and other Pacific Islander patients, are significantly less likely to be prescribed the latest antiseizure medications (ASMs), compared with their White counterparts, new research shows.

Even after controlling for epilepsy severity, comorbid conditions, and other factors that might affect medication choice, researchers found that newer medication use was 29% less likely in Black patients, 23% less likely in Native Hawaiian and other Pacific Islander patients, and 7% less likely in Hispanic patients, compared with White individuals.

“I hope that clinicians will see from our findings that minoritized patients with epilepsy face a myriad of barriers in receiving the highest quality of care, including ASM use,” said lead investigator Wyatt P. Bensken, PhD, adjunct assistant professor of Population and Quantitative Health Sciences at Case Western Reserve University, Cleveland. “Considering your patients’ barriers, and how that influences their care – including ASM selection – will be critical to helping reduce these population-level inequities.”

The study was published online in Neurology: Clinical Practice.
 

A prompt for practice change

For the study, researchers used Medicaid claims for more than 78,000 people who had filled at least two prescriptions for an ASM between 2010 and 2014.

Most patients were White (53.4%); 22.6% were Black; 11.9% were Hispanic; 1.6% were Asian; 1.5% were Native Hawaiian or other Pacific Islander; 0.6% were American Indian or Alaskan Native; and 8.3% were classified as “other.”

One-quarter of participants were taking an older ASM, such as carbamazepine, phenytoin, or valproate. About 65% were taking second-generation ASMs, including gabapentin, levetiracetam, and zonisamide. A little less than 10% were taking lacosamide, perampanel, or another third-generation ASM.

Compared with White patients, newer medication prescriptions were significantly less likely in Black individuals (adjusted odds ratio, 0.71; 95% confidence interval, 0.68-0.75), Native Hawaiian or other Pacific Islanders (aOR, 0.77; 95% CI, 0.67-0.88), and Hispanic patients (aOR, 0.93; 95% CI, 0.88-0.99).

Third-generation ASMs were used by 10.7% of White patients versus 6% of Black individuals and 5.1% of American Indian or Alaskan Native patients.

Researchers also found that taking a second-generation ASM was associated with better treatment adherence (aOR, 1.17; 95% CI, 1.11-1.23) and that patients on newer ASMs were more than three times as likely to be under the care of a neurologist (aOR, 3.26; 95% CI, 3.13-3.41).

The findings draw attention to racial inequities surrounding access to medication and specialists and subspecialists, Dr. Bensken said. Identifying specific barriers and developing solutions is the long-range goal, he added.

“In the interim, increasing the attention to these inequities will, we hope, prompt changes across practices,” Dr. Bensken said.
 

A ‘wake-up call’

Commenting on the findings, Joseph Sirven, MD, professor of neurology at the Mayo Clinic Florida, Jacksonville, said the results were “striking” because newer ASMs are generally the go-to for most physicians who treat epilepsy. 

“Use of first-generation ASMs is typically reserved [for] if one runs out of options,” Dr. Sirven said.

This study and others like it should serve as a “wake-up call” for clinicians, Dr. Sirven added.

“This study is important because it shows that whether we realize it or not, race and ethnicities are playing a role in ASM, and this is related to financial access to newer-generation drugs,” he said. “Similar findings are seen in impoverished countries where first-generation ASM drugs are routinely used because of drug pricing.”

More to explore

Also commenting on the study, Scott Mintzer, MD, a professor and director of the Epilepsy Monitoring Unit at Thomas Jefferson University, Philadelphia, said using first-generation ASMs as a proxy for quality of care is “a very innovative concept.”

“From that perspective, the finding that racial minority patients are more likely to be on a first-generation drug is not surprising. But after that it gets far more complicated to interpret,” he added.  

Neither adherence nor care by a neurologist was different in a consistent direction within the various minority populations, Dr. Mintzer noted. In addition, Black patients were as likely to see a neurologist as White patients but still more likely to be on a first-generation drug.

There are also a few caveats to the findings that should be considered, Dr. Mintzer added. First, the sample included only Medicaid recipients, nearly 35% of whom had a comorbid psychosis. Those and other characteristics of the study pool suggest participants aren’t representative of the United States population as a whole. Second, significant shifts in ASM use have occurred since the study data cutoff in 2014, none of which are reflected in these findings.

“So, I don’t think we can really say how to address this yet,” Dr. Mintzer said. “There’s a lot to explore about whether this is still occurring, how generalizable these findings are, and what they might be due to, as there are a host of potential explanations, which the authors themselves acknowledge.”

The study was funded by the U.S. Centers for Disease Control and Prevention and the National Institute on Minority Health and Health Disparities. Dr. Bensken has received support for this work from NIMHD and serves on the Editorial Board of the journal Neurology. Dr. Sirven and Dr. Mintzer report no relevant financial relationships.

A version of this article originally appeared on Medscape.com.


“So, I don’t think we can really say how to address this yet,” Dr. Mintzer said. “There’s a lot to explore about whether this is still occurring, how generalizable these findings are, and what they might be due to, as there are a host of potential explanations, which the authors themselves acknowledge.”

The study was funded by the U.S. Centers for Disease Control and Prevention and the National Institute on Minority Health and Health Disparities. Dr. Bensken has received support for this work from NIMHD and serves on the Editorial Board of the journal Neurology. Dr. Sirven and Mintzer report no relevant financial relationships.

A version of this article originally appeared on Medscape.com.

People of color with epilepsy, including Black, Hispanic, and Native Hawaiian and other Pacific Islander patients, are significantly less likely to be prescribed the latest antiseizure medications (ASMs), compared with their White counterparts, new research shows.

Even after controlling for epilepsy severity, comorbid conditions, and other factors that might affect medication choice, researchers found that newer medication use was 29% less likely in Black patients, 23% less likely in Native Hawaiian and other Pacific Islander patients, and 7% less likely in Hispanic patients, compared with White individuals.

“I hope that clinicians will see from our findings that minoritized patients with epilepsy face a myriad of barriers in receiving the highest quality of care, including ASM use,” said lead investigator Wyatt P. Bensken, PhD, adjunct assistant professor of Population and Quantitative Health Sciences at Case Western Reserve University, Cleveland. “Considering your patients’ barriers, and how that influences their care – including ASM selection – will be critical to helping reduce these population-level inequities.”

The study was published online in Neurology Clinical Practice.
 

A prompt for practice change

For the study, researchers used Medicaid claims for more than 78,000 people who had filled at least two prescriptions for an ASM between 2010 and 2014.

Most patients were White (53.4%); 22.6% were Black; 11.9% were Hispanic; 1.6% were Asian; 1.5% were Native Hawaiian or other Pacific Islander; 0.6% were American Indian or Alaska Native; and 8.3% were classified as “other.”

One-quarter of participants were taking an older ASM, such as carbamazepine, phenytoin, or valproate. About 65% were taking second-generation ASMs, including gabapentin, levetiracetam, and zonisamide. A little less than 10% were taking lacosamide, perampanel, or another third-generation ASM.

Compared with White patients, newer medication prescriptions were significantly less likely in Black individuals (adjusted odds ratio, 0.71; 95% confidence interval, 0.68-0.75), Native Hawaiian or other Pacific Islanders (aOR, 0.77; 95% CI, 0.67-0.88), and Hispanic patients (aOR, 0.93; 95% CI, 0.88-0.99).

Third-generation ASMs were used by 10.7% of White patients versus 6% of Black individuals and 5.1% of American Indian or Alaska Native patients.

Researchers also found that taking a second-generation ASM was associated with better treatment adherence (aOR, 1.17; 95% CI, 1.11-1.23) and that patients on newer ASMs were more than three times as likely to be under the care of a neurologist (aOR, 3.26; 95% CI, 3.13-3.41).

The findings draw attention to racial inequities surrounding access to medication and specialists and subspecialists, Dr. Bensken said. Identifying specific barriers and developing solutions is the long-range goal, he added.

“In the interim, increasing the attention to these inequities will, we hope, prompt changes across practices,” Dr. Bensken said.
 

A ‘wake-up call’

Commenting on the findings, Joseph Sirven, MD, professor of neurology at the Mayo Clinic Florida, Jacksonville, said the results were “striking” because newer ASMs are generally the go-to for most physicians who treat epilepsy. 

“Use of first-generation ASMs is typically reserved [for] if one runs out of options,” Dr. Sirven said.

This study and others like it should serve as a “wake-up call” for clinicians, Dr. Sirven added.

“This study is important because it shows that whether we realize it or not, race and ethnicities are playing a role in ASM, and this is related to financial access to newer-generation drugs,” he said. “Similar findings are seen in impoverished countries where first-generation ASM drugs are routinely used because of drug pricing.”

More to explore

Also commenting on the study, Scott Mintzer, MD, a professor and director of the Epilepsy Monitoring Unit at Thomas Jefferson University, Philadelphia, said using first-generation ASMs as a proxy for quality of care is “a very innovative concept.”

“From that perspective, the finding that racial minority patients are more likely to be on a first-generation drug is not surprising. But after that it gets far more complicated to interpret,” he added.  

Neither adherence nor care by a neurologist was different in a consistent direction within the various minority populations, Dr. Mintzer noted. In addition, Black patients were as likely to see a neurologist as White patients but still more likely to be on a first-generation drug.

There are also a few caveats to the findings that should be considered, Dr. Mintzer added. First, the sample included only Medicaid recipients, nearly 35% of whom had a comorbid psychosis. Those and other characteristics of the study pool suggest participants aren’t representative of the United States population as a whole. Second, significant shifts in ASM use have occurred since the study data cutoff in 2014, none of which are reflected in these findings.

“So, I don’t think we can really say how to address this yet,” Dr. Mintzer said. “There’s a lot to explore about whether this is still occurring, how generalizable these findings are, and what they might be due to, as there are a host of potential explanations, which the authors themselves acknowledge.”

The study was funded by the U.S. Centers for Disease Control and Prevention and the National Institute on Minority Health and Health Disparities (NIMHD). Dr. Bensken has received support for this work from NIMHD and serves on the editorial board of the journal Neurology. Dr. Sirven and Dr. Mintzer report no relevant financial relationships.

A version of this article originally appeared on Medscape.com.

Issue
Neurology Reviews - 31(3)

Article Source

FROM NEUROLOGY CLINICAL PRACTICE

Social isolation hikes dementia risk in older adults

Article Type
Changed
Mon, 02/27/2023 - 15:01

Social isolation in older adults increases the risk for developing dementia, new research suggests. Results from a longitudinal study that included more than 5,000 United States–based seniors showed that nearly one-quarter were socially isolated.

After adjusting for demographic and health factors, social isolation was found to be associated with a 28% higher risk for developing dementia over a 9-year period, compared with non-isolation. In addition, this finding held true regardless of race or ethnicity.

“Social connections are increasingly understood as a critical factor for the health of individuals as they age,” senior study author Thomas K.M. Cudjoe, MD, Robert and Jane Meyerhoff Endowed Professor and assistant professor of medicine, Division of Geriatric Medicine and Gerontology, Johns Hopkins University School of Medicine, Baltimore, said in a press release. “Our study expands our understanding of the deleterious impact of social isolation on one’s risk for dementia over time,” Dr. Cudjoe added.

The findings were published online in the Journal of the American Geriatrics Society.
 

Upstream resources, downstream outcomes

Social isolation is a “multidimensional construct” characterized by factors such as social connections, social support, resource sharing, and relationship strain. It also affects approximately a quarter of older adults, the investigators noted.

Although prior studies have pointed to an association between socially isolated older adults and increased risk for incident dementia, no study has described this longitudinal association in a nationally representative cohort of U.S. seniors. 

Dr. Cudjoe said he was motivated to conduct the current study because he wondered whether or not older adults throughout the United States were similar to some of his patients “who might be at risk for worse cognitive outcomes because they lacked social contact with friends, family, or neighbors.”

The study was also “informed by conceptual foundation that upstream social and personal resources are linked to downstream health outcomes, including cognitive health and function,” the researchers added.

They turned to 2011-2020 data from the National Health and Aging Trends Study (NHATS), a nationally representative, longitudinal cohort of U.S. Medicare beneficiaries. The sample was drawn from the Medicare enrollment file and incorporated 95 counties and 655 zip codes.

Participants (n = 5,022; mean age, 76.4 years; 57.2% women; 71.7% White, non-Hispanic; 42.4% having more than a college education) were community-dwelling older adults who completed annual 2-hour interviews that included assessment of function, economic health status, and well-being. To be included, they had to attend at least the baseline and first follow-up visits.

NHATS “includes domains that are relevant for the characterization of social isolation,” the investigators wrote. It used a typology of structural social isolation that is informed by the Berkman-Syme Social Network Index.

Included domains were living arrangements, discussion networks, and participation. All are “clinically relevant, practical, and components of a comprehensive social history,” the researchers noted.

They added that individuals classified as “socially isolated” often live alone, have no one or only one person that they can rely upon to discuss important matters, and have limited or no engagement in social or religious groups.

Social isolation in the study was characterized using questions about living with at least one other person, talking to two or more other people about “important matters” in the past year, attending religious services in the past month, and participating in the past month in such things as clubs, meetings, group activities, or volunteer work.

Wake-up call

Study participants received 1 point for each domain; a sum score of 0 or 1 was classified as “socially isolated” and a score of 2 or more as “not socially isolated.” Participants were classified as having probable dementia based on self-report, lower-than-mean performance in two or more cognitive domains, or a score indicating probable dementia on the AD8 Dementia Screening Interview.

Covariates included demographic factors, education, and health factors. Mean follow-up was 5.1 years.

Results showed close to one-quarter (23.3%) of the study population was classified as socially isolated, with one-fifth (21.1%) developing dementia by the end of the follow-up period.

Compared with non-isolated older adults, those who were socially isolated were more likely to develop dementia during the follow-up period (25.9% vs. 19.6%, respectively).

After adjusting for demographic factors, social isolation was significantly associated with a higher risk for incident dementia (hazard ratio, 1.33; 95% confidence interval, 1.13-1.56). This association persisted after further adjustment for health factors (HR, 1.27; 95% CI, 1.08-1.49). Race and ethnicity had no bearing on the association.

In addition to the association between social isolation and dementia, the researchers also estimated the cause-specific hazard of death before dementia and found that, overall, 18% of participants died prior to dementia over the follow-up period. In particular, the social isolation–associated cause-specific HR of death before dementia was 1.28 (95% CI, 1.2-1.5).

Dr. Cudjoe noted that the mechanism behind the association between social isolation and dementia in this population needs further study. Still, he hopes that the findings will “serve as a wake-up call for all of us to be more thoughtful of the role of social connections on our cognitive health.”

Clinicians “should be thinking about and assessing the presence or absence of social connections in their patients,” Dr. Cudjoe added.
 

‘Instrumental role’

Commenting on the study, Nicole Purcell, DO, neurologist and senior director of clinical practice at the Alzheimer’s Association, said the study “contributes to the growing body of evidence that finds social isolation is a serious public health risk for many seniors living in the United States, increasing their risk for dementia and other serious mental conditions.”

Dr. Purcell, who was not involved with the study, added that “health care systems and medical professionals can play an instrumental role in identifying individuals at risk for social isolation.”

She noted that for those experiencing social isolation, “interaction with health care providers may be one of the few opportunities those individuals have for social engagement, [so] using these interactions to identify individuals at risk for social isolation and referring them to local resources and groups that promote engagement, well-being, and access to senior services may help decrease dementia risk for vulnerable seniors.”

Dr. Purcell added that the Alzheimer’s Association offers early-stage programs throughout the country, including support groups, education, art, music, and other socially engaging activities.

The study was funded by the National Institute on Aging, National Institute on Minority Health and Health Disparities, and Secunda Family Foundation. The investigators and Dr. Purcell have reported no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Issue
Neurology Reviews - 31(3)


Diet packed with fast food found hard on the liver

Article Type
Changed
Fri, 01/20/2023 - 16:19

A new study that quantifies the harm to the liver of eating fast food might motivate people to eat less of it – especially those with obesity or diabetes.

The study finds that getting one-fifth or more of total daily calories from fast food can increase the risk of nonalcoholic fatty liver disease, which can lead to cirrhosis and its complications, including liver failure and liver cancer.


Although the magnitude of association was modest among the general population, “striking” elevations in steatosis were evident among persons with obesity and diabetes who consumed fast food, in comparison with their counterparts who did not have obesity and diabetes, the researchers reported.

“My hope is that this study encourages people to seek out more nutritious, healthy food options and provides information that clinicians can use to counsel their patients, particularly those with underlying metabolic risk factors, of the importance of avoiding foods that are high in fat, carbohydrates, and processed sugars,” lead investigator Ani Kardashian, MD, hepatologist with the University of Southern California, Los Angeles, said in an interview.

“At a policy level, public health efforts are needed to improve access to affordable, healthy, and nutritious food options across the U.S. This is especially important as more people have turned to fast foods during the pandemic and as the price of food has risen dramatically over the past year due to food inflation,” Dr. Kardashian added.

The study was published online in Clinical Gastroenterology and Hepatology.
 

More fast food, greater steatosis

The findings are based on data from 3,954 adults who participated in the 2017-2018 National Health and Nutrition Examination Survey (NHANES) and underwent vibration-controlled transient elastography. Dietary recall data covering 1 or 2 days were available for these participants.

Steatosis, the primary outcome, was measured via controlled attenuation parameter (CAP). Two validated cutoffs were utilized (CAP ≥ 263 dB/m and CAP ≥ 285 dB/m).

Of those surveyed, 52% consumed any fast food, and 29% derived 20% or more of their daily calories from fast food.

Fast-food intake of 20% or more of daily calories was significantly associated with greater steatosis after multivariable adjustment, both as a continuous measure (4.6 dB/m higher CAP score) and with respect to the CAP ≥ 263 dB/m cutoff (odds ratio [OR], 1.45).

“The negative effects are particularly severe in people who already have diabetes and obesity,” Dr. Kardashian told this news organization.

For example, with diabetes and fast-food intake of 20% or more of daily calories, the ORs of meeting the CAP ≥ 263 dB/m cutoff and the CAP ≥ 285 dB/m cutoff were 2.3 and 2.48, respectively.

The researchers said their findings are particularly “alarming,” given the overall increase in fast-food consumption over the past 50 years in the United States, regardless of socioeconomic status.
 

Diet coaching

The finding that fast food has more deleterious impact on those with obesity and diabetes “emphasizes that it is not just one insult but multiple factors that contribute to overall health,” said Nancy Reau, MD, section chief of hepatology at Rush University Medical Center in Chicago.

“This is actually great news, because diet is modifiable, vs. your genetics, which you currently can’t change. This doesn’t mean if you’re lean you can eat whatever you want, but if you are overweight, being careful with your diet does have impact, even if it doesn’t lead to substantial weight changes,” said Dr. Reau, who is not affiliated with the study.

For people who have limited options and need to eat fast food, “there are healthy choices at most restaurants; you just need to be smart about reading labels, watching calories, and ordering the healthier options,” Dr. Reau said in an interview.

Fast food and fatty liver go “hand in hand,” Lisa Ganjhu, DO, gastroenterologist and hepatologist at NYU Langone Health in New York, told this news organization.

“I counsel and coach my patients on healthy diet and exercise, and I’ve been pretty successful,” said Dr. Ganjhu, who was not involved with the study.

“If my patient is eating at McDonald’s a lot, I basically walk through the menu with them and help them find something healthy. When patients see the benefits of cutting out fat and reducing carbohydrates, they are more apt to continue,” Dr. Ganjhu said.

The study was funded by the University of Southern California. Dr. Kardashian, Dr. Reau, and Dr. Ganjhu have disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Article Source

FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY


Nearly 50% of patients with dementia experience falls

Article Type
Changed
Mon, 02/27/2023 - 15:22

Nearly half of older adults with dementia experience falls, suggests new research that also identifies multiple risk factors for these falls.

In a study of more than 5,500 participants, 45.5% of those with dementia experienced one or more falls, compared with 30.9% of their peers without dementia.

Vision impairment and living with a spouse were among the strongest predictors of future fall risk among participants living with dementia. Interestingly, high neighborhood social deprivation, reflected in factors such as income and education, was associated with lower odds of falling.

Overall, the results highlight the need for a multidisciplinary approach to preventing falls among elderly individuals with dementia, said lead author Safiyyah M. Okoye, PhD, assistant professor, College of Nursing and Health Professions, Drexel University, Philadelphia.

“We need to consider different dimensions and figure out how we can try to go beyond the clinic in our interactions,” she said.

Dr. Okoye noted that in addition to reviewing medications that may contribute to falls and screening for vision problems, clinicians might also consider resources to improve the home environment and ensure that families have appropriate caregiving.

The findings were published online in Alzheimer’s & Dementia: The Journal of the Alzheimer’s Association.
 

No ‘silver bullet’

Every year, falls cause millions of injuries in older adults, and those with dementia are especially vulnerable. This population has twice the risk of falling and up to three times the risk of incurring serious fall-related injuries, such as fractures, the researchers noted.

Falls are a leading cause of hospitalization among those with dementia. Previous evidence has shown that persons with dementia are more likely to experience negative health consequences, such as delirium, while in hospital, compared with those without dementia. Even minor fall-related injuries are associated with the patient’s being discharged to a nursing home rather than returning home.

Dr. Okoye stressed that many factors contribute to falls, including health status; function, such as the ability to walk and balance; medications; home environment; and activity level.

“There are multidimensional aspects, and we can’t just find one silver bullet to address falls. It should be addressed comprehensively,” she said.

Existing studies “overwhelmingly” focus on factors related to health and function that could be addressed in the doctor’s office or with a referral, rather than on environmental and social factors, Dr. Okoye noted.

And even though the risk of falling is high among community-dwelling seniors with dementia, very few studies have addressed the risk of falls among these adults, she added.

The new analysis included a nationally representative sample of 5,581 community-dwelling adults who participated in both the 2015 and 2016 National Health and Aging Trends Study (NHATS). The NHATS is a population-based survey of health and disability trends and trajectories among Americans aged 65 years and older.

During interviews, participants were asked, personally or by proxy, about falls during the previous 12 months. Having fallen at baseline was evaluated as a possible predictor of falls in the subsequent 12 months.

To determine probable dementia, researchers asked whether a doctor had ever told the participants that they had dementia or Alzheimer’s disease. They also used a dementia screening questionnaire and neuropsychological tests of memory, orientation, and executive function.

Of the total sample, most (n = 5,093) did not have dementia.

Physical environmental factors that were assessed included conditions at home, such as clutter, tripping hazards, and structural issues, as well as neighborhood social and economic deprivation – such as income, education levels, and employment status.
 

Fall rates and counterintuitive findings

Results showed that significantly more of those with dementia than without experienced one or more falls (45.5% vs. 30.9%; P < .001).

In addition, a history of falling was significantly associated with subsequent falls among those with dementia (odds ratio, 6.20; 95% confidence interval, 3.81-10.09), as was vision impairment (OR, 2.22; 95% CI, 1.12-4.40) and living with a spouse versus alone (OR, 2.43; 95% CI, 1.09-5.43).

A possible explanation for higher fall risk among those living with a partner is that those living alone usually have better functioning, the investigators noted. Also, live-in partners tend to be of a similar age as the person with dementia and may have challenges of their own.

Interestingly, high neighborhood social deprivation was associated with lower odds of falling (OR, 0.55 for the highest deprivation scores; 95% CI, 0.31-0.98), a finding Dr. Okoye said was “counterintuitive.”

This result could be related to the social environment, she noted. “Maybe there are more people around in the house, more people with eyes on the person, or more people in the community who know the person. Despite the low economic resources, there could be social resources there,” she said.

The new findings underscore the idea that falling is a multidimensional phenomenon among older adults with dementia as well as those without dementia, Dr. Okoye noted.

Doctors can play a role in reducing falls among patients with dementia by asking about falls, possibly eliminating medications that are associated with risk of falling, and screening for and correcting vision and hearing impairments, she suggested.

They may also help determine household hazards for a patient, such as clutter and poor lighting, and ensure that these are addressed, Dr. Okoye added.
 

No surprise

Commenting on the study, David S. Knopman, MD, a clinical neurologist at Mayo Clinic, Rochester, Minn., said the finding that visual impairment and a prior history of falling are predictive of subsequent falls “comes as no surprise.”

Dr. Knopman, whose research focuses on late-life cognitive disorders, was not involved with the current study.

Risk reduction is “of course” a key management goal, he said. “Vigilance and optimizing the patient’s living space to reduce fall risks are the major strategies,” he added.

Dr. Knopman reiterated that falls among those with dementia are associated with higher mortality and often lead to loss of the capacity to live outside of an institution.

The study was supported by the National Institute on Aging. The investigators and Dr. Knopman report no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Issue
Neurology Reviews - 31(3)

Nearly half of older adults with dementia experience falls, suggests new research that also identifies multiple risk factors for these falls.

In a study of more than 5,500 participants, 45.5% of those with dementia experienced one or more falls, compared with 30.9% of their peers without dementia.

Vision impairment and living with a spouse were among the strongest predictors of future fall risk among participants living with dementia. Interestingly, high neighborhood social deprivation, which is reflected by such things as income and education, was associated with lower odds of falling.

Overall, the results highlight the need for a multidisciplinary approach to preventing falls among elderly individuals with dementia, said lead author Safiyyah M. Okoye, PhD, assistant professor, College of Nursing and Health Professions, Drexel University, Philadelphia.

“We need to consider different dimensions and figure out how we can try to go beyond the clinic in our interactions,” she said.

Dr. Okoye noted that in addition to reviewing medications that may contribute to falls and screening for vision problems, clinicians might also consider resources to improve the home environment and ensure that families have appropriate caregiving.

The findings were published online  in Alzheimer’s and Dementia: The Journal of the Alzheimer’s Association.
 

No ‘silver bullet’

Every year, falls cause millions of injuries in older adults, and those with dementia are especially vulnerable. This population has twice the risk of falling and up to three times the risk of incurring serious fall-related injuries, such as fractures, the researchers noted.

Falls are a leading cause of hospitalization among those with dementia. Previous evidence has shown that persons with dementia are more likely to experience negative health consequences, such as delirium, while in hospital, compared with those without dementia. Even minor fall-related injuries are associated with the patient’s being discharged to a nursing home rather than returning home.

Dr. Okoye stressed that many factors contribute to falls, including health status; function, such as the ability to walk and balance; medications; home environment; and activity level.

“There are multidimensional aspects, and we can’t just find one silver bullet to address falls. It should be addressed comprehensively,” she said.

Existing studies “overwhelmingly” focus on factors related to health and function that could be addressed in the doctor’s office or with a referral, rather than on environmental and social factors, Dr. Okoye noted.

And even though the risk of falling is high among community-dwelling seniors with dementia, very few studies have addressed the risk of falls among these adults, she added.

The new analysis included a nationally representative sample of 5,581 community-dwelling adults who participated in both the 2015 and 2016 National Health and Aging Trends Study (NHATS). The NHATS is a population-based survey of health and disability trends and trajectories among Americans aged 65 years and older.

During interviews, participants were asked, personally or by proxy, about falls during the previous 12 months. Having fallen at baseline was evaluated as a possible predictor of falls in the subsequent 12 months.

To determine probable dementia, researchers asked whether a doctor had ever told the participants that they had dementia or Alzheimer’s disease. They also used a dementia screening questionnaire and neuropsychological tests of memory, orientation, and executive function.

Of the total sample, most (n = 5,093) did not have dementia.

Physical environmental factors that were assessed included conditions at home, such as clutter, tripping hazards, and structural issues, as well as neighborhood social and economic deprivation – such as income, education levels, and employment status.
 

 

 

Fall rates and counterintuitive findings

Results showed that significantly more of those with dementia than without experienced one or more falls (45.5% vs. 30.9%; P < .001).

In addition, a history of falling was significantly associated with subsequent falls among those with dementia (odds ratio, 6.20; 95% confidence interval, 3.81-10.09), as was vision impairment (OR, 2.22; 95% CI, 1.12-4.40) and living with a spouse versus alone (OR, 2.43; 95% CI, 1.09-5.43).

A possible explanation for higher fall risk among those living with a partner is that those living alone usually have better functioning, the investigators noted. Also, live-in partners tend to be of a similar age as the person with dementia and may have challenges of their own.

Interestingly, high neighborhood social deprivation was associated with lower odds of falling (OR, 0.55 for the highest deprivation scores; 95% CI, 0.31-0.98), a finding Dr. Okoye said was “counterintuitive.”

This result could be related to the social environment, she noted. “Maybe there are more people around in the house, more people with eyes on the person, or more people in the community who know the person. Despite the low economic resources, there could be social resources there,” she said.

The new findings underscore the idea that falling is a multidimensional phenomenon among older adults with dementia as well as those without dementia, Dr. Okoye noted.

Doctors can play a role in reducing falls among patients with dementia by asking about falls, possibly eliminating medications that are associated with risk of falling, and screening for and correcting vision and hearing impairments, she suggested.

They may also help determine household hazards for a patient, such as clutter and poor lighting, and ensure that these are addressed, Dr. Okoye added.
 

No surprise

Commenting on the study, David S. Knopman, MD, a clinical neurologist at Mayo Clinic, Rochester, Minn., said the finding that visual impairment and a prior history of falling are predictive of subsequent falls “comes as no surprise.”

Dr. Knopman, whose research focuses on late-life cognitive disorders, was not involved with the current study.

Risk reduction is “of course” a key management goal, he said. “Vigilance and optimizing the patient’s living space to reduce fall risks are the major strategies,” he added.

Dr. Knopman reiterated that falls among those with dementia are associated with higher mortality and often lead to loss of the capacity to live outside of an institution.

The study was supported by the National Institute on Aging. The investigators and Dr. Knopman report no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Nearly half of older adults with dementia experience falls, suggests new research that also identifies multiple risk factors for these falls.

In a study of more than 5,500 participants, 45.5% of those with dementia experienced one or more falls, compared with 30.9% of their peers without dementia.

Vision impairment and living with a spouse were among the strongest predictors of future fall risk among participants living with dementia. Interestingly, high neighborhood social deprivation, which is reflected by such things as income and education, was associated with lower odds of falling.

Overall, the results highlight the need for a multidisciplinary approach to preventing falls among elderly individuals with dementia, said lead author Safiyyah M. Okoye, PhD, assistant professor, College of Nursing and Health Professions, Drexel University, Philadelphia.

“We need to consider different dimensions and figure out how we can try to go beyond the clinic in our interactions,” she said.

Dr. Okoye noted that in addition to reviewing medications that may contribute to falls and screening for vision problems, clinicians might also consider resources to improve the home environment and ensure that families have appropriate caregiving.

The findings were published online  in Alzheimer’s and Dementia: The Journal of the Alzheimer’s Association.
 

No ‘silver bullet’

Every year, falls cause millions of injuries in older adults, and those with dementia are especially vulnerable. This population has twice the risk of falling and up to three times the risk of incurring serious fall-related injuries, such as fractures, the researchers noted.

Falls are a leading cause of hospitalization among those with dementia. Previous evidence has shown that persons with dementia are more likely to experience negative health consequences, such as delirium, while in hospital, compared with those without dementia. Even minor fall-related injuries are associated with the patient’s being discharged to a nursing home rather than returning home.

Dr. Okoye stressed that many factors contribute to falls, including health status; function, such as the ability to walk and balance; medications; home environment; and activity level.

“There are multidimensional aspects, and we can’t just find one silver bullet to address falls. It should be addressed comprehensively,” she said.

Existing studies “overwhelmingly” focus on factors related to health and function that could be addressed in the doctor’s office or with a referral, rather than on environmental and social factors, Dr. Okoye noted.

And even though the risk of falling is high among community-dwelling seniors with dementia, very few studies have addressed the risk of falls among these adults, she added.

The new analysis included a nationally representative sample of 5,581 community-dwelling adults who participated in both the 2015 and 2016 National Health and Aging Trends Study (NHATS). The NHATS is a population-based survey of health and disability trends and trajectories among Americans aged 65 years and older.

During interviews, participants were asked, personally or by proxy, about falls during the previous 12 months. Having fallen at baseline was evaluated as a possible predictor of falls in the subsequent 12 months.

To determine probable dementia, researchers asked whether a doctor had ever told the participants that they had dementia or Alzheimer’s disease. They also used a dementia screening questionnaire and neuropsychological tests of memory, orientation, and executive function.

Of the total sample, most (n = 5,093) did not have dementia.

Physical environmental factors that were assessed included conditions at home, such as clutter, tripping hazards, and structural issues, as well as neighborhood social and economic deprivation – such as income, education levels, and employment status.
 

 

 

Fall rates and counterintuitive findings

Results showed that significantly more of those with dementia than without experienced one or more falls (45.5% vs. 30.9%; P < .001).

In addition, a history of falling was significantly associated with subsequent falls among those with dementia (odds ratio, 6.20; 95% confidence interval, 3.81-10.09), as were vision impairment (OR, 2.22; 95% CI, 1.12-4.40) and living with a spouse versus alone (OR, 2.43; 95% CI, 1.09-5.43).
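
Odds ratios of this kind are computed from a 2x2 exposure-by-outcome table. As a rough sketch (using made-up counts, not the study's data, and ignoring the covariate adjustment the investigators applied), the point estimate and a Wald-type 95% confidence interval look like this:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald confidence interval from a 2x2 table:
    a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical counts for illustration only:
or_, lower, upper = odds_ratio_ci(40, 60, 10, 90)
print(round(or_, 2), round(lower, 2), round(upper, 2))  # 6.0 2.79 12.91
```

Because the published estimates were adjusted for covariates, they cannot be reproduced from raw counts this way; the sketch only shows where the numbers come from.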

A possible explanation for the higher fall risk among those living with a partner is that those living alone usually have better functioning, the investigators noted. Also, live-in partners tend to be about the same age as the person with dementia and may have challenges of their own.

Interestingly, high neighborhood social deprivation was associated with lower odds of falling (OR, 0.55 for the highest deprivation scores; 95% CI, 0.31-0.98), a finding Dr. Okoye said was “counterintuitive.”

This result could be related to the social environment, she noted. “Maybe there are more people around in the house, more people with eyes on the person, or more people in the community who know the person. Despite the low economic resources, there could be social resources there,” she said.

The new findings underscore the idea that falling is a multidimensional phenomenon among older adults with dementia as well as those without dementia, Dr. Okoye noted.

Doctors can play a role in reducing falls among patients with dementia by asking about falls, possibly eliminating medications that are associated with risk of falling, and screening for and correcting vision and hearing impairments, she suggested.

They may also help determine household hazards for a patient, such as clutter and poor lighting, and ensure that these are addressed, Dr. Okoye added.
 

No surprise

Commenting on the study, David S. Knopman, MD, a clinical neurologist at Mayo Clinic, Rochester, Minn., said the finding that visual impairment and a prior history of falling are predictive of subsequent falls “comes as no surprise.”

Dr. Knopman, whose research focuses on late-life cognitive disorders, was not involved with the current study.

Risk reduction is “of course” a key management goal, he said. “Vigilance and optimizing the patient’s living space to reduce fall risks are the major strategies,” he added.

Dr. Knopman reiterated that falls among those with dementia are associated with higher mortality and often lead to loss of the capacity to live outside of an institution.

The study was supported by the National Institute on Aging. The investigators and Dr. Knopman report no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Issue
Neurology Reviews - 31(3)

Article Source

FROM ALZHEIMER’S AND DEMENTIA

Hearing loss strongly tied to increased dementia risk

Article Type
Changed
Thu, 01/19/2023 - 16:25

Dementia prevalence is 61% higher among older people with moderate to severe hearing loss than among those with normal hearing, new national data show. Investigators also found that even mild hearing loss was associated with increased dementia risk, although the association was not statistically significant, and that hearing aid use was tied to a 32% decrease in dementia prevalence.

“Every 10-decibel increase in hearing loss was associated with 16% greater prevalence of dementia, such that prevalence of dementia in older adults with moderate or greater hearing loss was 61% higher than prevalence in those with normal hearing,” said lead investigator Alison Huang, PhD, senior research associate in epidemiology at Johns Hopkins Bloomberg School of Public Health and core faculty in the Cochlear Center for Hearing and Public Health, Baltimore.

The findings were published online in JAMA.
 

Dose-dependent effect

For their study, researchers analyzed data on 2,413 community-dwelling participants in the National Health and Aging Trends Study, a nationally representative, continuous panel study of U.S. Medicare beneficiaries aged 65 and older.

Data from the study was collected during in-home interviews, setting it apart from previous work that relied on data collected in a clinical setting, Dr. Huang said.

“This study was able to capture more vulnerable populations, such as the oldest old and older adults with disabilities, typically excluded from prior epidemiologic studies of the hearing loss–dementia association that use clinic-based data collection, which only captures people who have the ability and means to get to clinics,” Dr. Huang said.

Weighted hearing loss prevalence was 36.7% for mild and 29.8% for moderate to severe hearing loss, and weighted prevalence of dementia was 10.3%.

Those with moderate to severe hearing loss were 61% more likely to have dementia than were those with normal hearing (prevalence ratio, 1.61; 95% confidence interval [CI], 1.09-2.38).

Dementia prevalence increased with increasing severity of hearing loss: normal hearing, 6.19% (95% CI, 4.31-8.80); mild hearing loss, 8.93% (95% CI, 6.99-11.34); and moderate to severe hearing loss, 16.52% (95% CI, 13.81-19.64). But only moderate to severe hearing loss showed a statistically significant association with dementia (P = .02).

Dementia prevalence increased 16% per 10-decibel increase in hearing loss (prevalence ratio, 1.16; P < .001).
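
A per-10-decibel prevalence ratio compounds multiplicatively across larger hearing differences. As a back-of-the-envelope illustration only (the study estimated the 61% figure directly from its data, not by compounding), applying the 1.16 ratio across a roughly 30-decibel gap lands close to the reported magnitude:

```python
def compounded_ratio(per_10db_ratio, db_gap):
    """Compound a per-10-dB prevalence ratio across a decibel gap."""
    return per_10db_ratio ** (db_gap / 10)

# 1.16 per 10 dB, compounded over ~30 dB:
print(round(compounded_ratio(1.16, 30), 2))  # 1.56
```

The result, about a 56% higher prevalence, is in the same range as the reported 61% for moderate or greater hearing loss.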

Among the 853 individuals in the study with moderate to severe hearing loss, those who used hearing aids (n = 414) had a 32% lower prevalence of dementia compared with those who didn’t use assistive devices (prevalence ratio, 0.68; 95% CI, 0.47-1.00). Similar data were published in JAMA Neurology, suggesting that hearing aids reduce dementia risk.

“With this study, we were able to refine our understanding of the strength of the hearing loss–dementia association in a study more representative of older adults in the United States,” said Dr. Huang.
 

Robust association

Commenting on the findings, Justin S. Golub, MD, associate professor in the department of otolaryngology–head and neck surgery at Columbia University, New York, said the study supports earlier research and suggests a “robust” association between hearing loss and dementia.

“The particular advantage of this study was that it was high quality and nationally representative,” Dr. Golub said. “It is also among a smaller set of studies that have shown hearing aid use to be associated with lower risk of dementia.”

Although not statistically significant, researchers did find increasing prevalence of dementia among people with only mild hearing loss, and clinicians should take note, said Dr. Golub, who was not involved with this study.

“We would expect the relationship between mild hearing loss and dementia to be weaker than severe hearing loss and dementia and, as a result, it might take more participants to show an association among the mild group,” Dr. Golub said.

“Even though this particular study did not specifically find a relationship between mild hearing loss and dementia, I would still recommend people to start treating their hearing loss when it is early,” Dr. Golub added.

The study was funded by the National Institute on Aging. Dr. Golub reports no relevant financial relationships. Full disclosures for study authors are included in the original article.

A version of this article first appeared on Medscape.com.


Dietary zinc seen reducing migraine risk

Article Type
Changed
Mon, 02/27/2023 - 15:18

People with higher dietary zinc intake have a nearly one-third lower risk of migraine than those who get little zinc in their diets, according to results from a cross-sectional study of more than 11,000 American adults.

For their research, published online in Headache, Huanxian Liu, MD, and colleagues at Chinese PLA General Hospital in Beijing analyzed publicly available data from the U.S. National Health and Nutrition Examination Survey to determine whether people self-reporting migraine or severe headache had lower zinc intake than people without migraine. The data, collected between 1999 and 2004, included information on foods and drinks consumed by participants over a 24-hour period, along with additional health information.
 

An inverse relationship

The investigators divided their study’s 11,088 participants (mean age, 46.5 years; 50% female) into quintiles based on dietary zinc consumption as inferred from foods eaten. They also considered zinc supplementation, for which data was available for 4,324 participants, of whom 2,607 reported use of supplements containing zinc.

Some 20% of the cohort (n = 2,236) reported migraine or severe headache within the previous 3 months. Pregnant women were excluded from analysis, and the investigators adjusted for a range of covariates, including age, sex, ethnicity, education level, body mass, smoking, diabetes, cardiovascular disease, and nutritional factors.

Dr. Liu and colleagues reported an inverse association between dietary zinc consumption and migraine, with the highest-consuming quintile of the cohort (15.8 mg or more zinc per day) seeing the lowest risk of migraine (odds ratio, 0.70; 95% confidence interval, 0.52-0.94; P = .029), compared with the lowest-consuming quintile (5.9 mg or less daily). Among people getting high levels of zinc (19.3-32.5 mg daily) through supplements, the odds of migraine were lower still, with ORs ranging from 0.62 (95% CI, 0.46-0.83; P = .019) to 0.67 (95% CI, 0.49-0.91; P = .045).

While the investigators acknowledged limitations of the study, including its cross-sectional design and use of a broad question to discern prevalence of migraine, the findings suggest that “zinc is an important nutrient that influences migraine,” they wrote, citing evidence for its antioxidant and anti-inflammatory properties.
 

The importance of nutritional factors

Commenting on the research findings, Deborah I. Friedman, MD, MPH, a headache specialist in Dallas, said that Dr. Liu and colleagues’ findings added to a growing information base about nutritional factors and migraine. For example, “low magnesium levels are common in people with migraine, and magnesium supplementation is a recommended preventive treatment for migraine.”

Dr. Friedman cited a recent study showing that vitamin B12 and magnesium supplementation in women, combined with high-intensity interval training, “silenced” the inflammation signaling pathway, helped migraine pain, and decreased levels of calcitonin gene-related peptide. A 2022 randomized trial found that alpha-lipoic acid supplementation reduced migraine severity, frequency, and disability in women with episodic migraine.

Vitamin D levels are also lower in people with migraine than in controls, Dr. Friedman noted, and a randomized trial of 2,000 IU of vitamin D3 daily found reductions in monthly headache days, attack duration, severe headaches, and analgesic use, compared with placebo. Other nutrients implicated in migraine include coenzyme Q10, calcium, folic acid, vitamin B6, and vitamin B1.

“What should a patient with migraine do with all of this information? First, eat a healthy and balanced diet,” Dr. Friedman said. “Sources of dietary zinc include red meat, nuts, legumes, poultry, shellfish (especially oysters), whole grains, some cereals, and even dark chocolate. The recommended daily dosage of zinc is 9.5 mg in men and 7 mg in women. Most people get enough zinc in their diet; vegetarians, vegans, pregnant or breastfeeding women, and adults over age 65 may need to take supplemental zinc.”

Dr. Liu and colleagues’ work was supported by China’s National Natural Science Foundation. The investigators reported no financial conflicts of interest. Dr. Friedman has received financial support from Alder, Allergan, Amgen, Biohaven, Eli Lilly, Merck, Teva, and other pharmaceutical manufacturers.

Issue
Neurology Reviews - 31(3)

Article Source

FROM HEADACHE

Modified Atkins diet beneficial in drug-resistant epilepsy

Article Type
Changed
Thu, 02/09/2023 - 15:10

Adding a modified Atkins diet to standard antiseizure treatments significantly reduces seizure frequency in patients with drug-resistant epilepsy compared with medication alone, new research shows.

In a randomized prospective study, the number of seizures per month dropped by more than half in one-quarter of patients following the high-fat, low-carb diet, and 5% of the group were free from all seizure activity after 6 months.

Both adults and adolescents reported benefits from the diet, which is a less strict version of a traditional ketogenic diet that many patients find difficult to follow. The modified Atkins diet includes foods such as leafy green vegetables and eggs, chicken, fish, bacon, and other animal proteins.

“The use of an exchange list and recipe booklet with local recipes and spices helped in the initiation of modified Atkins diet with the flexibility of meal choices and ease of administration,” said coinvestigator Manjari Tripathi, MD, DM, department of neurology, All India Institute of Medical Science, New Delhi.

“As items were everyday household ingredients in proportion to the requirement of the modified Atkins diet, this diet is possible in low-income countries also,” Dr. Tripathi added.

The findings were published online in the journal Neurology.
 

Low carbs, high benefit

The modified Atkins diet includes around 65% fat, 25% protein, and 10% carbohydrates. Unlike a traditional ketogenic diet, the modified Atkins diet includes no restrictions on protein, calories, or fluids.

Researchers have long known that ketogenic and Atkins diets are associated with reduced seizure activity in adolescents with epilepsy. But previous studies were small, and many were retrospective analyses.

The current investigators enrolled 160 patients (80 adults, 80 adolescents) aged 10-55 years whose epilepsy was not controlled despite using at least three antiseizure medications at maximum tolerated doses.

The intervention group received training in the modified Atkins diet and were given a food exchange list, sample menu, and recipe booklet. Carbohydrate intake was restricted to 20 grams per day.

Participants took supplemental multivitamins and minerals, kept a food diary, logged seizure activity, and measured urine ketone levels three times a day. They also received weekly check-up phone calls to ensure diet adherence.

The control group received a normal diet with no carbohydrate restrictions. All participants continued their prescribed antiseizure therapy throughout the trial.
 

Primary outcome met

The primary study outcome was a reduction in seizures of more than 50%. At 6 months, 26.2% of the intervention group had reached that goal, compared with just 2.5% of the control group (P < .001).

Median seizure frequency in the intervention group dropped from 37.5 per month at baseline to 27.5 per month after 3 months on the modified Atkins diet and to 21.5 per month after 6 months.

Adding the modified Atkins diet had a larger effect on seizure activity in adults than in adolescents. At the end of 6 months, 36% of adolescents on the modified Atkins diet had more than a 50% reduction in seizures, while 57.1% of adults on the diet reached that level.

Quality-of-life scores were also higher in the intervention group.

By the end of the trial, 5% of patients on the modified Atkins diet had no seizure activity at all versus none of the control group. In fact, the median number of seizures increased in the control group during the study.

The mean morning and evening urine ketone levels in the intervention group were 58.3 ± 8.0 mg/dL and 62.2 ± 22.6 mg/dL, respectively, suggesting satisfactory diet adherence. There was no significant difference in weight loss between groups.

Dr. Tripathi noted that 33% of participants did not complete the study because of poor tolerance of the diet, lack of benefit, or the inability to follow up – in part due to COVID-19. However, she said tolerance of the modified Atkins diet was better than what has been reported with the ketogenic diet.

“Though the exact mechanism by which such a diet protects against seizures is unknown, there is evidence that it causes effects on intermediary metabolism that influences the dynamics of the major inhibitory and excitatory neurotransmitter systems in the brain,” Dr. Tripathi said.

Benefits outweigh cost

Commenting on the research findings, Mackenzie Cervenka, MD, professor of neurology and director of the Adult Epilepsy Diet Center at Johns Hopkins University, Baltimore, noted that the study is the first randomized controlled trial of this size to demonstrate a benefit from adding the modified Atkins diet to standard antiseizure therapy in treatment-resistant epilepsy.

“Importantly, the study also showed improvement in quality of life and behavior over standard-of-care therapies without significant adverse effects,” said Dr. Cervenka, who was not part of the research.

The investigators noted that the flexibility of the modified Atkins diet allows more variation in menu options and a greater intake of protein, making it easier to follow than a traditional ketogenic diet.

One area of debate, however, is whether these diets are manageable for individuals with low income. Poultry, meat, and fish, all of which are staples of a modified Atkins diet, can be more expensive than other high-carb options such as pasta and rice.

“While some of the foods such as protein sources that patients purchase when they are on a ketogenic diet therapy can be more expensive, if you take into account the cost of antiseizure medications and other antiseizure treatments, hospital visits, and missed work related to seizures, et cetera, the overall financial benefits of seizure reduction with incorporating a ketogenic diet therapy may outweigh these costs,” Dr. Cervenka said.

“There are also low-cost foods that can be used since there is a great deal of flexibility with a modified Atkins diet,” she added.

The study was funded by the Centre of Excellence for Epilepsy, which is funded by the Department of Biotechnology, Government of India. Dr. Tripathi and Dr. Cervenka report no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Issue
Neurology Reviews - 31(2)


FROM NEUROLOGY


Screen all patients for cannabis use before surgery: Guideline

Article Type
Changed
Wed, 01/11/2023 - 14:38

If you smoke, vape, or ingest cannabis, your anesthesiologist should know before you undergo a surgical procedure, according to new medical guidelines.

All patients who undergo procedures that require regional or general anesthesia should be asked if, how often, and in what forms they use the drug, according to recommendations from the American Society of Regional Anesthesia and Pain Medicine.

One reason: Patients who regularly use cannabis may experience worse pain and nausea after surgery and may require more opioid analgesia, the group said.

The society’s recommendations – published in Regional Anesthesia and Pain Medicine – are the first guidelines in the United States to cover cannabis use as it relates to surgery, the group said.
 

Possible interactions

Use of cannabis has increased in recent years, and researchers have been concerned that the drug may interact with anesthesia and complicate pain management. Few studies have evaluated interactions between cannabis and anesthetic agents, however, according to the authors of the new guidelines.

“With the rising prevalence of both medical and recreational cannabis use in the general population, anesthesiologists, surgeons, and perioperative physicians must have an understanding of the effects of cannabis on physiology in order to provide safe perioperative care,” the guideline said.

“Before surgery, anesthesiologists should ask patients if they use cannabis – whether medicinally or recreationally – and be prepared to possibly change the anesthesia plan or delay the procedure in certain situations,” Samer Narouze, MD, PhD, ASRA president and senior author of the guidelines, said in a news release about the recommendations.

Although some patients may use cannabis to relieve pain, research shows that “regular users may have more pain and nausea after surgery, not less, and may need more medications, including opioids, to manage the discomfort,” said Dr. Narouze, chairman of the Center for Pain Medicine at Western Reserve Hospital in Cuyahoga Falls, Ohio.
 

Risks for vomiting, heart attack

The new recommendations were created by a committee of 13 experts, including anesthesiologists, chronic pain physicians, and a patient advocate. Shalini Shah, MD, vice chair of anesthesiology at the University of California, Irvine, was lead author of the document.

Four of 21 recommendations were classified as grade A, meaning that following them would be expected to provide substantial benefits. Those recommendations are to screen all patients before surgery; postpone elective surgery for patients who have altered mental status or impaired decision-making capacity at the time of surgery; counsel frequent, heavy users about the potential for cannabis use to impair postoperative pain control; and counsel pregnant patients about the risks of cannabis use to unborn children.

The authors cited studies to support their recommendations, including one showing that long-term cannabis use was associated with a 20% increase in the incidence of postoperative nausea and vomiting, a leading complaint of surgery patients. Other research has shown that cannabis use is linked to more pain and use of opioids after surgery.

Other recommendations include delaying elective surgery for at least 2 hours after a patient has smoked cannabis, owing to an increased risk for heart attack, and considering adjustment of ventilation settings during surgery for regular smokers of cannabis. Research has shown that smoking cannabis may be a rare trigger for myocardial infarction and is associated with airway inflammation and self-reported respiratory symptoms.

Nevertheless, doctors should not conduct universal toxicology screening, given a lack of evidence supporting this practice, the guideline stated.

The authors did not have enough information to make recommendations about reducing cannabis use before surgery or adjusting opioid prescriptions after surgery for patients who use cannabis, they said.

Kenneth Finn, MD, president of the American Board of Pain Medicine, welcomed the publication of the new guidelines. Dr. Finn, who practices at Springs Rehabilitation in Colorado Springs, has edited a textbook about cannabis in medicine and founded the International Academy on the Science and Impact of Cannabis.

“The vast majority of medical providers really have no idea about cannabis and what its impacts are on the human body,” Dr. Finn said.

For one, it can interact with numerous other drugs, including warfarin.

Guideline coauthor Eugene R. Viscusi, MD, professor of anesthesiology at the Sidney Kimmel Medical College, Philadelphia, emphasized that, while cannabis may be perceived as “natural,” it should not be considered differently from manufactured drugs.

Cannabis and cannabinoids represent “a class of very potent and pharmacologically active compounds,” Dr. Viscusi said in an interview. While researchers continue to assess possible medically beneficial effects of cannabis compounds, clinicians also need to be aware of the risks.

“The literature continues to emerge, and while we are always hopeful for good news, as physicians, we need to be very well versed on potential risks, especially in a high-risk situation like surgery,” he said.

Dr. Shah has consulted for companies that develop medical devices and drugs. Dr. Finn is the editor of the textbook “Cannabis in Medicine: An Evidence-Based Approach” (Springer: New York, 2020), for which he receives royalties.

A version of this article first appeared on Medscape.com.


For one, it can interact with numerous other drugs, including warfarin.

Guideline coauthor Eugene R. Viscusi, MD, professor of anesthesiology at the Sidney Kimmel Medical College, Philadelphia, emphasized that, while cannabis may be perceived as “natural,” it should not be considered differently from manufactured drugs.

Cannabis and cannabinoids represent “a class of very potent and pharmacologically active compounds,” Dr. Viscusi said in an interview. While researchers continue to assess possible medically beneficial effects of cannabis compounds, clinicians also need to be aware of the risks.

“The literature continues to emerge, and while we are always hopeful for good news, as physicians, we need to be very well versed on potential risks, especially in a high-risk situation like surgery,” he said.

Dr. Shah has consulted for companies that develop medical devices and drugs. Dr. Finn is the editor of the textbook, “Cannabis in Medicine: An Evidence-Based Approach” (Springer: New York, 2020), for which he receives royalties.

A version of this article first appeared on Medscape.com.

Publications
Publications
Topics
Article Type
Sections
Article Source

FROM REGIONAL ANESTHESIA AND PAIN MEDICINE

Disallow All Ads
Content Gating
No Gating (article Unlocked/Free)
Alternative CME
Disqus Comments
Default
Use ProPublica
Hide sidebar & use full width
render the right sidebar.
Conference Recap Checkbox
Not Conference Recap
Clinical Edge
Display the Slideshow in this Article
Medscape Article
Display survey writer
Reuters content
Disable Inline Native ads
WebMD Article

Is thrombolysis safe for stroke patients on DOACs?

Article Type
Changed
Thu, 02/09/2023 - 15:09

Stroke patients receiving thrombolysis do not appear to be at increased risk of symptomatic intracerebral hemorrhage (sICH) if they are taking direct oral anticoagulants (DOACs), a new study has found.

The study, the largest ever regarding the safety of thrombolysis in patients on DOACs, actually found a lower rate of sICH among patients taking DOACs than among those not taking anticoagulants.

“Thrombolysis is a backbone therapy in stroke, but the large population of patients who take DOACs are currently excluded from this treatment because DOAC use is a contraindication to treatment with thrombolysis. This is based on the presumption of an increased risk of sICH, but data to support or refute this presumption are lacking,” said senior author David J. Seiffge, MD, Bern University Hospital, Switzerland.

“Our results suggest that current guidelines need to be revised to remove the absolute contraindication of thrombolysis in patients on DOACs. The guidelines need to be more liberal on the use of thrombolysis in these patients,” he added.

“This study provides the basis for extending vital thrombolysis treatment to this substantial population of patients who take DOACs,” Dr. Seiffge said.

He estimates that 1 of every 6 stroke patients is taking a DOAC and that 1% to 2% of patients taking DOACs have a stroke each year. “As millions of patients are on DOACs, this is a large number of people who are not getting potentially life-saving thrombolysis therapy.”

Dr. Seiffge comments: “In our hospital we see at least one stroke patient on DOACs every day. It is a very frequent scenario. With this new data, we believe many of these patients could now benefit from thrombolysis without an increased bleeding risk.”

The study was published online in JAMA Neurology.
 

An international investigation

While thrombolysis is currently contraindicated for patients taking DOACs, some clinicians still administer thrombolysis to these patients. Different selection strategies are used, including the use of DOAC reversal agents prior to thrombolysis or the selection of patients with low anticoagulant activity, the authors noted.

The current study involved an international collaboration. The investigators compared the risk of sICH among patients who had recently taken DOACs and who underwent thrombolysis as treatment for acute ischemic stroke with the risk among control stroke patients who underwent thrombolysis but who had not been taking DOACs.

Potential contributing centers were identified by a systematic search of the literature based on published studies on the use of thrombolysis for patients who had recently taken DOACs or prospective stroke registries that may include patients who had recently taken DOACs.

The study included 832 patients from 64 centers worldwide who were confirmed to have taken a DOAC within 48 hours of receiving thrombolysis for acute ischemic stroke. The comparison group was made up of 32,375 patients who had experienced ischemic stroke that was treated with thrombolysis but who had received no prior anticoagulation therapy.

Compared with control patients, patients who had recently taken DOACs were older; the incidence of hypertension among them was higher; they had a higher degree of prestroke disability; they were less likely to be smokers; the time from symptom onset to treatment was longer; they had experienced more severe stroke; and they were more likely to have a large-vessel occlusion.

Of the patients taking DOACs, 30.3% received DOAC reversal prior to thrombolysis. For 27.0%, DOAC plasma levels were measured. The remainder were treated with thrombolysis without either of these selection methods.

Results showed that the unadjusted rate of sICH was 2.5% among patients taking DOACs, compared with 4.1% among control patients who were not taking anticoagulants.

After adjustment for stroke severity and other baseline sICH predictors, patients who had recently taken DOACs and who received thrombolysis had lower odds of developing sICH (adjusted odds ratio, 0.57; 95% confidence interval, 0.36-0.92; P = .02).

There was no difference between the selection strategies, and results were consistent in different sensitivity analyses.

The secondary outcome of any ICH occurred in 18.0% of patients taking DOACs, compared with 17.4% of control patients who used no anticoagulants. After adjustment, there was no difference in the odds for any ICH between the groups (aOR, 1.18; 95% CI, 0.95-1.45; P = .14).

The unadjusted rate of functional independence was 45% among patients taking DOACs, compared with 57% among control patients. After adjustment, patients who had recently taken DOACs and who underwent thrombolysis had numerically higher odds of being functionally independent than control patients, although this difference did not reach statistical significance (aOR, 1.13; 95% CI, 0.94-1.36; P = .20).

The association of DOAC therapy with lower odds of sICH remained when mechanical thrombectomy, large-vessel occlusion, or concomitant antiplatelet therapy was added to the model.

“This is by far the largest study to look at this issue of thrombolytic use in patients on DOACs, and we did not find any group on DOACs that had an excess ICH rate with thrombolysis,” Dr. Seiffge said.

He explained that receiving warfarin was at one time an absolute contraindication for thrombolysis, but after a 2014 study suggested that the risk was not increased for patients with an international normalized ratio below 1.7, this was downgraded to a relative contraindication.

“We think our study is comparable and should lead to a guideline change,” Dr. Seiffge commented.

“A relative contraindication allows clinicians the space to make a considered decision on an individual basis,” he added.

Dr. Seiffge said that at his hospital, local guidelines regarding this issue have already been changed on the basis of these data, and use of DOACs is now considered a relative contraindication.

“International guidelines can take years to update, so in the meantime, I think other centers will also go ahead with a more liberal approach. There are always some centers that are ahead of the guidelines,” he added.

Although the lower risk of sICH seen in patients who have recently used DOACs seems counterintuitive at first glance, there could be a pathophysiologic explanation for this finding, the authors suggest.

They point out that thrombin inhibition, either directly or via the coagulation cascade, might be protective against the occurrence of sICH.

“Anticoagulants may allow the clot to respond better to thrombolysis – the clot is not as solid and is easier to recanalize. This leads to smaller strokes and a lower bleeding risk. Thrombin generation is also a major driver for blood brain barrier breakdown. DOACs reduce thrombin generation, so reduce blood brain barrier breakdown and reduce bleeding,” Dr. Seiffge explained. “But these are hypotheses,” he added.
Study ‘meaningfully advances the field’

In an accompanying editorial, Eva A. Mistry, MBBS, University of Cincinnati, said the current study “meaningfully advances the field” and provides an estimation of safety of intravenous thrombolysis among patients who have taken DOACs within 48 hours of hospital admission.

She lists strengths of the study as inclusion of a large number of patients across several geographically diverse institutions with heterogeneous standard practices for thrombolysis with recent DOAC use and narrow confidence intervals regarding observed rates of sICH.

“Further, the upper bound of this confidence interval for the DOAC group is below 4%, which is a welcome result and provides supportive data for clinicians who already practice thrombolysis for patients with recent DOAC ingestion,” Dr. Mistry adds.

However, she points out several study limitations, which she says limit immediate, widespread clinical applicability.

These include use of a nonconcurrent control population, which included patients from centers that did not contribute to the DOAC group, and the inclusion of Asian patients who likely received a lower thrombolytic dose.

Dr. Seiffge noted that the researchers did adjust for Asian patients but not for the thrombolytic dosage. “I personally do not think this affects the results, as Asian patients have a lower dosage because they have a higher bleeding risk. The lower bleeding risk with DOACs was seen in all continents.”

Dr. Mistry also suggests that the DOAC group itself is prone to selection bias from preferential thrombolysis of patients receiving DOAC who are at lower risk of sICH.

But Dr. Seiffge argued: “I think, actually, the opposite is true. The DOAC patients were older, had more severe comorbidities, and an increased bleeding risk.”

Dr. Mistry concluded, “Despite the limitations of the study design and enrolled population, these data may be used by clinicians to make individualized decisions regarding thrombolysis among patients with recent DOAC use. Importantly, this study lays the foundation for prospective, well-powered studies that definitively determine the safety of thrombolysis in this population.”

The study was supported by a grant from the Bangerter-Rhyner Foundation. Dr. Seiffge received grants from Bangerter Rhyner Foundation during the conduct of the study and personal fees from Bayer, Alexion, and VarmX outside the submitted work. Dr. Mistry receives grant funding from the National Institute of Neurological Disorders and Stroke and serves as a consultant for RAPID AI.

A version of this article first appeared on Medscape.com.

Issue
Neurology Reviews - 31(2)
Publications
Topics
Sections


Issue
Neurology Reviews - 31(2)
Publications
Publications
Topics
Article Type
Sections
Article Source

From JAMA Neurology

Disallow All Ads
Content Gating
No Gating (article Unlocked/Free)
Alternative CME
Disqus Comments
Default
Use ProPublica
Hide sidebar & use full width
render the right sidebar.
Conference Recap Checkbox
Not Conference Recap
Clinical Edge
Display the Slideshow in this Article
Medscape Article
Display survey writer
Reuters content
Disable Inline Native ads
WebMD Article