
Antiamyloids linked to accelerated brain atrophy

Changed: Fri, 04/07/2023 - 13:46

Anti–amyloid-beta drugs, which are used in the management of Alzheimer’s disease (AD), have the potential to compromise long-term brain health by accelerating brain atrophy, a comprehensive meta-analysis of MRI data from clinical trials suggests.

Depending on the anti–amyloid-beta drug class, these agents can accelerate loss of whole brain and hippocampal volume and increase ventricular volume. This has been shown for some of the beta-secretase inhibitors and for several of the antiamyloid monoclonal antibodies, researchers noted.

“These data warrant concern, but we can’t make any firm conclusions yet. It is possible that the finding is not detrimental, but the usual interpretation of this finding is that volume changes are a surrogate for disease progression,” study investigator Scott Ayton, PhD, of the Florey Institute of Neuroscience and Mental Health, University of Melbourne, said in an interview.

“These data should be factored into the decisions by clinicians when they consider prescribing antiamyloid therapies. Like any side effect, clinicians should inform patients regarding the risk of brain atrophy. Patients should be actively monitored for this side effect,” Dr. Ayton said.

The study was published online in Neurology.
 

Earlier progression from MCI to AD?

Dr. Ayton and colleagues evaluated brain volume changes in 31 clinical trials of anti–amyloid-beta drugs that demonstrated a favorable change in at least one biomarker of pathological amyloid-beta and included detailed MRI data sufficient to assess the volumetric changes in at least one brain region.

A meta-analysis of the highest dose in each trial showed drug-induced acceleration of volume changes in the hippocampus, ventricles, and whole brain that varied by anti–amyloid-beta drug class.

Secretase inhibitors accelerated atrophy in the hippocampus (mean difference –37.1 mcL; –19.6% relative to change in placebo) and whole brain (mean difference –3.3 mL; –21.8% relative to change in placebo), but not ventricles.

Conversely, monoclonal antibodies caused accelerated ventricular enlargement (mean difference +1.3 mL; +23.8% relative to change in placebo), which was driven by the subset of monoclonal antibodies that induce amyloid-related imaging abnormalities (ARIA) (+2.1 mL; +38.7% relative to change in placebo). There was a “striking correlation between ventricular volume and ARIA frequency,” the investigators reported.

The effect of ARIA-inducing monoclonal antibodies on whole brain volume varied, with accelerated whole brain volume loss caused by donanemab (mean difference –4.6 mL; +23% relative to change in placebo) and lecanemab (–5.2 mL; +36.4% relative to change in placebo). This was not observed with aducanumab and bapineuzumab.

Monoclonal antibodies did not cause accelerated volume loss in the hippocampus, regardless of whether they caused ARIA.

The researchers also modeled the effect of anti–amyloid-beta drugs on brain volume changes. In this analysis, participants with mild cognitive impairment (MCI) treated with anti–amyloid-beta drugs were projected to have a “material regression” toward brain volumes typical of AD roughly 8 months earlier than untreated peers.

The data, they note, “permit robust conclusions regarding the effect of [anti–amyloid-beta] drug classes on different brain structures, but the lack of individual patient data (which has yet to be released) limits the interpretations of our findings.”

“Questions like which brain regions are impacted by [anti–amyloid-beta] drugs and whether the volume changes are related to ARIA, plaque loss, cognitive/noncognitive outcomes, or clinical factors such as age, sex, and apoE4 genotype can and should be addressed with available data,” said Dr. Ayton.

Dr. Ayton and colleagues called on data and safety monitoring boards (DSMBs) for current clinical trials of anti–amyloid-beta drugs to review volumetric data to determine if patient safety is at risk, particularly in patients who develop ARIA.

In addition, they noted ethics boards that approve trials for anti–amyloid-beta drugs “should request that volume changes be actively monitored. Long-term follow-up of brain volumes should be factored into the trial designs to determine if brain atrophy is progressive, particularly in patients who develop ARIA.”

Finally, they added that drug companies that have conducted trials of anti–amyloid-beta drugs should interrogate prior data on brain volume, report the findings, and release the data for researchers to investigate.

“I have been banging on about this for years,” said Dr. Ayton. “Unfortunately, my raising of this issue has not led to any response. The data are not available, and the basic questions haven’t been asked (publicly).”

Commendable research

In an accompanying editorial, Frederik Barkhof, MD, PhD, with Amsterdam University Medical Centers, and David Knopman, MD, with Mayo Clinic Alzheimer’s Disease Research Center, Rochester, Minn., wrote that the investigators should be “commended” for their analysis. 

“The reality in 2023 is that the relevance of brain volume reductions in this therapeutic context remains uncertain,” they wrote.

“Longer periods of observation will be needed to know whether the brain volume losses continue at an accelerated rate or if they attenuate or disappear. Ultimately, it’s the clinical outcomes that matter, regardless of the MRI changes,” Barkhof and Knopman concluded.

The research was supported by funds from the Australian National Health & Medical Research Council. Dr. Ayton reported being a consultant for Eisai in the past 3 years. Dr. Barkhof reported serving on the data and safety monitoring board for Prothena and the A45-AHEAD studies; being a steering committee member for Merck, Bayer, and Biogen; and being a consultant for IXICO, Roche, Celltrion, Rewind Therapeutics, and Combinostics. Dr. Knopman reported serving on the DSMB for the Dominantly Inherited Alzheimer Network Treatment Unit study; serving on a DSMB for a tau therapeutic for Biogen; being an investigator for clinical trials sponsored by Biogen, Lilly Pharmaceuticals, and the University of Southern California. He reported consulting with Roche, Samus Therapeutics, Magellan Health, BioVie, and Alzeca Biosciences.

A version of this article first appeared on Medscape.com.


Magnesium-rich diet linked to lower dementia risk

Changed: Fri, 04/07/2023 - 14:04

A magnesium-rich diet has been linked to better brain health, an outcome that may help lower dementia risk, new research suggests.

Investigators studied more than 6,000 cognitively healthy individuals, aged 40-73, and found that those who consumed more than 550 mg of magnesium daily had a brain age approximately 1 year younger by age 55 years, compared with a person who consumed a normal magnesium intake (~360 mg per day).

“This research highlights the potential benefits of a diet high in magnesium and the role it plays in promoting good brain health,” lead author Khawlah Alateeq, a PhD candidate in neuroscience at Australian National University’s National Centre for Epidemiology and Population Health, said in an interview.

Clinicians “can use [the findings] to counsel patients on the benefits of increasing magnesium intake through a healthy diet and monitoring magnesium levels to prevent deficiencies,” she stated.

The study was published online in the European Journal of Nutrition.
 

Promising target

The researchers were motivated to conduct the study because of “the growing concern over the increasing prevalence of dementia,” Ms. Alateeq said.

“Since there is no cure for dementia, and the development of pharmacological treatment for dementia has been unsuccessful over the last 30 years, prevention has been suggested as an effective approach to address the issue,” she added.

Nutrition, Ms. Alateeq said, is a “modifiable risk factor that can influence brain health and is highly amenable to scalable and cost-effective interventions.” It represents “a promising target” for risk reduction at a population level.

Previous research shows individuals with lower magnesium levels are at higher risk for AD, while those with higher dietary magnesium intake may be at lower risk of progressing from normal aging to cognitive impairment.

Most previous studies, however, included participants older than age 60 years, and it’s “unclear when the neuroprotective effects of dietary magnesium become detectable,” the researchers note.

Moreover, dietary patterns change and fluctuate, potentially leading to changes in magnesium intake over time. These changes may have as much impact as absolute magnesium intake at any single point in time.

In light of the “current lack of understanding of when and to what extent dietary magnesium exerts its protective effects on the brain,” the researchers examined the association between magnesium trajectories over time, brain matter, and white matter lesions.

They also examined the association between magnesium and several different blood pressure measures (mean arterial pressure, systolic blood pressure, diastolic blood pressure, and pulse pressure).

Since cardiovascular health, neurodegeneration, and brain shrinkage patterns differ between men and women, the researchers stratified their analyses by sex.
 

Brain volume differences

The researchers analyzed the dietary magnesium intake of 6,001 individuals (mean age, 55.3 years) selected from the UK Biobank – a prospective cohort study of participants aged 37-73 at baseline, who were assessed between 2005 and 2023.

For the current study, only participants with baseline DBP and SBP measurements and structural MRI scans were included. Participants were also required to be free of neurologic disorders and to have an available record of dietary magnesium intake.

Covariates included age, sex, education, health conditions, smoking status, body mass index, amount of physical activity, and alcohol intake.

Over a 16-month period, participants completed an online questionnaire five times. Their responses were used to calculate daily magnesium intake. Foods of particular interest included leafy green vegetables, legumes, nuts, seeds, and whole grains, all of which are magnesium rich.

They used latent class analysis (LCA) to “identify mutually exclusive subgroup[s] (classes) of magnesium intake trajectory separately for men and women.”

Men had a slightly higher prevalence of BP medication and diabetes, compared with women, and postmenopausal women had a higher prevalence of BP medication and diabetes, compared with premenopausal women.

Compared with lower baseline magnesium intake, higher baseline dietary intake of magnesium was associated with larger brain volumes in several regions in both men and women.

The latent class analysis identified three classes of magnesium intake trajectory: high-decreasing, normal-stable, and low-increasing.
In women in particular, the “high-decreasing” trajectory was significantly associated with larger brain volumes, compared with the “normal-stable” trajectory, while the “low-increasing” trajectory was associated with smaller brain volumes.



Even an increase of 1 mg of magnesium per day (above 350 mg/day) made a difference in brain volume, especially in women. The changes associated with every 1-mg increase are found in the table below:



Associations between magnesium and BP measures were “mostly nonsignificant,” the researchers say, and the neuroprotective effect of higher magnesium intake in the high-decreasing trajectory was greater in postmenopausal versus premenopausal women.

“Our models indicate that compared to somebody with a normal magnesium intake (~350 mg per day), somebody in the top quartile of magnesium intake (≥ 550 mg per day) would be predicted to have a ~0.20% larger GM and ~0.46% larger RHC,” the authors summarize.

“In a population with an average age of 55 years, this effect corresponds to ~1 year of typical aging,” they note. “In other words, if this effect is generalizable to other populations, a 41% increase in magnesium intake may lead to significantly better brain health.”

Although the exact mechanisms underlying magnesium’s protective effects are “not yet clearly understood, there’s considerable evidence that magnesium levels are related to better cardiovascular health. Magnesium supplementation has been found to decrease blood pressure – and high blood pressure is a well-established risk factor for dementia,” said Ms. Alateeq.

Association, not causation

Yuko Hara, PhD, director of Aging and Prevention at the Alzheimer’s Drug Discovery Foundation (ADDF), noted that the study is observational and therefore shows an association, not causation.

“People eating a high-magnesium diet may also be eating a brain-healthy diet and getting high levels of nutrients/minerals other than magnesium alone,” suggested Dr. Hara, who was not involved with the study.

She noted that many foods are good sources of magnesium, including spinach, almonds, cashews, legumes, yogurt, brown rice, and avocados.

“Eating a brain-healthy diet (for example, the Mediterranean diet) is one of the Seven Steps to Protect Your Cognitive Vitality that ADDF’s Cognitive Vitality promotes,” she said.

Open Access funding was enabled and organized by the Council of Australian University Librarians and its Member Institutions. Ms. Alateeq, her co-authors, and Dr. Hara declare no relevant financial relationships.

A version of this article originally appeared on Medscape.com.


“This research highlights the potential benefits of a diet high in magnesium and the role it plays in promoting good brain health,” lead author Khawlah Alateeq, a PhD candidate in neuroscience at Australian National University’s National Centre for Epidemiology and Population Health, said in an interview.

Clinicians “can use [the findings] to counsel patients on the benefits of increasing magnesium intake through a healthy diet and monitoring magnesium levels to prevent deficiencies,” she stated.

The study was published online in the European Journal of Nutrition.

Promising target

The researchers were motivated to conduct the study because of “the growing concern over the increasing prevalence of dementia,” Ms. Alateeq said.

“Since there is no cure for dementia, and the development of pharmacological treatment for dementia has been unsuccessful over the last 30 years, prevention has been suggested as an effective approach to address the issue,” she added.

Nutrition, Ms. Alateeq said, is a “modifiable risk factor that can influence brain health and is highly amenable to scalable and cost-effective interventions.” It represents “a promising target” for risk reduction at a population level.

Previous research shows individuals with lower magnesium levels are at higher risk for AD, while those with higher dietary magnesium intake may be at lower risk of progressing from normal aging to cognitive impairment.

Most previous studies, however, included participants older than age 60 years, and it’s “unclear when the neuroprotective effects of dietary magnesium become detectable,” the researchers note.

Moreover, dietary patterns change and fluctuate, potentially leading to changes in magnesium intake over time. These changes may have as much impact as absolute magnesium intake at any single point in time.

In light of the “current lack of understanding of when and to what extent dietary magnesium exerts its protective effects on the brain,” the researchers examined the association between magnesium intake trajectories over time, brain volumes, and white matter lesions.

They also examined the association between magnesium and several different blood pressure measures (mean arterial pressure, systolic blood pressure, diastolic blood pressure, and pulse pressure).

Since cardiovascular health, neurodegeneration, and brain shrinkage patterns differ between men and women, the researchers stratified their analyses by sex.

Brain volume differences

The researchers analyzed the dietary magnesium intake of 6,001 individuals (mean age, 55.3 years) selected from the UK Biobank – a prospective cohort study of participants aged 37-73 at baseline, who were assessed between 2005 and 2023.

For the current study, only participants with baseline diastolic and systolic blood pressure (DBP and SBP) measurements and structural MRI scans were included. Participants were also required to be free of neurologic disorders and to have an available record of dietary magnesium intake.

Covariates included age, sex, education, health conditions, smoking status, body mass index, amount of physical activity, and alcohol intake.

Over a 16-month period, participants completed an online questionnaire five times. Their responses were used to calculate daily magnesium intake. Foods of particular interest included leafy green vegetables, legumes, nuts, seeds, and whole grains, all of which are magnesium rich.

The researchers used latent class analysis (LCA) to “identify mutually exclusive subgroup (classes) of magnesium intake trajectory separately for men and women.”

Men had a slightly higher prevalence of BP medication and diabetes, compared with women, and postmenopausal women had a higher prevalence of BP medication and diabetes, compared with premenopausal women.

Compared with lower baseline magnesium intake, higher baseline dietary intake of magnesium was associated with larger brain volumes in several regions in both men and women.

The latent class analysis identified three classes of magnesium intake trajectory: high-decreasing, normal-stable, and low-increasing.
In women in particular, the “high-decreasing” trajectory was significantly associated with larger brain volumes, compared with the “normal-stable” trajectory, while the “low-increasing” trajectory was associated with smaller brain volumes.
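As a toy illustration of what these trajectory labels describe (this is not the study's latent class analysis; the rule and the 350 mg/day "normal" threshold below are invented for illustration, based on the article's description of a normal intake of ~350 mg per day):

```python
# Toy illustration only: the study used latent class analysis, not this rule.
def trajectory_label(intakes, normal=350):
    """Label a sequence of daily magnesium intakes (mg/day) by its
    starting level and direction of change."""
    start, end = intakes[0], intakes[-1]
    if start > normal and end < start:
        return "high-decreasing"
    if start < normal and end > start:
        return "low-increasing"
    return "normal-stable"

print(trajectory_label([600, 550, 500]))  # high-decreasing
print(trajectory_label([300, 320, 340]))  # low-increasing
```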



Even an increase of 1 mg of magnesium per day (above 350 mg/day) made a difference in brain volume, especially in women. The changes associated with every 1-mg increase are found in the table below:



Associations between magnesium and BP measures were “mostly nonsignificant,” the researchers say, and the neuroprotective effect of higher magnesium intake in the high-decreasing trajectory was greater in postmenopausal versus premenopausal women.

“Our models indicate that compared to somebody with a normal magnesium intake (~350 mg per day), somebody in the top quartile of magnesium intake (≥ 550 mg per day) would be predicted to have a ~0.20% larger GM [gray matter volume] and ~0.46% larger RHC [right hippocampal volume],” the authors summarize.

“In a population with an average age of 55 years, this effect corresponds to ~1 year of typical aging,” they note. “In other words, if this effect is generalizable to other populations, a 41% increase in magnesium intake may lead to significantly better brain health.”
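The "one year of typical aging" conversion is simple arithmetic: divide the predicted volume difference by an assumed annual rate of volume loss. The ~0.2%-per-year midlife atrophy rate below is an assumption chosen to reproduce the authors' statement, not a figure reported in the article:

```python
# Sketch of the conversion, under an assumed annual atrophy rate.
def equivalent_years_of_aging(volume_diff_pct, annual_atrophy_pct=0.2):
    """Years of typical aging equivalent to a given percentage volume
    difference, given an assumed annual percentage volume loss."""
    return volume_diff_pct / annual_atrophy_pct

# A ~0.20% larger gray matter volume, with an assumed ~0.2%/year loss,
# corresponds to roughly 1 year of typical aging.
print(equivalent_years_of_aging(0.20))
```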

Although the exact mechanisms underlying magnesium’s protective effects are “not yet clearly understood, there’s considerable evidence that magnesium levels are related to better cardiovascular health. Magnesium supplementation has been found to decrease blood pressure – and high blood pressure is a well-established risk factor for dementia,” said Ms. Alateeq.

Association, not causation

Yuko Hara, PhD, director of Aging and Prevention, Alzheimer’s Drug Discovery Foundation, noted that the study is observational and therefore shows an association, not causation.

“People eating a high-magnesium diet may also be eating a brain-healthy diet and getting high levels of nutrients/minerals other than magnesium alone,” suggested Dr. Hara, who was not involved with the study.

She noted that many foods are good sources of magnesium, including spinach, almonds, cashews, legumes, yogurt, brown rice, and avocados.

“Eating a brain-healthy diet (for example, the Mediterranean diet) is one of the Seven Steps to Protect Your Cognitive Vitality that ADDF’s Cognitive Vitality promotes,” she said.

Open Access funding was enabled and organized by the Council of Australian University Librarians and its Member Institutions. Ms. Alateeq, her co-authors, and Dr. Hara declare no relevant financial relationships.

A version of this article originally appeared on Medscape.com.

FROM EUROPEAN JOURNAL OF NUTRITION

Cancer risk elevated after stroke in younger people

Article Type
Changed
Tue, 04/04/2023 - 17:43


Younger people who experience stroke or intracerebral hemorrhage have about a three- to fivefold increased risk of being diagnosed with cancer in the next few years, new research shows.

In young people, stroke might be the first manifestation of an underlying cancer, according to the investigators, led by Jamie Verhoeven, MD, PhD, with the department of neurology, Radboud University Medical Centre, Nijmegen, the Netherlands.

The new study can be viewed as a “stepping stone for future studies investigating the usefulness of screening for cancer after stroke,” the researchers say.

The study was published online in JAMA Network Open.

Currently, the diagnostic workup for young people with stroke includes searching for rare clotting disorders, although screening for cancer is not regularly performed.

Some research suggests that stroke and cancer are linked, but the literature is limited. In prior studies among people of all ages, cancer incidence after stroke has been variable – from 1% to 5% at 1 year and from 11% to 30% after 10 years.

To the team’s knowledge, only two studies have described the incidence of cancer after stroke among younger patients. One put the risk at 0.5% for people aged 18-50 years in the first year after stroke; the other described a cumulative risk of 17.3% in the 10 years after stroke for patients aged 18-55 years.

Using Dutch data, Dr. Verhoeven and colleagues identified 27,616 young stroke patients (age, 15-49 years; median age, 45 years) and 362,782 older stroke patients (median age, 76 years).

The cumulative incidence of any new cancer at 10 years was 3.7% among the younger stroke patients and 8.5% among the older stroke patients.

The incidence of a new cancer after stroke among younger patients was higher among women than men, while the opposite was true for older stroke patients.

Compared with the general population, younger stroke patients had a more than 2.5-fold greater likelihood of being diagnosed with a new cancer in the first year after ischemic stroke (standardized incidence ratio, 2.6). The risk was highest for lung cancer (SIR, 6.9), followed by hematologic cancers (SIR, 5.2).
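A standardized incidence ratio is simply the number of observed cases in the cohort divided by the number expected from general-population incidence rates. A sketch of that arithmetic (the counts below are invented for illustration and are not the study's data):

```python
# SIR = observed cases / expected cases, where "expected" comes from
# applying general-population incidence rates to the cohort.
# The counts below are invented for illustration.
def standardized_incidence_ratio(observed, expected):
    return observed / expected

# e.g., 26 observed cancers where population rates would predict 10
print(standardized_incidence_ratio(26, 10))
```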

Compared with the general population, younger stroke patients had nearly a 5.5-fold greater likelihood of being diagnosed with a new cancer in the first year after intracerebral hemorrhage (SIR, 5.4), and the risk was highest for hematologic cancers (SIR, 14.2).

In younger patients, the cumulative incidence of any cancer decreased over the years but remained significantly higher for 8 years following a stroke.

For patients aged 50 years or older, the 1-year risk for any new cancer after either ischemic stroke or intracerebral hemorrhage was 1.2 times higher, compared with the general population.

“We typically think of occult cancer as being a cause of stroke in an older population, given that the incidence of cancer increases over time [but] what this study shows is that we probably do need to consider occult cancer as an underlying cause of stroke even in a younger population,” said Laura Gioia, MD, stroke neurologist at the University of Montreal, who was not involved in the research.

Dr. Verhoeven and colleagues conclude that their finding supports the hypothesis of a causal link between cancer and stroke. Given the timing between stroke and cancer diagnosis, cancer may have been present when the stroke occurred and possibly played a role in causing it, the authors note. However, conclusions on causal mechanisms cannot be drawn from the current study.

The question of whether young stroke patients should be screened for cancer is a tough one, Dr. Gioia noted. “Cancer represents a small percentage of causes of stroke. That means you would have to screen a lot of people with a benefit that is still uncertain for the moment,” Dr. Gioia said in an interview.

“I think we need to keep cancer in mind as a cause of stroke in our young patients, and that should probably guide our history-taking with the patient and consider imaging when it’s appropriate and when we think that there could be an underlying occult cancer,” Dr. Gioia suggested.

The study was funded in part through unrestricted funding by Stryker, Medtronic, and Cerenovus. Dr. Verhoeven and Dr. Gioia have disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.


FROM JAMA NETWORK OPEN


Cluster, migraine headache strongly linked to circadian rhythm

Article Type
Changed
Mon, 04/03/2023 - 14:18


Cluster headache and migraine have strong ties to the circadian system at multiple levels, say new findings that could have significant treatment implications.

A meta-analysis of 16 studies showed a circadian pattern in 71% of cluster headache attacks (3,490 of 4,953), with a clear circadian peak between 9:00 p.m. and 3:00 a.m.

Migraine was also associated with a circadian pattern in 50% of cases (2,698 of 5,385) across eight studies, with a clear circadian trough between 11:00 p.m. and 7:00 a.m.

Seasonal peaks were also evident for cluster headache (spring and autumn) and migraine (April to October).
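Because the reported cluster headache peak spans midnight, deciding whether an attack time falls inside it requires a wrap-around comparison. A small sketch (the helper is illustrative, not from the study):

```python
# Illustrative helper, not from the study: test whether an hour-of-day
# falls in a clock window that may wrap past midnight, such as the
# reported cluster headache peak between 21:00 and 03:00.
def in_window(hour, start, end):
    if start <= end:
        return start <= hour < end
    return hour >= start or hour < end  # window wraps past midnight

print(in_window(23, 21, 3))  # True (inside the 9 p.m.-3 a.m. peak)
print(in_window(12, 21, 3))  # False
```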

“In the short term, these findings help us explain the timing to patients – for example, it is possible that a headache at 8 a.m. is due to their internal body clock instead of their pillow, or breakfast food, or morning medications,” lead investigator Mark Burish, MD, PhD, associate professor, department of neurosurgery, at University of Texas Health Houston, told this news organization.

“In the long term, these findings do suggest that medications that target the circadian system could be effective in migraine and headache patients,” Dr. Burish added.

The study was published online in Neurology.



Treatment implications?

Across studies, chronotype was “highly variable” for both cluster headache and migraine, the investigators report.

Cluster headache was associated with lower melatonin and higher cortisol levels, compared with non–cluster headache controls.

On a genetic level, cluster headache was associated with two core circadian genes (CLOCK and REV-ERB–alpha), and five of the nine genes that increase the likelihood of having cluster headache are genes with a circadian pattern of expression.

Migraine headache was associated with lower urinary melatonin levels and with the core circadian genes, CK1-delta and ROR-alpha, and 110 of the 168 genes associated with migraine were clock-controlled genes.

“The data suggest that both of these headache disorders are highly circadian at multiple levels, especially cluster headache,” Dr. Burish said in a release.

“This reinforces the importance of the hypothalamus – the area of the brain that houses the primary biological clock – and its role in cluster headache and migraine. It also raises the question of the genetics of triggers such as sleep changes that are known triggers for migraine and are cues for the body’s circadian rhythm,” Dr. Burish said.

“We hope that future research will look into circadian medications as a new treatment option for migraine and cluster headache patients,” Dr. Burish told this news organization.

Importance of sleep regulation

The authors of an accompanying editorial note that even though the study doesn’t have immediate clinical implications, it offers a better understanding of the way chronobiologic factors may influence treatment.

“At a minimum, interventions known to regulate and improve sleep (e.g., melatonin, cognitive behavioral therapy), and which are safe and straightforward to introduce, may be useful in some individuals susceptible to circadian misalignment or sleep disorders,” write Heidi Sutherland, PhD, and Lyn Griffiths, PhD, with Queensland University of Technology, Brisbane, Australia.

“Treatment of comorbidities (e.g., insomnia) that result in sleep disturbances may also help headache management. Furthermore, chronobiological aspects of any pharmacological interventions should be considered, as some frequently used headache and migraine drugs can modulate circadian cycles and influence the expression of circadian genes (e.g., verapamil), or have sleep-related side effects,” they add.

A limitation of the study was the lack of information on factors that could influence the circadian cycle, such as medications; other disorders, such as bipolar disorder; or circadian rhythm issues, such as night-shift work.

The study was supported by grants from the Japan Society for the Promotion of Science, the National Institutes of Health, The Welch Foundation, and The Will Erwin Headache Research Foundation. Dr. Burish is an unpaid member of the medical advisory board of Clusterbusters, and a site investigator for a cluster headache clinical trial funded by Lundbeck. Dr. Sutherland has received grant funding from the U.S. Migraine Research Foundation, and received institute support from Queensland University of Technology for genetics research. Dr. Griffiths has received grant funding from the Australian NHMRC, U.S. Department of Defense, and the U.S. Migraine Research Foundation, and consultancy funding from TEVA.

A version of this article first appeared on Medscape.com.

Publications
Topics
Sections

 

Cluster headache and migraine have strong ties to the circadian system at multiple levels, say new findings that could have significant treatment implications.

A meta-analysis of 16 studies showed a circadian pattern in 71% of cluster headache attacks (3,490 of 4,953), with a clear circadian peak between 9:00 p.m. and 3:00 a.m.

Migraine was also associated with a circadian pattern in 50% of cases (2,698 of 5,385) across eight studies, with a clear circadian trough between 11:00 p.m. and 7:00 a.m.

Seasonal peaks were also evident for cluster headache (spring and autumn) and migraine (April to October).

“In the short term, these findings help us explain the timing to patients – for example, it is possible that a headache at 8 a.m. is due to their internal body clock instead of their pillow, or breakfast food, or morning medications,” lead investigator Mark Burish, MD, PhD, associate professor, department of neurosurgery, at University of Texas Health Houston, told this news organization.

“In the long term, these findings do suggest that medications that target the circadian system could be effective in migraine and headache patients,” Dr. Burish added.

The study was published online in Neurology.


 

Treatment implications?

Across studies, chronotype was “highly variable” for both cluster headache and migraine, the investigators report.

Cluster headache was associated with lower melatonin and higher cortisol levels, compared with non–cluster headache controls.

On a genetic level, cluster headache was associated with two core circadian genes (CLOCK and REV-ERB–alpha), and five of the nine genes that increase the likelihood of having cluster headache are genes with a circadian pattern of expression.

Migraine headache was associated with lower urinary melatonin levels and with the core circadian genes, CK1-delta and ROR-alpha, and 110 of the 168 genes associated with migraine were clock-controlled genes.

“The data suggest that both of these headache disorders are highly circadian at multiple levels, especially cluster headache,” Dr. Burish said in a release.

“This reinforces the importance of the hypothalamus – the area of the brain that houses the primary biological clock – and its role in cluster headache and migraine. It also raises the question of the genetics of triggers such as sleep changes that are known triggers for migraine and are cues for the body’s circadian rhythm,” Dr. Burish said.

Cluster headache and migraine have strong ties to the circadian system at multiple levels, say new findings that could have significant treatment implications.

A meta-analysis of 16 studies showed a circadian pattern in 71% of cluster headache attacks (3,490 of 4,953), with a clear circadian peak between 9:00 p.m. and 3:00 a.m.

Migraine was also associated with a circadian pattern in 50% of cases (2,698 of 5,385) across eight studies, with a clear circadian trough between 11:00 p.m. and 7:00 a.m.

Seasonal peaks were also evident for cluster headache (spring and autumn) and migraine (April to October).

“In the short term, these findings help us explain the timing to patients – for example, it is possible that a headache at 8 a.m. is due to their internal body clock instead of their pillow, or breakfast food, or morning medications,” lead investigator Mark Burish, MD, PhD, associate professor, department of neurosurgery, at University of Texas Health Houston, told this news organization.

“In the long term, these findings do suggest that medications that target the circadian system could be effective in migraine and headache patients,” Dr. Burish added.

The study was published online in Neurology.


 

Treatment implications?

Across studies, chronotype was “highly variable” for both cluster headache and migraine, the investigators report.

Cluster headache was associated with lower melatonin and higher cortisol levels, compared with non–cluster headache controls.

On a genetic level, cluster headache was associated with two core circadian genes (CLOCK and REV-ERB–alpha), and five of the nine genes that increase the likelihood of having cluster headache are genes with a circadian pattern of expression.

Migraine was associated with lower urinary melatonin levels and with two core circadian genes, CK1-delta and ROR-alpha; in addition, 110 of the 168 genes associated with migraine were clock-controlled genes.

“The data suggest that both of these headache disorders are highly circadian at multiple levels, especially cluster headache,” Dr. Burish said in a release.

“This reinforces the importance of the hypothalamus – the area of the brain that houses the primary biological clock – and its role in cluster headache and migraine. It also raises the question of the genetics of triggers such as sleep changes that are known triggers for migraine and are cues for the body’s circadian rhythm,” Dr. Burish said.

“We hope that future research will look into circadian medications as a new treatment option for migraine and cluster headache patients,” Dr. Burish told this news organization.
 

Importance of sleep regulation

The authors of an accompanying editorial note that even though the study doesn’t have immediate clinical implications, it offers a better understanding of the way chronobiologic factors may influence treatment.

“At a minimum, interventions known to regulate and improve sleep (e.g., melatonin, cognitive behavioral therapy), and which are safe and straightforward to introduce, may be useful in some individuals susceptible to circadian misalignment or sleep disorders,” write Heidi Sutherland, PhD, and Lyn Griffiths, PhD, with Queensland University of Technology, Brisbane, Australia.

“Treatment of comorbidities (e.g., insomnia) that result in sleep disturbances may also help headache management. Furthermore, chronobiological aspects of any pharmacological interventions should be considered, as some frequently used headache and migraine drugs can modulate circadian cycles and influence the expression of circadian genes (e.g., verapamil), or have sleep-related side effects,” they add.

A limitation of the study was the lack of information on factors that could influence the circadian cycle, such as medications; other disorders, such as bipolar disorder; or circadian rhythm issues, such as night-shift work.

The study was supported by grants from the Japan Society for the Promotion of Science, the National Institutes of Health, The Welch Foundation, and The Will Erwin Headache Research Foundation. Dr. Burish is an unpaid member of the medical advisory board of Clusterbusters, and a site investigator for a cluster headache clinical trial funded by Lundbeck. Dr. Sutherland has received grant funding from the U.S. Migraine Research Foundation, and received institute support from Queensland University of Technology for genetics research. Dr. Griffiths has received grant funding from the Australian NHMRC, U.S. Department of Defense, and the U.S. Migraine Research Foundation, and consultancy funding from TEVA.

A version of this article first appeared on Medscape.com.

FROM NEUROLOGY

Do B vitamins reduce Parkinson’s risk?

Article Type
Changed
Mon, 04/03/2023 - 14:25

 

Increasing intake of folate and vitamin B6 beyond recommended daily levels offers no protective benefit against Parkinson’s disease (PD), a new study shows.

Though there was some evidence that vitamin B12 early in life was associated with decreased PD risk, the findings were inconsistent and were observed only in people whose daily intake was 10 times the recommended level.

“The results of this large prospective study do not support the hypothesis that increasing folate or vitamin B6 intakes above the current levels would reduce PD risk in this population of mostly White U.S. health professionals,” lead investigator Mario H. Flores-Torres, MD, PhD, a research scientist in the department of nutrition at the Harvard T.H. Chan School of Public Health, Boston, said in an interview.

However, he added, the study “leaves open the possibility that in some individuals the intake of vitamin B12 contributes to PD risk – a finding that warrants further research.”

The findings were published online in Movement Disorders.
 

Mixed findings

Previous studies have suggested B vitamins – including folate, B6 and B12 – might affect PD risk, but results have been mixed.

The new study included 80,965 women from the Nurses’ Health Study (1984-2016) and 48,837 men from the Health Professionals Follow-up Study (1986-2016). The average age at baseline was 50 years in women and 54 years in men, and participants were followed for about 30 years.

Participants completed questionnaires about diet at the beginning of the study and again every 4 years.

To account for the possibility of reverse causation due to the long prodromal phase of PD, investigators conducted lagged analyses at 8, 12, 16, and 20 years.

During the follow-up period, 1,426 incident cases of PD were diagnosed (687 in women and 739 in men).

Researchers found no link between reduced PD risk and intake of vitamin B6 or folate.

Though the total cumulative average intake of vitamin B12 was not associated with PD risk, investigators noted a modest decrease in risk among participants with the highest baseline B12 intake, compared with those with the lowest (hazard ratio, 0.80; P = .01).

Individuals in the highest quintile of B12 intake at baseline had an average intake of 21-22 mcg/d, close to 10 times the recommended daily intake of 2.4 mcg/d.

“Although some of our results suggest that a higher intake of vitamin B12 may decrease the risk of PD in a population of U.S. health professionals, the associations we observed were modest and not entirely consistent,” Dr. Flores-Torres said.

“Additional studies need to confirm our findings to better understand whether people who take higher amounts of B12 younger in life may have a protective benefit against PD,” he added.
 

The whole picture?

Commenting on the findings for this article, Rebecca Gilbert, MD, PhD, chief scientific officer of the American Parkinson Disease Association, New York, noted that checking B vitamin levels is a fairly standard practice for most clinicians. In that regard, this study highlights why this is important.

“Neurologists will often test B12 levels and recommend a supplement if your level is below the normal range,” she said. “No one is questioning the value of B12 for nerves, and we recommend that B12 be in the normal to high-normal range.”

But understanding how B vitamins may or may not affect PD risk might require a different kind of study.

“This analysis, much like many others, is trying so hard to figure out what is it in diets that affects Parkinson’s disease risk,” Dr. Gilbert said. “But we have yet to say these are the nutrients that prevent Parkinson’s or increase the risk.”

One reason for the conflicting results in studies such as this could be that the explanation for the link between diet and PD risk lies not in specific nutrients consumed but in the diet as a whole.

“Focusing on specific elements of a diet may not give us the answer,” Dr. Gilbert said. “We should be analyzing diet as a complete holistic picture because it’s not just the elements but how everything in what we eat works together.”

The study was funded by the National Institutes of Health and the Parkinson’s Foundation. Dr. Flores-Torres and Dr. Gilbert report no relevant conflicts.
 

A version of this article originally appeared on Medscape.com.

FROM MOVEMENT DISORDERS

Exercise tied to reduced Parkinson’s motor symptoms and increased well-being

Article Type
Changed
Thu, 03/30/2023 - 13:07

Physical exercise may improve the motor symptoms and quality of life for patients with Parkinson’s disease, new research shows. A systematic review of 156 clinical trials involving 8,000 patients with Parkinson’s disease showed dancing and aquatic exercise, in particular, were most likely to improve motor symptoms, while swimming, endurance training, and mind-body training were most likely to benefit quality of life.

“For most types of exercise we studied, we observed positive effects on both the severity of motor signs and quality of life. These results highlight the importance of exercise in general, as they suggest people with Parkinson’s disease can benefit from a variety of exercises,” said study investigator Moritz Ernst, MSc, deputy head of the working group on evidence-based medicine at the University Hospital Cologne (Germany).


“Clinicians and people with Parkinson’s disease may have several options of exercise programs to choose from when establishing an individual training routine,” he added, emphasizing that overall those with Parkinson’s disease should seek professional advice, including assessment of motor and nonmotor symptoms, to develop a training agenda based on their individual needs.

The study was published online in the Cochrane Database of Systematic Reviews.
 

May I have this dance?

The investigators analyzed data from randomized, controlled trials comparing different types of exercise with one another or with no exercise and the subsequent effect on Parkinson’s disease symptoms. Exercise types included dance, strength/resistance training, mind-body training such as tai chi and yoga, water-based training, gait/balance/functional training, and endurance training.

The average age of study participants ranged from 60 to 74 years, and most of the studies included patients with mild to moderate Parkinson’s disease. The mean length of the various interventions was 12 weeks.

When the researchers examined the effect of exercise on motor symptoms, they found that dance (P = .88), aqua-based training (P = .69), and gait/balance/functional training (P = .67) were most likely to reduce symptom severity.

Aqua-based training (P = .95), endurance training (P = .77), and mind-body training (P = .75) were most likely to benefit quality of life, although the investigators caution that these findings were at risk of bias because quality of life was self-reported.

The investigators noted other study limitations including the fact that most of the studies included in the review had small sample sizes and their study only included patients with mild to moderate versus severe Parkinson’s disease.

The authors said that future research should include larger samples, report intent-to-treat analyses, and involve participants with more advanced forms of Parkinson’s disease who may also have cognitive difficulties.
 

Prescribe exercise

“We should be giving our patients, no matter where they are in their disease stage, a ‘prescription’ to exercise,” said Mitra Afshari, MD, MPH. Dr. Afshari was not involved in the study but leads her own research on Parkinson’s disease and exercise as the site principal investigator on the National Institutes of Health–funded SPARX3 Study in Parkinson’s Disease and Exercise at Rush University in Chicago. She said that, based on her experience caring for patients with Parkinson’s disease at all disease stages, “patients who have been physically active their whole lives and can maintain that activity despite their diagnosis fare the best.”

However, she added, those who initiate physical exercise after diagnosis can also do very well and reap benefits, including improved motor symptoms.

The study was funded by University Hospital of Cologne, Faculty of Medicine and University Hospital, University of Cologne, and the German Ministry of Education and Research. The authors have disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.

FROM THE COCHRANE DATABASE OF SYSTEMATIC REVIEWS

Poor bone health is a ‘robust’ dementia risk factor

Article Type
Changed
Thu, 03/30/2023 - 07:52

Low bone mineral density (BMD), particularly at the femoral neck, emerged as a “robust” risk factor for dementia in older adults in the long-running Rotterdam Study. After adjusting for relevant factors, adults with the lowest versus highest BMD at the femoral neck were 42% more likely to develop dementia over roughly 10 years.

“Our research has found a link between bone loss and dementia, but further studies are needed to better understand this connection between bone density and memory loss,” study investigator Mohammad Arfan Ikram, MD, PhD, with Erasmus University Medical Center in Rotterdam, the Netherlands, said in a statement.

“It’s possible that bone loss may occur already in the earliest phases of dementia, years before any clinical symptoms manifest themselves. If that were the case, bone loss could be an indicator of risk for dementia and people with bone loss could be targeted for screening and improved care,” Dr. Ikram added.

The study was published online in Neurology.


 

Common bedfellows

Low BMD and dementia commonly co-occur in the older population, with bone loss accelerating in dementia patients because of physical inactivity and poor nutrition. However, the extent to which bone loss already exists prior to the onset of dementia remains unclear.

The new findings are based on 3,651 adults (mean age, 72 years; 58% women) in the Rotterdam Study who were free of dementia between 2002 and 2005. At that time, BMD at the femoral neck, lumbar spine, and total body was measured using dual-energy x-ray absorptiometry (DXA), and the trabecular bone score, which captures additional detail such as bone microarchitecture, was calculated. Participants were followed up until Jan. 1, 2020.

Analyses were adjusted for age, sex, education, physical activity, smoking status, body mass index, blood pressure, cholesterol, history of comorbidities (stroke and diabetes), and apolipoprotein E genotype.

During follow-up, 688 (19%) participants developed dementia, mostly Alzheimer’s disease (77%).

Throughout the entire follow-up period, lower BMD at the femoral neck (per standard deviation), but not at other bone sites, correlated with a higher risk for all-cause dementia (hazard ratio, 1.12; 95% confidence interval, 1.02-1.23) and Alzheimer’s disease (HR, 1.14; 95% CI, 1.02-1.28).

Within the first 10 years after baseline, the risk for dementia was greatest in individuals with the lowest BMD at the femoral neck (HR, 2.03; 95% CI, 1.39-2.96) and total body (HR, 1.42; 95% CI, 1.01-2.02) and lowest trabecular bone score (HR, 1.59; 95% CI, 1.11-2.28).

Only BMD at the femoral neck was related to incident all-cause dementia in the first 5 years of follow-up (HR, 2.13; 95% CI, 1.28-3.57).

These findings add “extra knowledge to previous findings that associations change with time, with the strength of the effect decreasing with increasing follow-up time,” the investigators noted.

They suggest that low total BMD and trabecular bone score might be “prodromal features instead of causes of dementia and related toxic protein accumulation in the brain. In other words, persons with subclinical, incipient dementia may have poor bone health due to the dementia process instead of vice versa.”

The investigators noted that further research focusing on the predictive ability of BMD for dementia is necessary. “As an indicator of dementia risk, intervening in BMD may improve clinical care of these persons, especially considering the multicomorbidities and polypharmacy that are highly prevalent in this group,” they concluded.

Little-known bone-brain axis to blame?

In a comment, Shaheen Lakhan, MD, a neurologist and researcher in Boston, noted that “bone health is increasingly becoming front of mind in older adults. This study confirms an association between poor bone health – low bone mineral density and bone scores – and poor brain health.”

However, “it’s unclear whether the link is causal – that is, whether poor bone health actually leads to poor brain health, and whether that can be staved off by directly supporting bone density,” Dr. Lakhan said.

“The link may very well be the little known ‘brain-bone axis’ – where our bones actually regulate our brain,” he added.

“Take for example the bone-generated hormone osteocalcin that crosses the blood-brain barrier and regulates brain functions like memory and cognition. Mice who don’t express the osteocalcin gene or are injected with antibodies that block osteocalcin actually have poor memory and worse anxiety,” Dr. Lakhan said.

“In any event, good bone health begins with healthy habits: a diet with plenty of calcium, vitamin D, and protein; a regimen of not just cardio, but also weight-bearing exercises; and staying clear of smoking and heavy alcohol intake,” he concluded.

The study was funded by Erasmus Medical Center and Erasmus University Rotterdam, the Netherlands Organization for Scientific Research, the Netherlands Organization for Health Research and Development, the Research Institute for Diseases in the Elderly, the Netherlands Genomics Initiative, the Ministry of Education, Culture and Science, the Ministry of Health, Welfare and Sports, the European Commission, and the Municipality of Rotterdam. Dr. Ikram and Dr. Lakhan report no relevant disclosures.

A version of this article first appeared on Medscape.com.



FROM NEUROLOGY


Longer telomeres tied to better brain health


Telomere shortening – a sign of cellular aging – is associated with multiple changes in the brain associated with dementia, whereas longer telomeres associate with better brain health and lower risk for dementia, new research suggests.

“This is the largest and most systematic investigation of telomere length and brain structure and function,” said Anya Topiwala, of the University of Oxford (England). “We found that longer telomeres associated with protection against dementia. The links with brain structure, we think, offer a possible mechanism for this protection. The hope is, by understanding the mechanism, new treatment targets could be uncovered,” Dr. Topiwala said.

The study was published online in PLOS ONE.
 

UK Biobank cohort

Telomeres form protective caps at the ends of chromosomes, and they progressively shorten with age, which may increase susceptibility to age-related diseases including Alzheimer’s disease. The mechanism underlying this risk is unclear and may involve changes in brain structure and function. However, the relationship between telomere length and neuroimaging markers is poorly characterized.

Dr. Topiwala and colleagues compared telomere length in white blood cells to brain MRI and health record data in 31,661 middle-aged and older adults in UK Biobank. They found that longer leucocyte telomere length (LTL) was associated with a larger volume of global and subcortical grey matter and a larger hippocampus – both of which shrink in patients with Alzheimer’s disease. Longer telomeres were also associated with a thicker cerebral cortex, which thins as Alzheimer’s disease progresses.

Longer LTL was also associated with reduced incidence of dementia during follow-up (hazard ratio, 0.93; 95% confidence interval, 0.91-0.96).

Dr. Topiwala noted that many of the factors related to telomere shortening, such as age, genetics, and sex, can’t be changed. However, in a previous study, her team found that drinking alcohol may shorten telomere length. “So by this logic, reducing your alcohol intake could curb the shortening,” Dr. Topiwala said.

She noted that one limitation of the study is that telomere length was measured in blood rather than in brain tissue, and it is not yet clear how closely the two correspond. In addition, UK Biobank participants are generally healthier than the general population. And although telomere length and brain measures were associated, “we cannot from this study prove one is causing the other,” she added.
 

Need for more research

Commenting on the research, Percy Griffin, PhD, Alzheimer’s Association director of scientific engagement, said that it’s been “known for some time that shortened telomeres – the caps at the end of DNA – are associated with increased aging.”

This new study is “interesting,” said Dr. Griffin, in that it shows an association between longer telomere length in white blood cells and healthier brain structures in the areas associated with Alzheimer’s disease. The longer telomeres were also associated with lower incidence of all-cause dementia.

But echoing Dr. Topiwala, “association does not mean causation,” Dr. Griffin said. “More research is needed to understand how diverse mechanisms contributing to Alzheimer’s and other dementia can be targeted.”

“The Alzheimer’s Association is accelerating the discovery of novel therapies through its Part the Cloud funding program, which has invested more than $65 million to accelerate the development of 65 drug development programs,” Dr. Griffin said.

The study had no specific funding. Dr. Topiwala and Dr. Griffin report no relevant disclosures.

A version of this article first appeared on Medscape.com.



FROM PLOS ONE


Tooth loss and diabetes together hasten mental decline


 

Both tooth loss and diabetes can lead to accelerated cognitive decline in older adults, most specifically in those 65-74 years of age, new findings suggest.

The data come from a 12-year follow-up of older adults in a nationally representative U.S. survey.

“From a clinical perspective, our study demonstrates the importance of improving access to dental health care and integrating primary dental and medical care. Health care professionals and family caregivers should pay close attention to the cognitive status of diabetic older adults with poor oral health status,” lead author Bei Wu, PhD, of New York University, said in an interview. Dr. Wu is the Dean’s Professor in Global Health and codirector of the NYU Aging Incubator.

Moreover, said Dr. Wu: “For individuals with both poor oral health and diabetes, regular dental visits should be encouraged in addition to adherence to the diabetes self-care protocol.”

Diabetes has long been recognized as a risk factor for cognitive decline, but the findings have been inconsistent for different age groups. Tooth loss has also been linked to cognitive decline and dementia, as well as diabetes.

The mechanisms aren’t entirely clear, but “co-occurring diabetes and poor oral health may increase the risk for dementia, possibly via the potentially interrelated pathways of chronic inflammation and cardiovascular risk factors,” Dr. Wu said.

The new study, published in the Journal of Dental Research, is the first to examine the relationships between all three conditions by age group.  
 

Diabetes, edentulism, and cognitive decline

The data came from a total of 9,948 participants in the Health and Retirement Study (HRS) from 2006 to 2018. At baseline, 5,440 participants were aged 65-74 years, 3,300 were aged 75-84, and 1,208 were aged 85 years or older.

They were assessed every 2 years using the 35-point Telephone Interview for Cognitive Status, which included tests of immediate and delayed word recall, serial subtraction of 7s, counting backward from 20, object naming, and naming the president and vice president of the United States. As might be expected, the youngest group scored highest, averaging 23 points, while the oldest group scored lowest, at 18.5 points.

Participants were also asked if they had ever been told by a doctor that they have diabetes. Another question was: “Have you lost all of your upper and lower natural permanent teeth?”

The condition of having no teeth is known as edentulism.

The percentages of participants who reported having both diabetes and edentulism were 6.0%, 6.7%, and 5.0% for those aged 65-74 years, 75-84 years, and 85 years or older, respectively. The proportions with neither of those conditions were 63.5%, 60.4%, and 58.3% in those three age groups, respectively (P < .001).

Compared with counterparts who had neither condition at baseline, older adults with both diabetes and edentulism had worse cognitive function at ages 65-74 years (P < .001) and 75-84 years (P < .001).

In terms of the rate of cognitive decline, older adults aged 65-74 years with both conditions declined faster than those in the same age cohort with neither condition (P < .001).

Diabetes alone was associated with accelerated cognitive decline in older adults aged 65-74 years (P < .001). Edentulism alone was associated with accelerated decline in older adults aged 65-74 years (P < .001) and those aged 75-84 years (P < .01).

“Our study finds the co-occurrence of diabetes and edentulism led to a worse cognitive function and a faster cognitive decline in older adults aged 65-74 years,” Dr. Wu and colleagues wrote.
 

Study limitations: Better data needed

The study has several limitations, most of them due to the data source. For example, while the HRS collects detailed information on cognitive status, edentulism is its only measure of oral health. There were no data on whether individuals had replacements such as dentures or implants that would affect their ability to eat, which could influence other health factors.

“I have made repeated appeals for federal funding to collect more oral health-related information in large national surveys,” Dr. Wu told this news organization.

Similarly, assessments of diabetes status such as hemoglobin A1c were only available for small subsets and not sufficient to demonstrate statistical significance, she explained.

Dr. Wu suggested that both oral health and cognitive screening might be included in the “Welcome to Medicare” preventive visit. In addition, “Oral hygiene practice should also be highlighted to improve cognitive health. Developing dental care interventions and programs are needed for reducing the societal cost of dementia.”

The study was partially supported by the National Institutes of Health. The authors have reported no relevant financial relationships.

A version of this article first appeared on Medscape.com.


FROM THE JOURNAL OF DENTAL RESEARCH


Mortality risk in epilepsy: New data

Article Type
Changed
Thu, 03/30/2023 - 07:59

People with epilepsy have a twofold increased risk for death, compared with their counterparts without the condition, irrespective of comorbidities or disease severity, new research shows.

“To our knowledge, this is the only study that has assessed the cause-specific mortality risk among people with epilepsy according to age and disease course,” investigators led by Seo-Young Lee, MD, PhD, of Kangwon National University, Chuncheon, South Korea, write. “Understanding cause-specific mortality risk, particularly the risk of external causes, is important because they are mostly preventable.”

The findings were published online in Neurology.
 

Higher mortality risk

For the study, researchers analyzed data from the National Health Insurance Service database in Korea from 2006 to 2017 and vital statistics from Statistics Korea from 2008 to 2017.

The study population included 138,998 patients with newly treated epilepsy, with an average age at diagnosis of 48.6 years.

Over 665,928 person-years of follow-up (mean follow-up, 4.79 years), 20,095 patients died.

People with epilepsy had more than twice the risk for death, compared with the overall population (standardized mortality ratio, 2.25; 95% confidence interval, 2.22-2.28). Mortality was highest in children aged 4 years or younger and was higher in the first year after diagnosis and in women at all age points.

People with epilepsy had a higher mortality risk, compared with the general public, regardless of how many anti-seizure medications they were taking. Those taking only one medication had a 56% higher risk for death (SMR, 1.56; 95% CI, 1.53-1.60), compared with a 393% higher risk in those taking four or more medications (SMR, 4.93; 95% CI, 4.76-5.10).

Where patients lived also played a role in mortality risk. Living in a rural area was associated with a 147% higher risk for death, compared with people without epilepsy who lived in the same area (SMR, 2.47; 95% CI, 2.41-2.53); the risk was 103% higher among those living in urban centers (SMR, 2.03; 95% CI, 1.98-2.09).

Although people with comorbidities had higher mortality rates, even those without any other health conditions had a 61% higher risk for death, compared with people without epilepsy (SMR, 1.61; 95% CI, 1.50-1.72).
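A standardized mortality ratio is simply the ratio of observed to expected deaths, so the excess risk in percentage terms is the SMR's excess over 1.0. A quick sketch of that conversion for the SMRs reported above (function name is illustrative):

```python
def excess_risk_pct(smr: float) -> float:
    """Convert a standardized mortality ratio (observed deaths divided by
    expected deaths in the reference population) into percent excess risk."""
    return (smr - 1.0) * 100.0

# SMRs reported in the study
for smr in (2.25, 1.56, 4.93, 2.47, 2.03, 1.61):
    print(f"SMR {smr:.2f} -> {excess_risk_pct(smr):.0f}% higher mortality")
```

For example, an SMR of 2.25 corresponds to 125% higher mortality ("more than twice the risk"), and an SMR of 1.56 to 56% higher mortality.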
 

Causes of death

The most frequent causes of death were malignant neoplasm and cerebrovascular disease, which researchers noted are thought to be underlying causes of epilepsy.

Among external causes of death, suicide was the most common cause (2.6%). The suicide rate was highest among younger patients and gradually decreased with age.

Deaths tied directly to epilepsy, transport accidents, or falls were lower in this study than had been previously reported, which may be due to adequate seizure control or because the number of older people with epilepsy and comorbidities is higher in Korea than that reported in other countries.

“To reduce mortality in people with epilepsy, comprehensive efforts [are needed], including a national policy against stigma of epilepsy and clinicians’ total management such as risk stratification, education about injury prevention, and monitoring for suicidal ideation with psychological intervention, as well as active control of seizures,” the authors write.
 

Generalizable findings

Joseph Sirven, MD, professor of neurology at Mayo Clinic Florida, Jacksonville, said that although the study included only Korean patients, the findings are applicable to other countries.

The finding that patients with epilepsy were more than twice as likely to die prematurely as the general population wasn't particularly surprising, Dr. Sirven said.

“What struck me the most was the fact that even patients who were on a single drug and seemingly well controlled also had excess mortality reported,” Dr. Sirven said. “That these risks occur should be part of what we tell all patients with epilepsy so that they can better arm themselves with information and help to address some of the risks that this study showed.”

Another important finding is the risk for suicide in patients with epilepsy, especially those who are newly diagnosed, he said.

“When we treat a patient with epilepsy, it should not just be about seizures, but we need to inquire about the psychiatric comorbidities and more importantly manage them in a comprehensive manner,” Dr. Sirven said.

The study was funded by Soonchunhyang University Research Fund and the Korea Health Technology R&D Project. The study authors and Dr. Sirven report no relevant financial conflicts.

A version of this article first appeared on Medscape.com.


FROM NEUROLOGY
