
A New and Early Predictor of Dementia?


Signs of frailty may signal future dementia more than a decade before cognitive symptoms occur, new findings suggest, potentially offering an opportunity to identify high-risk populations for targeted enrollment in clinical trials of dementia prevention and treatment.

Results of an international study assessing frailty trajectories showed frailty levels notably increased in the 4-9 years before dementia diagnosis. Even among study participants whose baseline frailty measurement was taken prior to that acceleration period, frailty was still positively associated with dementia risk, the investigators noted.

“We found that with every four to five additional health problems, there is on average a 40% higher risk of developing dementia, while the risk is lower for people who are more physically fit,” said study investigator David Ward, PhD, of the Centre for Health Services Research, The University of Queensland, Brisbane, Australia.

The findings were published online in JAMA Neurology.


A Promising Biomarker

An accessible biomarker for both biologic age and dementia risk is essential for advancing dementia prevention and treatment strategies, the investigators noted, adding that growing evidence suggests frailty may be a promising candidate for this role.

To learn more about the association between frailty and dementia, Ward and his team analyzed data on 29,849 individuals aged 60 years or older (mean age, 71.6 years; 62% women) from four cohort studies: the English Longitudinal Study of Ageing (ELSA; n = 6771), the Health and Retirement Study (HRS; n = 9045), the Rush Memory and Aging Project (MAP; n = 1451), and the National Alzheimer’s Coordinating Center (NACC; n = 12,582).

The primary outcome was all-cause dementia. Depending on the cohort, dementia diagnoses were determined through cognitive testing, self- or family report of physician diagnosis, or a diagnosis by the study physician. Participants were excluded if they had cognitive impairment at baseline.

Investigators retrospectively determined frailty index scores by gathering information on health and functional outcomes for participants from each cohort. Only participants with frailty data on at least 30 deficits were included.

Common deficits included high blood pressure, cancer, and chronic pain, as well as functional problems such as hearing impairment, difficulty with mobility, and challenges managing finances.
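
A deficit-accumulation frailty index of this kind is conventionally calculated as the proportion of assessed deficits that are present. The sketch below is a minimal illustration of that convention, not code from the study; the function, variable names, and example values are hypothetical. If roughly 40-50 deficits are assessed, 4-5 additional deficits correspond to an increase of about 0.1 on such an index, the increment used for the risk estimates reported below.

```python
def frailty_index(deficits: dict[str, float]) -> float:
    """Deficit-accumulation frailty index: the mean of deficit scores,
    each coded from 0 (absent) to 1 (fully present)."""
    if len(deficits) < 30:  # the study required data on at least 30 deficits
        raise ValueError("need data on at least 30 deficits")
    return sum(deficits.values()) / len(deficits)

# Hypothetical participant with 5 of 40 assessed deficits present
example = {f"deficit_{i}": 1.0 if i < 5 else 0.0 for i in range(40)}
print(round(frailty_index(example), 3))  # 0.125
```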

Investigators conducted follow-up visits with participants until they developed dementia or until the study ended, with follow-up periods varying across cohorts.

Frailty scores were modeled on backward time scales (ie, with time measured backward from dementia onset), after adjustment for potential confounders.

Among participants who developed incident dementia (n = 3154), covariate-adjusted expected frailty index scores were, on average, higher in women than in men by 18.5% in ELSA, 20.9% in HRS, and 16.2% in MAP. There were no differences in frailty scores between sexes in the NACC cohort.

Compared with participants who did not develop dementia, those who did had significantly and consistently higher frailty scores 8-20 years before dementia onset (20 years in HRS; 13 in MAP; 12 in ELSA; 8 in NACC).

The rate of increase in frailty index scores began accelerating 4-9 years before dementia onset, depending on the cohort, the investigators noted.

In all four cohorts, each 0.1-point increase in the frailty index was associated with increased dementia risk. Adjusted hazard ratios (aHRs) ranged from 1.18 in the HRS cohort to 1.73 in the NACC cohort, which showed the strongest association.

In participants whose baseline frailty measurement was taken before the predementia acceleration period began, the association between frailty scores and dementia risk remained positive, with aHRs ranging from 1.18 in the HRS cohort to 1.43 in the NACC cohort.
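
As a rough guide to interpreting these per-increment estimates: under the usual proportional hazards convention, a hazard ratio reported per 0.1-point increase compounds multiplicatively for larger increases. The snippet below is a minimal sketch of that arithmetic using the reported range of aHRs; the function name and the 0.2-point example are ours, not the study’s.

```python
def scale_hazard_ratio(hr_per_unit: float, increase: float, unit: float = 0.1) -> float:
    """Under proportional hazards, a hazard ratio reported per `unit`
    increase compounds multiplicatively for a larger `increase`."""
    return hr_per_unit ** (increase / unit)

# Reported adjusted HRs per 0.1-point increase ranged from 1.18 (HRS) to 1.73 (NACC)
for hr in (1.18, 1.73):
    print(f"HR per 0.1 = {hr:.2f} -> implied HR for a 0.2 increase = {scale_hazard_ratio(hr, 0.2):.2f}")
```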


The ‘Four Pillars’ of Prevention

The good news, investigators said, is that the long trajectory of frailty symptoms preceding dementia onset provides plenty of opportunity for intervention.

To slow the development of frailty, Ward suggested adhering to the “four pillars of frailty prevention and management”: good nutrition with plenty of protein, regular exercise, optimization of medications for chronic conditions, and maintenance of a strong social network.

Ward suggested neurologists track frailty in their patients and pointed to a recent article focused on helping neurologists use frailty measures to influence care planning.

Study limitations include the possibility of reverse causality and the fact that investigators could not adjust for genetic risk for dementia.


Unclear Pathway

Commenting on the findings, Lycia Neumann, PhD, senior director of Health Services Research at the Alzheimer’s Association, noted that many studies over the years have shown a link between frailty and dementia. However, she cautioned that a link does not imply causation.

The pathway from frailty to dementia is not 100% clear, and both are complex conditions, said Neumann, who was not part of the study.

“Adopting healthy lifestyle behaviors early and consistently can help decrease the risk of — or postpone the onset of — both frailty and cognitive decline,” she said. Neumann added that physical activity, a healthy diet, social engagement, and controlling diabetes and blood pressure can also reduce the risk for dementia as well as cardiovascular disease.

The study was funded in part by the Deep Dementia Phenotyping Network through the Frailty and Dementia Special Interest Group. Ward and Neumann reported no relevant financial relationships.


A version of this article appeared on Medscape.com.


Smoldering MS May Warrant Unique Diagnosis, Treatment, and Research Strategies


Smoldering-associated worsening (SAW) of multiple sclerosis (MS) deserves a broader, more comprehensive approach to diagnosis, treatment, and research that goes beyond neurologists’ understanding of progression independent of relapse activity (PIRA), according to an international consensus recently published in Annals of Neurology. However, an outside expert said that promulgating the “smoldering” concept may stoke patient and provider confusion.

Although current disease-modifying therapies (DMTs) for MS exclusively target focal white matter (WM) inflammation, wrote authors led by Antonio Scalfari, MD, PhD, of Charing Cross Hospital, Imperial College London, England, many people with MS experience worsening disability in a more indolent fashion despite stable inflammatory markers.

“The gradual accumulation of physical and cognitive disability is driven by smoldering pathological processes via biological substrates, which are different from those of acute focal damage, remain an important unmet therapeutic target,” they wrote.

The same research team first described smoldering MS in a 2022 publication. In the present paper, Scalfari and colleagues reviewed emerging clinical, radiological, and pathological evidence and presented 29 consensus statements in areas ranging from the definition, pathology, and clinical manifestations of smoldering MS to appropriate biomarkers and best clinical practices.


Definition

By definition, the authors wrote, SAW encompasses PIRA but also includes a range of gradually worsening, relapse-independent symptoms that remain undetectable on standard assessments, including the Expanded Disability Status Scale (EDSS) or EDSS-Plus, especially in early disease. To capture symptoms such as subtle motor impairment, cognitive slowing, and fatigue, Scalfari and colleagues recommend tools such as neurological stress tests, fatigue/mood scales, wearable devices, and patient-reported outcomes.

Disease Mechanisms

Pathologically, the authors wrote, smoldering MS may stem from intrinsic central nervous system processes that likely involve various glial, immune, and neural cells. Smoldering MS could also contribute to aging, and aging to smoldering MS, the latter possibly through dynamics such as age-related exhaustion of compensatory mechanisms, reduced remyelination efficiency, and telomere shortening, they added.

Clinical Implementation

Current MS management rests on crude estimates of physical disability and overemphasizes identifying relapses and new MRI lesions as the principal markers of disease activity, wrote Scalfari and colleagues. Instead, they suggested combining motor-associated assessments such as EDSS-Plus with cognitive gauges such as the Brief International Cognitive Assessment for Multiple Sclerosis.

Providers are uncomfortable identifying and discussing smoldering MS, the authors acknowledged, because no licensed treatments target SAW. However, they wrote, a principal reason for discussing smoldering MS with patients is to help manage their expectations of current DMTs, which may have little effect on SAW.


‘More Than Lesions’

Bruce Cree, MD, PhD, MAS, professor of neurology at the University of California, San Francisco, said that it is extremely important to raise awareness of physicians’ emerging understanding that “there is more going on in MS than lesions and relapses,” a concept that has been a work in progress for several years. He was not involved with the study but was asked to comment.


A 2019 report on the EPIC cohort coauthored by Cree labeled the disconnect between disability accumulation and relapse occurrence “silent progression.” The observation that disability accumulates in early relapsing MS independent of relapsing activity has been replicated in virtually every dataset worldwide, he added.

“What I don’t like about this article is the reliance on the term ‘smoldering’ and the acceptance that this is an actual phenomenon supported by data.” Moreover, the authors’ leveraging of “smoldering” into additional acronyms such as SAW will likely confuse rather than clarify physicians’ and patients’ understanding of the situation, Cree added. “Clinicians don’t need yet another snappy acronym.” Many are still trying to grasp the PIRA concept in relapsing MS, he said.

“One of the reasons this topic has become so important is that we recognize that even when we have very good control of relapsing disease activity — clinical relapses as well as radiographic large lesion formation on MRI — some patients still develop insidious worsening of disability. And the reasons for that are not well understood,” said Cree.

Accumulating disability absent relapse activity could stem from any number of microscopic inflammatory processes, possibly involving abnormal microglial activation, fibrinogen deposition, microscopic inflammatory infiltrates of CD8-positive T cells, or mitochondrial damage from iron deposition, he said. Or the processes driving PIRA may not even involve inflammation, he added. “We still don’t have a unifying way of understanding how these processes work.”

Cree suspects that, despite investigators’ good intentions, the study’s sponsor, Sanofi, may have influenced the resultant messaging. The company’s tolebrutinib recently completed phase 3 trials in secondary progressive MS and relapsing MS, and a phase 3 trial in primary progressive MS is scheduled for completion in 2025. “A hallmark of Sanofi’s messaging has been this idea that there is smoldering inflammation occurring in MS that tolebrutinib is going to address,” he said.

If clinicians really knew what drove progressive MS, said Cree, “we would be keen on developing therapies targeting that fundamental process. But because we don’t know what’s driving it, we don’t know what to go after.”

The study was supported by Sanofi. Cree is a coauthor of the GEMINI 1 and GEMINI 2 tolebrutinib studies.

A version of this article first appeared on Medscape.com.


Faster Brain Atrophy Linked to MCI



A long-term brain imaging study in aging adults showed faster rates of atrophy in certain brain structures to be associated with the risk of developing mild cognitive impairment (MCI).

While some brain atrophy is expected in aging, high levels of atrophy in the white matter and high enlargement in the ventricles are associated with earlier progression from normal cognition to MCI, the study found. The researchers also identified diabetes and atypical levels of amyloid beta protein in the cerebrospinal fluid as risk factors for brain atrophy and MCI.

For their research, published online in JAMA Network Open, Yuto Uchida, MD, PhD, and his colleagues at the Johns Hopkins University School of Medicine in Baltimore, Maryland, looked at data for 185 individuals (mean age, 55.4 years; 63% women) who were cognitively normal at baseline and followed for a median of 20 years.

All had been enrolled in a longitudinal cohort study on biomarkers of cognitive decline conducted at Johns Hopkins. Each participant underwent a median of five structural MRI studies during the follow-up period as well as annual cognitive testing. Altogether, 60 individuals developed MCI, eight of whom progressed to dementia.

“We hypothesized that annual rates of change of segmental brain volumes would be associated with vascular risk factors among middle-aged and older adults and that these trends would be associated with the progression from normal cognition to MCI,” Uchida and colleagues wrote.

Uniquely Long Follow-Up

Most longitudinal studies using structural MRI span a decade or less of follow-up, the study authors noted. This makes it difficult to discern whether annual rates of change in brain volumes are affected by vascular risk factors or are useful in predicting MCI, they said. Individual differences in brain aging make population-based studies less informative.

This study’s long timeframe allowed for tracking of brain changes “on an individual basis, which facilitates the differentiation between interindividual and intraindividual variations and leads to more accurate estimations of rates of brain atrophy,” Uchida and colleagues wrote.

People with high levels of atrophy in the white matter and enlargement in the ventricles saw earlier progression to MCI (hazard ratio [HR], 1.86; 95% CI, 1.24-2.49; P = .001). Diabetes mellitus was associated with progression to MCI (HR, 1.41; 95% CI, 1.06-1.76; P = .04), as was a low CSF Abeta42:Abeta40 ratio (HR, 1.48; 95% CI, 1.09-1.88; P = .04).

People with both diabetes and an abnormal amyloid profile were even more vulnerable to developing MCI (HR, 1.55; 95% CI, 1.13-1.98; P = .03). This indicated “a synergic association of diabetes and amyloid pathology with MCI progression,” Uchida and colleagues wrote, noting that insulin resistance has been shown to promote the formation of amyloid plaques, a hallmark of Alzheimer’s disease.

The findings also underscore that “white matter volume changes are closely associated with cognitive function in aging, suggesting that white matter degeneration may play a crucial role in cognitive decline,” the authors noted.

Uchida and colleagues acknowledged the modest size and imbalanced sex ratio of their study cohort as potential weaknesses, as well as the fact that the imaging technologies had changed over the course of the study. Most of the participants were White with family histories of dementia.

Findings May Lead to Targeted Interventions

In an editorial comment accompanying Uchida and colleagues’ study, Shohei Fujita, MD, PhD, of Massachusetts General Hospital, Boston, said that, while a more diverse population sample would be desirable and should be sought for future studies, the results nonetheless highlight “the potential of long-term longitudinal brain MRI datasets in elucidating the interplay of risk factors underlying cognitive decline and the potential benefits of controlling diabetes to reduce the risk of progression” along the Alzheimer’s disease continuum.

The findings may prove informative, Fujita said, in developing “targeted interventions for those most susceptible to progressive brain changes, potentially combining lifestyle modifications and pharmacological treatments.”

Uchida and colleagues’ study was funded by the Alzheimer’s Association, the National Alzheimer’s Coordinating Center, and the National Institutes of Health. The study’s corresponding author, Kenichi Oishi, disclosed funding from the Richman Family Foundation, Richman, the Sharp Family Foundation, and others. Uchida and Fujita reported no relevant financial conflicts of interest.

A version of this article first appeared on Medscape.com.


Experts Challenge New Diagnostic Criteria for Alzheimer’s disease


A group of international experts is challenging revised diagnostic criteria for Alzheimer’s disease as laid out by the Alzheimer’s Association earlier in 2024.

In a paper published online in JAMA Neurology, the International Working Group (IWG), which includes 46 experts from 17 countries, is recommending that the diagnosis of Alzheimer’s disease be limited to individuals with mild cognitive impairment or dementia and not be applied to cognitively normal individuals with Alzheimer’s disease biomarkers such as amyloid-beta 42/40 or p-tau.

Clinicians should be “very careful” about using the “A” word (Alzheimer’s) for cognitively unimpaired people with Alzheimer’s disease biomarkers, said the paper’s first author Bruno Dubois, MD, professor of neurology, Sorbonne University and Department of Neurology, Pitié-Salpêtrière Hospital, Paris, France.

Providing an Alzheimer’s disease diagnosis to those who have a high chance of never developing cognitive impairment can be psychologically harmful, said Dubois.

“It’s not something small like telling someone they have a fever. Just imagine you’re 65 years old and are amyloid positive, and you’re told you have Alzheimer’s disease. It affects the decisions you make for the rest of your life and changes your vision of your future, even though you may never develop the disease,” he added.

Divergent View

The IWG’s perspective on Alzheimer’s disease contrasts with a recent proposal from the Alzheimer’s Association. The Alzheimer’s Association criteria suggest that Alzheimer’s disease should be regarded solely as a biological entity, which could include cognitively normal individuals with one core Alzheimer’s disease biomarker.

The IWG noted that its concerns regarding the application of a purely biological definition of Alzheimer’s disease in clinical practice prompted the group to consider updating its guidelines, potentially offering “an alternative definitional view of Alzheimer’s disease as a clinical-biological construct for clinical use.”

The group conducted a PubMed search for relevant Alzheimer’s disease articles and included references published between July 2020 and March 2024. The research showed that the majority of biomarker-positive, cognitively normal individuals will not become symptomatic during their lifetime.

The risk that a 55-year-old who is amyloid positive will develop Alzheimer’s disease is not that much higher than that for an amyloid-negative individual of a similar age, Dubois noted. “There’s an 83% chance that person will never develop Alzheimer’s disease.”

Disclosing a diagnosis of Alzheimer’s disease to cognitively normal people with only one core Alzheimer’s disease biomarker represents “the most problematic implication of a purely biological definition of the disease,” the authors noted.

“A biomarker is a marker of pathology, not a biomarker of disease,” said Dubois, adding that a person may have markers for several different brain diseases.

The IWG recommends the following nomenclature: “at risk for Alzheimer’s disease” for those with Alzheimer’s disease biomarkers but a low lifetime risk of progression, and “presymptomatic Alzheimer’s disease” for those with Alzheimer’s disease biomarkers and a very high lifetime risk for progression, such as individuals with autosomal dominant genetic mutations and other distinct biomarker profiles that put them at extremely high lifetime risk of developing the disease.

Dubois emphasized the difference between those showing typical Alzheimer’s disease symptoms with positive biomarkers who should be considered to have the disease and those with positive biomarkers but no typical Alzheimer’s disease symptoms who should be considered at risk.

This is an important distinction as it affects research approaches and assessment of risks, he said.

For low-risk asymptomatic individuals, the IWG does not recommend routine diagnostic testing outside of the research setting. “There’s no reason to send a 65-year-old cognitively normal subject off to collect biomarker information,” said Dubois.

He reiterated the importance of clinicians using appropriate and sensitive language surrounding Alzheimer’s disease when face to face with patients. This issue “is not purely semantic; this is real life.”

For these patients in the clinical setting, “we have to be very careful about proposing treatments that may have side effects,” he said.

However, this does not mean asymptomatic at-risk people should not be studied to determine what pharmacological interventions might prevent or delay the onset of clinical disease, he noted.

Presymptomatic individuals who are at a high risk of developing Alzheimer’s disease “should be the target for clinical trials in the future” to determine best ways to delay the conversion to Alzheimer’s disease, he said.

The main focus of such research should be to better understand the “biomarker pattern profile” that is associated with a high risk of developing Alzheimer’s disease, said Dubois.

Plea for Unity

In an accompanying editorial, Ronald C. Petersen, PhD, MD, director, Mayo Clinic Alzheimer’s Disease Research Center and Mayo Clinic Study of Aging, Rochester, Minnesota, and colleagues outline the difference between the IWG and Alzheimer’s Association positions.

As the IWG uses Alzheimer’s disease to define those with cognitive impairment and the Alzheimer’s Association group uses Alzheimer’s disease to define those with the pathology of the disease, the field is now at a crossroads. “Do we name the disease before clinical symptoms?” they asked.

They note that Alzheimer’s Association criteria distinguish between a disease and an illness, whereas the IWG does not. “As such, although the primary disagreement between the groups is semantic, the ramifications of the labeling can be significant.”

It is “incumbent” that the field “come together” on an Alzheimer’s disease definition, the editorial concluded. “Neither the Alzheimer’s Association or IWG documents are appropriate to serve as a guide for how to apply biomarkers in a clinical setting. Appropriate-use criteria are needed to form a bridge between biological frameworks and real-world clinical practice so we can all maximally help all of our patients with this disorder.”

In a comment, Reisa Sperling, MD, professor of neurology, Harvard Medical School, and director, Center for Alzheimer Research and Treatment, Brigham and Women’s Hospital and Massachusetts General Hospital, all in Boston, who is part of the Alzheimer’s Association work group that published the revised criteria for diagnosis and staging of Alzheimer’s disease, likened Alzheimer’s disease, which begins in the brain many years before dementia onset, to cardiovascular disease in that it involves multiple processes. She noted the World Health Organization classifies cardiovascular disease as a “disease” prior to clinical manifestations such as stroke and myocardial infarction.

“If someone has Alzheimer’s disease pathology in their brain, they are at risk for dementia or clinical manifestations of the disease — just like vascular disease quantifies the risk of stroke or heart attack, not risk of developing ‘vascular disease’ if the underlying vascular disease is already present,” said Sperling.

A large part of the controversy is related to terminology and the “stigma” of the “A” word in the same way there used to be fear around using the “C” word — cancer, said Sperling.

“Once people began talking about cancer publicly as a potentially treatable disease and began getting screened and diagnosed before symptoms of cancer were manifest, this has had a tremendous impact on public health.”

She clarified that her work group does not recommend screening asymptomatic people with Alzheimer’s disease biomarkers. “We actually need to prove that treating at the preclinical stage of the disease is able to prevent clinical impairment and dementia,” she said, adding “hopefully, we are getting closer to this.”

Dubois reported no relevant disclosures. Petersen reported receiving personal fees from Roche, Genentech, Eli Lilly and Company, Eisai, and Novo Nordisk outside the submitted work and royalties from Oxford University Press, UpToDate, and Medscape educational activities.

A version of this article appeared on Medscape.com.


Brews, Bubbles, & Booze: Stroke Risk and Patients’ Favorite Drinks

Article Type
Changed
Tue, 11/05/2024 - 13:25

A growing body of research explores the link between stroke risk and regular consumption of coffee, tea, soda, and alcohol. This research roundup reviews the latest findings, highlighting both promising insights and remaining uncertainties to help guide discussions with your patients.

Coffee and Tea: Good or Bad? 

In the INTERSTROKE study, high coffee consumption (> 4 cups daily) was associated with a significantly increased risk for all stroke (odds ratio [OR], 1.37) or ischemic stroke (OR, 1.31), while low to moderate coffee consumption was not linked to increased stroke risk. In contrast, tea consumption was associated with lower odds of all stroke (OR, 0.81 for highest intake) or ischemic stroke (OR, 0.81).

In a recent UK Biobank study, consumption of coffee or tea was associated with reduced risk for stroke and dementia, with the biggest benefit associated with consuming both beverages. 

Specifically, the investigators found that individuals who drank two to three cups of coffee and two to three cups of tea per day had a 30% lower incidence of stroke and a 28% lower risk for dementia than those who drank neither beverage.

A recent systematic review and dose-response meta-analysis showed that each daily cup increase in tea was associated with an average 4% reduced risk for stroke and a 2% reduced risk for cardiovascular disease (CVD) events. 

The protective effect of coffee and tea on stroke risk may be driven, in part, by flavonoids, which have antioxidant and anti-inflammatory properties, as well as positive effects on vascular function.

“The advice to patients should be that coffee and tea may protect against stroke, but that sweetening either beverage with sugar probably should be minimized,” said Cheryl Bushnell, MD, MHS, of Wake Forest University School of Medicine in Winston-Salem, North Carolina, and chair of the American Stroke Association (ASA) 2024 Guideline for the Primary Prevention of Stroke.

Taylor Wallace, PhD, a certified food scientist, said, “most people should consume a cup or two of unsweetened tea per day in moderation for cardiometabolic health. It is an easy step in the right direction for good health but not a cure-all.”

When it comes to coffee, adults who like it should drink it “in moderation — just lay off the cream and sugar,” said Wallace, adjunct associate professor at George Washington University, Washington, DC, and Tufts University, Boston, Massachusetts.

“A cup or two of black coffee with low-fat or nonfat milk with breakfast is a healthy way to start the day, especially when you’re like me and have an 8-year-old that is full of energy!” Wallace said. 
 

The Skinny on Soda

When it comes to sugar-sweetened and diet beverages, data from the Nurses’ Health Study and the Health Professionals Follow-Up Study showed a 16% increased risk for stroke with one or more servings of sugar-sweetened or low-calorie soda per day (vs none), independent of established dietary and nondietary cardiovascular risk factors. 

In the Women’s Health Initiative Observational Study of postmenopausal women, a higher intake of artificially sweetened beverages was associated with increased risk for all stroke (adjusted hazard ratio [aHR], 1.23), ischemic stroke (aHR, 1.31), coronary heart disease (aHR, 1.29) and all-cause mortality (aHR, 1.16).

In the Framingham Heart Study Offspring cohort, consumption of one can of diet soda or more each day (vs none) was associated with a nearly threefold increased risk for stroke and dementia over a 10-year follow-up period. 

A separate French study showed that total artificial sweetener intake from all sources was associated with increased overall risk for cardiovascular and cerebrovascular disease.

However, given the limitations of these studies, it’s hard to draw any firm conclusions, Wallace cautioned. 

“We know that sugar-sweetened beverages are correlated with weight gain and cardiometabolic dysfunction promotion in children and adults,” he said. 

Yet, “there really isn’t any convincing evidence that diet soda has much impact on human health at all. Most observational studies are mixed and likely very confounded by other diet and lifestyle factors. That doesn’t mean go overboard; a daily diet soda is probably fine, but that doesn’t mean go drink 10 of them every day,” he added. 
 

 

 

Alcohol: Moderation or Abstinence?

Evidence on alcohol use and stroke risk has been mixed over the years. For decades, the evidence suggested that a moderate amount of alcohol daily (one to two drinks in men and one drink in women) may be beneficial in reducing major vascular outcomes.

Yet, over the past few years, some research has found no evidence of benefit with moderate alcohol intake. And the detrimental effects of excessive alcohol use are clear. 

A large meta-analysis showed that light to moderate alcohol consumption (up to one drink per day) was associated with a reduced risk for ischemic stroke. However, heavy drinking (more than two drinks per day) significantly increased the risk for both ischemic and hemorrhagic stroke.

A separate study showed young adults who are moderate to heavy drinkers are at increased risk for stroke — and the risk increases with more years of imbibing.

In the INTERSTROKE study, high to moderate alcohol consumption was associated with increased stroke risk, whereas low alcohol consumption conferred no increased risk. 

However, Bushnell pointed out that the study data were based on self-report, and that other healthy behaviors may counteract the risk associated with alcohol consumption.

“For alcohol, regardless of stroke risk, the most important data shows that any alcohol consumption is associated with worse cognitive function, so generally, the lower the alcohol consumption the better,” Bushnell said. 

She noted that, currently, the American Heart Association (AHA)/ASA recommend a maximum of two drinks per day for men and one drink per day for women to reduce stroke risk.

“However, the data for the risk for cognitive impairment with any alcohol is convincing and should be kept in mind in addition to the maximum alcohol recommended by the AHA/ASA,” Bushnell advised. 

“We know excessive intake puts you at major risk for CVD, cancer, cognitive decline, and a whole host of other health ailments — no question there,” said Wallace.

The impact of moderate intake, on the other hand, is less clear. “Alcohol is a highly biased and political issue and the evidence (or lack thereof) on both sides is shoddy at best,” Wallace added.

A key challenge is that accurate self-reporting of alcohol intake is difficult, even for scientists, and most studies rely on self-reported data from observational cohorts. These often include limited dietary assessments, which provide only a partial picture of long-term consumption patterns, Wallace noted. 

“The short answer is we don’t know if moderation is beneficial, detrimental, or null with respect to health,” he said.

Bushnell reports no relevant disclosures. Wallace (www.drtaylorwallace.com) is CEO of Think Healthy Group; editor of The Journal of Dietary Supplements, deputy editor of The Journal of the American Nutrition Association (www.nutrition.org), nutrition section editor of Annals of Medicine, and an advisory board member with Forbes Health.

A version of this article appeared on Medscape.com.


Being a Weekend Warrior Linked to Lower Dementia Risk

Article Type
Changed
Tue, 11/05/2024 - 10:09

 

TOPLINE:

Weekend exercise, involving one or two sessions per week, is associated with a reduction in risk for mild dementia similar to that reported with more frequent exercise, a new study shows. Investigators say the findings suggest even limited physical activity may offer protective cognitive benefits.

METHODOLOGY:

  • Researchers analyzed the data of 10,033 participants in the Mexico City Prospective Study who were aged 35 years or older.
  • Physical activity patterns were categorized into four groups: No exercise, weekend warriors (one or two sessions per week), regularly active (three or more sessions per week), and a combined group.
  • Cognitive function was assessed using the Mini-Mental State Examination (MMSE).
  • The analysis adjusted for confounders such as age, sex, education, income, blood pressure, smoking status, body mass index, civil status, sleep duration, diet, and alcohol intake.
  • The mean follow-up duration was 16 years.

TAKEAWAY:

  • When mild dementia was defined as an MMSE score ≤ 22, dementia prevalence was 26% in those who did not exercise, 14% in weekend warriors, and 18.5% in the regularly active group.
  • When mild dementia was defined as an MMSE score ≤ 23, dementia prevalence was 30% in those who did not exercise, 20% in weekend warriors, and 22% in the regularly active group.
  • Compared with people who did not exercise and after adjusting for confounding factors, risk for mild dementia was 13%-25% lower in weekend warriors, 11%-12% lower in the regular activity group, and 12%-16% lower in the two groups combined.
  • The findings were consistent in men and women.

IN PRACTICE:

“To the best of our knowledge, this is the first prospective cohort study to show that the weekend warrior physical activity pattern and the regularly active physical activity pattern are associated with similar reductions in the risk of mild dementia. This study has important implications for policy and practice because the weekend warrior physical activity pattern may be a more convenient option for busy people around the world,” the authors wrote.

SOURCE:

The study was led by Gary O’Donovan, Faculty of Medicine, University of the Andes, Bogotá, Colombia. It was published online in the British Journal of Sports Medicine.

LIMITATIONS:

The survey respondents may not have been truly representative of middle-aged adults. Further, there were no objective measures of physical activity. The observational nature of the study does not provide insights into causality.

DISCLOSURES:

The study was funded by the Mexican Health Ministry, the National Council of Science and Technology for Mexico, Wellcome, and the UK Medical Research Council. No conflicts of interest were disclosed.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication.
 

A version of this article appeared on Medscape.com.


Cannabis Use Linked to Brain Thinning in Adolescents

Article Type
Changed
Mon, 11/04/2024 - 16:08

Cannabis use may lead to thinning of the cerebral cortex in adolescents, research in mice and humans suggested.

The multilevel study demonstrated that tetrahydrocannabinol (THC), an active substance in cannabis, causes shrinkage of dendritic arborization — the neurons’ network of antennae that play a critical role in communication between brain cells.

The connection between dendritic arborization and cortical thickness was hinted at in an earlier study by Tomáš Paus, MD, PhD, professor of psychiatry and addictology at the University of Montreal, Quebec, Canada, and colleagues, who found that cannabis use in early adolescence was associated with lower cortical thickness in boys with a high genetic risk for schizophrenia.

“We speculated at that time that the differences in cortical thickness might be related to differences in dendritic arborization, and our current study confirmed it,” Paus said.

That confirmation came in the mouse part of the study, when coauthor Graciela Piñeyro, MD, PhD, also of the University of Montreal, counted the dendritic branches of mice exposed to THC and compared the total with the number of dendritic branches in unexposed mice. “What surprised me was finding that THC in the mice was targeting the same type of cells and structures that Dr. Paus had predicted would be affected from the human studies,” she said. “Structurally, they were mostly the neurons that contribute to synapses in the cortex, and their branching was reduced.”

Paus explained that in humans, a decrease in input from the affected dendrites “makes it harder for the brain to learn new things, interact with people, cope with new situations, et cetera. In other words, it makes the brain more vulnerable to everything that can happen in a young person’s life.”

The study was published online on October 9 in the Journal of Neuroscience.
 

Of Mice, Men, and Cannabis

Although associations between cannabis use by teenagers and variations in brain maturation have been well studied, the cellular and molecular underpinnings of these associations were unclear, according to the authors.

To investigate further, they conducted this three-step study. First, they exposed adolescent male mice to THC or a synthetic cannabinoid (WIN 55,212-2) and assessed differentially expressed genes, spine numbers, and the extent of dendritic complexity in the frontal cortex of each mouse.

Next, using MRI, they examined differences in cortical thickness in 34 brain regions in 140 male adolescents who experimented with cannabis before age 16 years and 327 who did not.

Then, they again conducted experiments in mice and found that 13 THC-related genes correlated with variations in cortical thickness. Virtual histology revealed that these 13 genes were coexpressed with cell markers of astrocytes, microglia, and a type of pyramidal cell enriched in genes that regulate dendritic expression.

Similarly, the WIN-related genes correlated with differences in cortical thickness and showed coexpression patterns with the same three cell types.

Furthermore, the affected genes were also found in humans, particularly in the thinner cortical regions of the adolescents who experimented with cannabis.

By acting on microglia, THC seems to promote the removal of synapses and, eventually, the reduction of the dendritic tree in mice, Piñeyro explained. That’s important not only because a similar mechanism may be at work in humans but also because “we now might have a model to test different types of cannabis products to see which ones are producing the greatest effect on neurons and therefore greater removal of synapses through the microglia. This could be a way of testing drugs that are out in the street to see which would be the most or least dangerous to the synapses in the brain.”
 

 

 

‘Significant Implications’

Commenting on the study, Yasmin Hurd, PhD, Ward-Coleman chair of translational neuroscience at the Icahn School of Medicine at Mount Sinai and director of the Addiction Institute of Mount Sinai in New York City, said, “These findings are in line with previous results, so they are feasible. This study adds more depth by showing that cortical genes that were differentially altered by adolescent THC correlated with cannabis-related changes in cortical thickness based on human neuroimaging data.” Hurd did not participate in the research.

“The results emphasize that consumption of potent cannabis products during adolescence can impact cortical function, which has significant implications for decision-making and risky behavior as well. It also can increase vulnerability to psychiatric disorders such as schizophrenia.”

Although a mouse model is “not truly the same as the human condition, the fact that the animal model also showed evidence of the morphological changes indicative of reduced cortical thickness, [like] the humans, is strong,” she said.

Additional research could include women and assess potential sex differences, she added.

Ronald Ellis, MD, PhD, an investigator in the Center for Medicinal Cannabis Research at the University of California, San Diego School of Medicine, said, “The findings are plausible and extend prior work showing evidence of increased risk for psychotic disorders later in life in adolescents who use cannabis.” Ellis did not participate in the research.

“Future studies should explore how these findings might vary across different demographic groups, which could provide a more inclusive understanding of how cannabis impacts the brain,” he said. “Additionally, longitudinal studies to track changes in the brain over time could help to establish causal relationships more robustly.

“The take-home message to clinicians at this point is to discuss cannabis use history carefully and confidentially with adolescent patients to better provide advice on its potential risks,” he concluded.

Paus added that he would tell patients, “If you’re going to use cannabis, don’t start early. If you have to, then do so in moderation. And if you have family history of mental illness, be very careful.”

No funding for the study was reported. Paus, Piñeyro, Hurd, and Ellis declared having no relevant financial relationships. 
 

A version of this article appeared on Medscape.com.


Novel Intervention Slows Cognitive Decline in At-Risk Adults

Article Type
Changed
Mon, 11/04/2024 - 12:07

Combining cognitive remediation with transcranial direct current stimulation (tDCS) was associated with slower cognitive decline for up to 6 years in older adults with major depressive disorder that is in remission (rMDD), mild cognitive impairment (MCI), or both, new research suggests.

The cognitive remediation intervention included a series of progressively difficult computer-based and facilitator-monitored mental exercises designed to sharpen cognitive function. 

Researchers found that using cognitive remediation with tDCS slowed decline in executive function and verbal memory more than other cognitive functions. The effect was stronger among people with rMDD versus those with MCI and in those at low genetic risk for Alzheimer’s disease. 

“We have developed a novel intervention, combining two interventions that if used separately have a weak effect but together have substantial and clinically meaningful effect of slowing the progression of cognitive decline,” said study author Benoit H. Mulsant, MD, chair of the Department of Psychiatry, University of Toronto, Ontario, Canada, and senior scientist at the Center for Addiction and Mental Health, also in Toronto. 

The findings were published online in JAMA Psychiatry.
 

High-Risk Group

Research shows that older adults with MDD or MCI are at high risk for cognitive decline and dementia. Evidence also suggests that depression in early or mid-life significantly increases the risk for dementia in late life, even if the depression has been in remission for decades.

A potential mechanism underlying this increased risk for dementia could be impaired cortical plasticity, or the ability of the brain to compensate for damage.

The PACt-MD trial included 375 older adults with rMDD, MCI, or both (mean age, 72 years; 62% women) at five academic hospitals in Toronto.

Participants received either cognitive remediation plus tDCS or sham intervention 5 days per week for 8 weeks (acute phase), followed by 5-day “boosters” every 6 months.

tDCS was administered by trained personnel and involved active stimulation for 30 minutes at the beginning of each cognitive remediation group session. The intervention targets the prefrontal cortex, a critical region for cognitive compensation in normal cognitive aging.

The sham group received a weakened version of cognitive remediation, with exercises that did not get progressively more difficult. For the sham stimulation, the current flowed at full intensity for only 54 seconds before and after 30-second ramp-up and ramp-down phases, to create a blinding effect, the authors noted. 

A geriatric psychiatrist followed all participants throughout the study, conducting assessments at baseline, month 2, and yearly for 3-7 years (mean follow-up, 48.3 months). 

Participants’ depressive symptoms were evaluated at baseline and at all follow-ups, and participants underwent neuropsychological testing to assess six cognitive domains: processing speed, working memory, executive functioning, verbal memory, visual memory, and language.

To get a norm for the cognitive tests, researchers recruited a comparator group of 75 subjects similar in age, gender, and years of education, with no neuropsychiatric disorder or cognitive impairment. They completed the same assessments but not the intervention.

Study participants and assessors were blinded to treatment assignment.
 

Slower Cognitive Decline

Participants in the intervention group had a significantly slower decline in cognitive function, compared with those in the sham group (adjusted z score difference [active – sham] at month 60, 0.21; P = .006). This is equivalent to slowing cognitive decline by about 4 years, researchers reported. The intervention also showed a positive effect on executive function and verbal memory. 

“If I can push dementia from 85 to 89 years and you die at 86, in practice, I have prevented you from ever developing dementia,” Mulsant said.

The efficacy of cognitive remediation plus tDCS in rMDD could be tied to enhanced neuroplasticity, said Mulsant. 

The treatment worked well in people with a history of depression, regardless of MCI status, but was not as effective for people with just MCI, researchers noted. The intervention also did not work as well among people at genetic risk for Alzheimer’s disease.

“We don’t believe we have discovered an intervention to prevent dementia in people who are at high risk for Alzheimer disease, but we have discovered an intervention that could prevent dementia in people who have a history of depression,” said Mulsant.

These results suggest the pathways to dementia among people with MCI and rMDD are different, he added. 

Because previous research showed that either treatment alone had little efficacy, researchers said the new results indicate that combining the two may have a synergistic effect.

The ideal amount of treatment and optimal age for initiation still need to be determined, said Mulsant. The study did not include a comparator group without rMDD or MCI, so the observed cognitive benefits might be specific to people with these high-risk conditions. Another study limitation is lack of diversity in terms of ethnicity, race, and education. 
 

Promising, Important Findings

Commenting on the research, Badr Ratnakaran, MD, assistant professor and division director of geriatric psychiatry at Carilion Clinic–Virginia Tech Carilion School of Medicine, Roanoke, said the results are promising and important because there are so few treatment options for the increasing number of older patients with depression and dementia.

The side-effect profile of the combined treatment is better than that of many pharmacologic treatments, Ratnakaran noted. As more research like this emerges, he predicts that cognitive remediation and tDCS will become more readily available.

“This is telling us that the field of psychiatry, and also dementia, is progressing beyond your usual pharmacotherapy treatments,” said Ratnakaran, who also is chair of the American Psychiatric Association’s Council on Geriatric Psychiatry. 

The study received support from the Canada Brain Research Fund of Brain Canada, Health Canada, the Chagnon Family, and the Centre for Addiction and Mental Health Discovery Fund. Mulsant reported holding and receiving support from the Labatt Family Chair in Biology of Depression in Late-Life Adults at the University of Toronto; being a member of the Center for Addiction and Mental Health Board of Trustees; research support from Brain Canada, Canadian Institutes of Health Research, Center for Addiction and Mental Health Foundation, Patient-Centered Outcomes Research Institute, and National Institutes of Health; and nonfinancial support from Capital Solution Design and HappyNeuron. Ratnakaran reported no relevant conflicts.

A version of this article appeared on Medscape.com.


Rising Stroke Rates in Californians With Sickle Cell Disease

Article Type
Changed
Mon, 10/28/2024 - 15:39

 

TOPLINE:

Stroke rates in Californians with sickle cell disease (SCD) have increased in both children and adults in the post-STOP era. The cumulative incidence of first ischemic stroke was 2.1% by age 20 and 13.5% by age 60.

METHODOLOGY:

  • Researchers analyzed data from the California Department of Health Care Access and Innovation (HCAI), covering emergency department and hospitalization records from 1991 to 2019.
  • A total of 7636 patients with SCD were included in the study cohort.
  • Cumulative incidence and rates of primary and recurrent strokes and transient ischemic attacks (TIAs) were determined before and after the STOP trial.
  • Patients with SCD were identified using ICD-9 and ICD-10 codes, with specific criteria for inclusion based on hospitalization records.
  • The study used Fine and Gray methodology to calculate cumulative incidence functions, accounting for the competing risk of death (see the illustrative sketch after this list).
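
To make the competing-risks idea concrete, here is a minimal sketch of a nonparametric cumulative incidence curve (the Aalen-Johansen form), in which death before a first stroke is handled as a competing event rather than as ordinary censoring. It is an illustration only, not the study’s code: the authors fit Fine and Gray models to the HCAI records, and the event coding, toy ages, and function name below are hypothetical.

```python
import numpy as np

def cumulative_incidence(times, events, cause=1):
    """Nonparametric cumulative incidence of one event type (Aalen-Johansen form)
    when a competing event (here, death before a first stroke) can also occur.

    times  : follow-up time for each person
    events : 0 = censored, 1 = event of interest (first ischemic stroke),
             2 = competing event (death without a prior stroke) -- hypothetical coding
    """
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)

    at_risk = len(times)   # people still under observation
    surv = 1.0             # all-cause event-free survival just before the current time
    cif = 0.0
    grid, curve = [0.0], [0.0]

    for t in np.unique(times):                         # unique times in increasing order
        here = times == t
        d_cause = int(np.sum(events[here] == cause))   # events of interest at t
        d_any = int(np.sum(events[here] != 0))         # any event, interest or competing
        if d_cause:
            cif += surv * d_cause / at_risk            # probability mass reaching the event
        if d_any:
            surv *= 1.0 - d_any / at_risk              # update event-free survival
        at_risk -= int(np.sum(here))                   # drop events and censorings from the risk set
        grid.append(float(t))
        curve.append(cif)
    return np.array(grid), np.array(curve)

# Toy data (ages in years): 1 = first stroke, 2 = death before stroke, 0 = censored.
ages  = [5, 12, 20, 20, 33, 40, 47, 55, 60]
codes = [0,  1,  2,  1,  0,  2,  1,  0,  1]
t, cif = cumulative_incidence(ages, codes, cause=1)
print(np.round(cif, 3))   # running cumulative incidence of a first stroke by each age
```

Treating those deaths as censored observations instead would overstate the cumulative incidence of stroke, which is the reason a competing-risks estimator is used.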

TAKEAWAY:

  • The cumulative incidence of first ischemic stroke in patients with SCD was 2.1% by age 20 and 13.5% by age 60.
  • Ischemic stroke rates increased significantly in children and adults in the 2010-2019 period, compared with the preceding decade.
  • Risk factors for stroke and TIA included increasing age, hypertension, and hyperlipidemia.
  • The study found a significant increase in rates of intracranial hemorrhage in adults aged 18-30 years and TIAs in children younger than 18 years from 2010 to 2019, compared with the prior decade.

IN PRACTICE:

“Neurovascular complications, including strokes and transient ischemic attacks (TIAs), are common and cause significant morbidity in individuals with sickle cell disease (SCD). The STOP trial (1998) established chronic transfusions as the standard of care for children with SCD at high risk for stroke,” the study’s authors wrote.

SOURCE:

This study was led by Olubusola B. Oluwole, MD, MS, University of Pittsburgh in Pennsylvania, and was published online in Blood.

LIMITATIONS:

This study’s reliance on administrative data may have introduced systematic errors, particularly with the transition from ICD-9 to ICD-10 codes. The lack of laboratory results and medication data in the HCAI database limited the ability to fully assess patient conditions and treatments. Additionally, methodology changes in 2014 likely led to underreporting of death rates among people without PDD/EDU encounters in the calendar year preceding their death.

DISCLOSURES:

The authors reported no relevant conflicts of interest.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.


More Evidence Ties Semaglutide to Reduced Alzheimer’s Risk

Article Type
Changed
Tue, 10/29/2024 - 05:49

A new study provides real-world evidence to support the potential repurposing of glucagon-like peptide 1 receptor agonists (GLP-1 RAs), used to treat type 2 diabetes and obesity, for prevention of Alzheimer’s disease.

Adults with type 2 diabetes who were prescribed the GLP-1 RA semaglutide had a significantly lower risk for Alzheimer’s disease compared with their peers who were prescribed any of seven other antidiabetic medications, including other types of GLP-1 receptor–targeting medications. 

“These findings support further clinical trials to assess semaglutide’s potential in delaying or preventing Alzheimer’s disease,” wrote the investigators, led by Rong Xu, PhD, with Case Western Reserve School of Medicine, Cleveland, Ohio. 

The study was published online on October 24 in Alzheimer’s & Dementia.
 

Real-World Data

Semaglutide has shown neuroprotective effects in animal models of neurodegenerative diseases, including Alzheimer’s disease and Parkinson’s disease. In animal models of Alzheimer’s disease, the drug reduced beta-amyloid deposition and improved spatial learning and memory, as well as glucose metabolism in the brain. 

In a real-world analysis, Xu and colleagues used electronic health record data to identify 17,104 new users of semaglutide and 1,077,657 new users of seven other antidiabetic medications, including other GLP-1 RAs, insulin, metformin, dipeptidyl peptidase 4 inhibitors, sodium-glucose cotransporter 2 inhibitors, sulfonylurea, and thiazolidinedione.

Over 3 years, treatment with semaglutide was associated with significantly reduced risk of developing Alzheimer’s disease, most strongly compared with insulin (hazard ratio [HR], 0.33) and most weakly compared with other GLP-1 RAs (HR, 0.59). 

Compared with the other medications, semaglutide was associated with a 40%-70% reduced risk for a first-time diagnosis of Alzheimer’s disease in patients with type 2 diabetes, with similar reductions seen across obesity status, gender, and age groups, the authors reported.
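
As a quick arithmetic check (the study itself does not spell this out), the reported range follows from the hazard ratios above: the relative reduction implied by a hazard ratio is roughly 1 - HR, so 1 - 0.33 ≈ 0.67 (about a 67% lower hazard vs insulin) and 1 - 0.59 ≈ 0.41 (about a 41% lower hazard vs other GLP-1 RAs), consistent with the approximately 40%-70% range.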

The findings align with recent evidence suggesting GLP-1 RAs may protect cognitive function. 

For example, as previously reported, in the phase 2b ELAD clinical trial, adults with early-stage Alzheimer’s disease taking the GLP-1 RA liraglutide exhibited slower decline in memory and thinking and experienced less brain atrophy over 12 months compared with placebo. 
 

Promising, but Preliminary 

Reached for comment, Courtney Kloske, PhD, Alzheimer’s Association director of scientific engagement, noted that diabetes is a known risk factor for Alzheimer’s disease and that managing diabetes with drugs such as semaglutide “could benefit brain health simply by managing diabetes.”

“However, we still need large clinical trials in representative populations to determine if semaglutide specifically lowers the risk of Alzheimer’s, so it is too early to recommend it for prevention,” Kloske said. 

She noted that some research suggests that GLP-1 RAs “may help reduce inflammation and positively impact brain energy use. However, more research is needed to fully understand how these processes might contribute to preventing cognitive decline or Alzheimer’s,” Kloske cautioned. 

The Alzheimer’s Association’s “Part the Cloud” initiative has invested more than $68 million to advance 65 clinical trials targeting a variety of compounds, including repurposed drugs that may address known and potential new aspects of the disease, Kloske said. 

The study was supported by grants from the National Institute on Aging and the National Center for Advancing Translational Sciences. Xu and Kloske have no relevant conflicts.
 

A version of this article appeared on Medscape.com.
