
Blood biomarker may help predict who will develop Alzheimer’s


A blood biomarker that measures astrocyte reactivity may help determine who, among cognitively unimpaired older adults with amyloid-beta, will go on to develop Alzheimer’s disease (AD), new research suggests.

Investigators tested the blood of more than 1,000 cognitively healthy individuals with and without amyloid-beta pathology and found that only those with a combination of amyloid-beta burden and abnormal astrocyte activation subsequently progressed to AD.

“Our study argues that testing for the presence of brain amyloid along with blood biomarkers of astrocyte reactivity is the optimal screening to identify patients who are most at risk for progressing to Alzheimer’s disease,” senior investigator Tharick A. Pascoal, MD, PhD, associate professor of psychiatry and neurology, University of Pittsburgh, said in a release.

At this point, the biomarker is a research tool, but its application in clinical practice “is not very far away,” Dr. Pascoal told this news organization.

The study was published online in Nature Medicine.

Multicenter study

In AD, accumulation of amyloid-beta in the brain precedes tau pathology, but not everyone with amyloid-beta develops tau pathology and, consequently, clinical symptoms. Approximately 30% of older adults have brain amyloid, but many never progress to AD, said Dr. Pascoal.

This suggests other biological processes may trigger the deleterious effects of amyloid-beta in the early stages of AD.

Finding predictive markers of early amyloid-beta–related tau pathology would help identify cognitively normal individuals who are more likely to develop AD.

Post-mortem studies show astrocyte reactivity – changes in glial cells of the brain and spinal cord in response to an insult – is an early AD abnormality. Other research suggests a close link between amyloid-beta, astrocyte reactivity, and tau.

In addition, evidence suggests plasma measures of glial fibrillary acidic protein (GFAP) could be a strong proxy of astrocyte reactivity in the brain. Dr. Pascoal explained that when astrocytes are altered or enlarged, more GFAP is released.

The study included 1,016 cognitively normal individuals from three centers; some had amyloid pathology and some did not. Participants’ mean age was 69.6 years, and all were classified as positive or negative for astrocyte reactivity based on plasma GFAP levels.

Results showed amyloid-beta is associated with increased plasma phosphorylated tau only in individuals positive for astrocyte reactivity. In addition, analyses using PET scans showed an AD-like pattern of tau tangle accumulation as a function of amyloid-beta exclusively in those same individuals.

Early upstream event

The findings suggest abnormal astrocyte reactivity is an early upstream event that likely occurs prior to tau pathology, which is closely related to the development of neurodegeneration and cognitive decline.

It’s likely many types of insults or processes can lead to astrocyte reactivity, possibly including COVID, but more research in this area is needed, said Dr. Pascoal.

“Our study only looked at the consequence of having both amyloid and astrocyte reactivity; it did not elucidate what is causing either of them,” he said.

Although “we were able to have very good results” in the current study, additional studies are needed to better establish the cut-off for GFAP levels that signal progression, said Dr. Pascoal.

The effect of astrocyte reactivity on the association between amyloid-beta and tau phosphorylation was greater in men than in women. Dr. Pascoal noted that anti-amyloid therapies, which might be modifying the amyloid-beta–astrocyte–tau pathway, tend to have a much larger effect in men than in women.

Further studies that measure amyloid-beta, tau, and GFAP biomarkers at multiple timepoints, and with long follow-up, are needed, the investigators note.

The results may have implications for clinical trials, which have increasingly focused on individuals in the earliest preclinical phases of AD. Future studies should include cognitively normal patients who are positive for both amyloid pathology and astrocyte reactivity but have no overt p-tau abnormality, said Dr. Pascoal.

This may provide a time window for interventions very early in the disease process in those at increased risk for AD-related progression.

The study did not determine whether participants with both amyloid and astrocyte reactivity will inevitably develop AD; doing so would require longer follow-up. “Our outcome was correlation to tau in the brain, which is something we know will lead to AD,” Dr. Pascoal said.

Although the cohort represents significant socioeconomic diversity, a main limitation of the study was that subjects were mainly White, which limits the generalizability of the findings to a more diverse population.

The study received support from the National Institute on Aging; National Heart, Lung, and Blood Institute; Alzheimer’s Association; Fonds de Recherche du Québec – Santé; Canadian Consortium on Neurodegeneration in Aging; Weston Brain Institute; Colin Adair Charitable Foundation; Swedish Research Council; Wallenberg Scholar; BrightFocus Foundation; Swedish Alzheimer Foundation; Swedish Brain Foundation; Agneta Prytz-Folkes & Gösta Folkes Foundation; European Union; Swedish State Support for Clinical Research; Alzheimer’s Drug Discovery Foundation; Bluefield Project; Olav Thon Foundation; Erling-Persson Family Foundation; Stiftelsen för Gamla Tjänarinnor; Hjärnfonden, Sweden; the UK Dementia Research Institute at UCL; National Academy of Neuropsychology; Fundação de Amparo à Pesquisa do Rio Grande do Sul; and Instituto Serrapilheira.

Dr. Pascoal reports no relevant financial relationships.

A version of this article first appeared on Medscape.com.


Gout linked to smaller brain volume, higher likelihood of neurodegenerative diseases

Patients with gout may have smaller brain volumes and higher brain iron markers than people without gout, and also be more likely to develop Parkinson’s disease, probable essential tremor, and dementia, researchers in the United Kingdom report.

“We were surprised about the regions of the brain affected by gout, several of which are important for motor function. The other intriguing finding was that the risk of dementia amongst gout patients was strongly time-dependent: highest in the first 3 years after their gout diagnosis,” lead study author Anya Topiwala, BMBCh, DPhil, said in an interview.

“Our combination of traditional and genetic approaches increases the confidence that gout is causing the brain findings,” said Dr. Topiwala, a clinical research fellow and consultant psychiatrist in the Nuffield Department of Population Health at the University of Oxford, England.

“We suggest that clinicians be vigilant for cognitive and motor problems after gout diagnosis, particularly in the early stages,” she added.

Links between gout and neurodegenerative diseases debated in earlier studies

Gout, the most common inflammatory arthritis, affects around 1%-4% of people, the authors wrote, with monosodium urate crystal deposits causing acute flares of pain and swelling in joints and periarticular tissues.

Whether and how gout may affect the brain has been debated in the literature. Gout and hyperuricemia have been linked with elevated stroke risk, and although observational studies have linked hyperuricemia with lower dementia risk, especially for Alzheimer’s disease, Mendelian randomization studies of Alzheimer’s disease have had conflicting results.
 

A novel approach that analyzes brain structure and genetics

In a study published in Nature Communications, Dr. Topiwala and her colleagues combined observational and Mendelian randomization techniques to explore relationships between gout and neurodegenerative diseases. They analyzed data from over 303,000 volunteer participants between 40 and 69 years of age recruited between 2006 and 2010 to contribute their detailed genetic and health information to the U.K. Biobank, a large-scale biomedical database and research resource.

Patients with gout tended to be older and male. At baseline, all participants’ serum urate levels were measured, and 30.8% of patients with gout reported that they currently used urate-lowering therapy.
 

MRI shows brain changes in patients with gout

In what the authors said is the first investigation of neuroimaging markers in patients with gout, they compared gray matter volumes between the 1,165 participants with gout and the 32,202 controls without gout who had MRI data.

They found no marked sex differences in associations. Urate was inversely linked with global brain volume and with gray and white matter volumes, and gout appeared to age global gray matter by 2 years.

Patients with gout and higher urate showed significant differences in regional gray matter volumes, especially in the cerebellum, pons, and midbrain, as well as subcortical differences in the nucleus accumbens, putamen, and caudate. They also showed significant differences in white matter tract microstructure in the fornix.

Patients with gout were more likely to develop dementia (average hazard ratio [HR] over study = 1.60), especially in the first 3 years after gout diagnosis (HR = 7.40). They were also at higher risk for vascular dementia (average HR = 2.41), compared with all-cause dementia, but not for Alzheimer’s disease (average HR = 1.62).

In asymptomatic participants, though, urate and dementia were inversely linked (HR = 0.85), with no time dependence.

Gout was linked with higher incidence of Parkinson’s disease (HR = 1.43) and probable essential tremor (HR = 6.75). In asymptomatic participants, urate and Parkinson’s disease (HR = 0.89), but not probable essential tremor, were inversely linked.

Genetic analyses reinforce MRI results

Using Mendelian randomization estimates, the authors found that genetic links generally reflected their observational findings. Both genetically predicted gout and serum urate were significantly linked with regional gray matter volumes, including the cerebellum, midbrain, pons, and brainstem.

They also found significant links with higher magnetic susceptibility in the putamen and caudate, markers of higher iron. But while genetically predicted gout was significantly linked with global gray matter volume, urate was not.

In males, but not in females, urate was positively linked with alcohol intake and lower socioeconomic status.

Dr. Topiwala acknowledged several limitations to the study, writing that “the results from the volunteer participants may not apply to other populations; the cross-sectional serum urate measurements may not reflect chronic exposure; and Parkinson’s disease and essential tremor may have been diagnostically confounded.”
 

A novel approach that suggests further related research

Asked to comment on the study, Puja Khanna, MD, MPH, a rheumatologist and clinical associate professor of medicine at the University of Michigan, Ann Arbor, called its novel use of neuroimaging interesting.

Dr. Khanna, who was not involved in the study, said she would like to know more about the role that horizontal pleiotropy – one genetic variant having independent effects on multiple traits – plays in this disease process, and about the impact of the antioxidative properties of urate in maintaining neuroprotection.

“[The] U.K. Biobank is an excellent database to look at questions of association,” John D. FitzGerald, MD, PhD, MPH, MBA, professor and clinical chief of rheumatology at the University of California, Los Angeles, said in an interview.

“This is a fairly rigorous study,” added Dr. FitzGerald, also not involved in the study. “While it has lots of strengths,” including its large sample size and Mendelian randomization, it also has “abundant weaknesses,” he added. “It is largely cross-sectional, with single urate measurement and single brain MRI.”

“Causation is the big question,” Dr. FitzGerald noted. “Does treating gout (or urate) help prevent dementia or neurodegenerative decline?”

Early diagnosis benefits patients

Dr. Khanna and Dr. FitzGerald joined the authors in advising doctors to monitor their gout patients for cognitive and motor symptoms of neurodegenerative disease.

“It is clearly important to pay close attention to the neurologic exam and history in gout, especially because it is a disease of the aging population,” Dr. Khanna advised. “Addressing dementia when gout is diagnosed can lead to prompt mitigation strategies that can hugely impact patients.”

Dr. Topiwala and her colleagues would like to investigate why the dementia risk was time-dependent. “Is this because of the acute inflammatory response in gout, or could it just be that patients with gout visit their doctors more frequently, so any cognitive problems are picked up sooner?” she asked.

The authors, and Dr. Khanna and Dr. FitzGerald, report no relevant financial relationships. The Wellcome Trust; the U.K. Medical Research Council; the European Commission Horizon 2020 research and innovation program; the British Heart Foundation; the U.S. National Institutes of Health; the Engineering and Physical Sciences Research Council; and the National Institute for Health and Care Research funded the study.

Publications
Topics
Sections

 

Patients with gout may have smaller brain volumes and higher brain iron markers than people without gout, and also be more likely to develop Parkinson’s disease, probable essential tremor, and dementia, researchers in the United Kingdom report.

“We were surprised about the regions of the brain affected by gout, several of which are important for motor function. The other intriguing finding was that the risk of dementia amongst gout patients was strongly time-dependent: highest in the first 3 years after their gout diagnosis,” lead study author Anya Topiwala, BMBCh, DPhil, said in an interview.

Dr. Anya Topiwala

“Our combination of traditional and genetic approaches increases the confidence that gout is causing the brain findings,” said Dr. Topiwala, a clinical research fellow and consultant psychiatrist in the Nuffield Department of Population Health at the University of Oxford, England.

“We suggest that clinicians be vigilant for cognitive and motor problems after gout diagnosis, particularly in the early stages,” she added.


 

Links between gout and neurodegenerative diseases debated in earlier studies

Gout, the most common inflammatory arthritis, affects around 1%-4% of people, the authors wrote, with monosodium urate crystal deposits causing acute flares of pain and swelling in joints and periarticular tissues.

Whether and how gout may affect the brain has been debated in the literature. Gout and hyperuricemia have been linked with elevated stroke risk; and although observational studies have linked hyperuricemia with lower dementia risk, especially Alzheimer’s disease, Mendelian randomization studies have had conflicting results in Alzheimer’s disease.
 

A novel approach that analyzes brain structure and genetics

In a study published in Nature Communications, Dr. Topiwala and her colleagues combined observational and Mendelian randomization techniques to explore relationships between gout and neurodegenerative diseases. They analyzed data from over 303,000 volunteer participants between 40 and 69 years of age recruited between 2006 and 2010 to contribute their detailed genetic and health information to the U.K. Biobank, a large-scale biomedical database and research resource.

Patients with gout tended to be older and male. At baseline, all participants’ serum urate levels were measured, and 30.8% of patients with gout reported that they currently used urate-lowering therapy.
 

MRI shows brain changes in patients with gout

In what the authors said is the first investigation of neuroimaging markers in patients with gout, they compared differences in gray matter volumes found in the 1,165 participants with gout and the 32,202 controls without gout who had MRI data.

They found no marked sex differences in associations. Urate was inversely linked with global brain volume and with gray and white matter volumes, and gout appeared to age global gray matter by 2 years.

Patients with gout and higher urate showed significant differences in regional gray matter volumes, especially in the cerebellum, pons, and midbrain, as well as subcortical differences in the nucleus accumbens, putamen, and caudate. They also showed significant differences in white matter tract microstructure in the fornix.

Patients with gout were more likely to develop dementia (average hazard ratio [HR] over study = 1.60), especially in the first 3 years after gout diagnosis (HR = 7.40). They were also at higher risk for vascular dementia (average HR = 2.41), compared with all-cause dementia, but not for Alzheimer’s disease (average HR = 1.62).

In asymptomatic participants though, urate and dementia were inversely linked (HR = 0.85), with no time dependence.

Gout was linked with higher incidence of Parkinson’s disease (HR = 1.43) and probable essential tremor (HR = 6.75). In asymptomatic participants, urate and Parkinson’s disease (HR = 0.89), but not probable essential tremor, were inversely linked.
 

 

 

Genetic analyses reinforce MRI results

Using Mendelian randomization estimates, the authors found that genetic links generally reflected their observational findings. Both genetically predicted gout and serum urate were significantly linked with regional gray matter volumes, including cerebellar, midbrain, pons, and brainstem.

Patients with gout may have smaller brain volumes and higher brain iron markers than people without gout, and also be more likely to develop Parkinson’s disease, probable essential tremor, and dementia, researchers in the United Kingdom report.

“We were surprised about the regions of the brain affected by gout, several of which are important for motor function. The other intriguing finding was that the risk of dementia amongst gout patients was strongly time-dependent: highest in the first 3 years after their gout diagnosis,” lead study author Anya Topiwala, BMBCh, DPhil, said in an interview.


“Our combination of traditional and genetic approaches increases the confidence that gout is causing the brain findings,” said Dr. Topiwala, a clinical research fellow and consultant psychiatrist in the Nuffield Department of Population Health at the University of Oxford, England.

“We suggest that clinicians be vigilant for cognitive and motor problems after gout diagnosis, particularly in the early stages,” she added.


 

Links between gout and neurodegenerative diseases debated in earlier studies

Gout, the most common inflammatory arthritis, affects around 1%-4% of people, the authors wrote, with monosodium urate crystal deposits causing acute flares of pain and swelling in joints and periarticular tissues.

Whether and how gout may affect the brain has been debated in the literature. Gout and hyperuricemia have been linked with elevated stroke risk, and although observational studies have linked hyperuricemia with lower dementia risk, especially Alzheimer’s disease, Mendelian randomization studies of Alzheimer’s disease have yielded conflicting results.
 

A novel approach that analyzes brain structure and genetics

In a study published in Nature Communications, Dr. Topiwala and her colleagues combined observational and Mendelian randomization techniques to explore relationships between gout and neurodegenerative diseases. They analyzed data from over 303,000 volunteer participants between 40 and 69 years of age recruited between 2006 and 2010 to contribute their detailed genetic and health information to the U.K. Biobank, a large-scale biomedical database and research resource.

Patients with gout tended to be older and male. At baseline, all participants’ serum urate levels were measured, and 30.8% of patients with gout reported that they currently used urate-lowering therapy.
 

MRI shows brain changes in patients with gout

In what the authors said is the first investigation of neuroimaging markers in patients with gout, they compared differences in gray matter volumes found in the 1,165 participants with gout and the 32,202 controls without gout who had MRI data.

They found no marked sex differences in associations. Urate was inversely linked with global brain volume and with gray and white matter volumes, and gout appeared to age global gray matter by 2 years.

Patients with gout and higher urate showed significant differences in regional gray matter volumes, especially in the cerebellum, pons, and midbrain, as well as subcortical differences in the nucleus accumbens, putamen, and caudate. They also showed significant differences in white matter tract microstructure in the fornix.

Patients with gout were more likely to develop dementia (average hazard ratio [HR] over the study period = 1.60), especially in the first 3 years after gout diagnosis (HR = 7.40). The excess risk was greater for vascular dementia (average HR = 2.41) than for all-cause dementia, whereas the association with Alzheimer’s disease was not significant (average HR = 1.62).

In asymptomatic participants, though, urate and dementia were inversely linked (HR = 0.85), with no time dependence.

Gout was linked with higher incidence of Parkinson’s disease (HR = 1.43) and probable essential tremor (HR = 6.75). In asymptomatic participants, urate and Parkinson’s disease (HR = 0.89), but not probable essential tremor, were inversely linked.
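For readers who want a feel for what hazard ratios of this size imply in absolute terms, here is a minimal proportional-hazards sketch. The baseline hazard below is an assumed, illustrative figure; the study does not report one.

```python
import math

def cumulative_risk(baseline_annual_hazard, hr, years):
    """Cumulative event risk under proportional hazards with a constant
    baseline hazard: risk(t) = 1 - exp(-h0 * HR * t)."""
    return 1.0 - math.exp(-baseline_annual_hazard * hr * years)

# Assumed baseline dementia hazard of 0.5% per year -- illustrative only,
# not a figure from the study.
h0 = 0.005
for label, hr in [("no gout", 1.0),
                  ("gout, study average", 1.60),
                  ("gout, first 3 years", 7.40)]:
    print(f"{label}: 3-year risk = {cumulative_risk(h0, hr, 3):.1%}")
```

The point of the sketch is that a time-dependent HR of 7.40 multiplies a small baseline risk severalfold in the window right after diagnosis, which is why the authors flag the first 3 years.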

Genetic analyses reinforce MRI results

Using Mendelian randomization estimates, the authors found that genetic links generally reflected their observational findings. Both genetically predicted gout and serum urate were significantly linked with regional gray matter volumes, including cerebellar, midbrain, pons, and brainstem.
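Mendelian randomization estimates of this kind are typically built by combining per-variant effect ratios. Below is a minimal inverse-variance-weighted (IVW) sketch; the summary statistics are hypothetical, not the study’s data.

```python
import numpy as np

def ivw_mr(beta_exposure, beta_outcome, se_outcome):
    """Inverse-variance-weighted MR: combine per-variant Wald ratios
    (beta_outcome / beta_exposure), weighting each by the inverse variance
    of its ratio (first-order delta-method approximation)."""
    bx = np.asarray(beta_exposure, dtype=float)
    by = np.asarray(beta_outcome, dtype=float)
    se = np.asarray(se_outcome, dtype=float)
    ratios = by / bx                      # per-variant causal estimates
    se_ratios = se / np.abs(bx)           # approximate SE of each ratio
    weights = 1.0 / se_ratios**2
    estimate = float(np.sum(weights * ratios) / np.sum(weights))
    se_ivw = float(np.sqrt(1.0 / np.sum(weights)))
    return estimate, se_ivw

# Hypothetical effects of three urate-associated variants on an imaging outcome.
est, se = ivw_mr([0.10, 0.15, 0.08], [-0.02, -0.03, -0.015], [0.005, 0.006, 0.004])
print(f"IVW causal estimate: {est:.3f} (SE {se:.3f})")
```

Because the genetic variants are fixed at conception, an estimate built this way is less vulnerable to reverse causation than the observational associations reported above.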

They also found significant links with higher magnetic susceptibility in the putamen and caudate, markers of higher iron. But while genetically predicted gout was significantly linked with global gray matter volume, urate was not.

In males, but not in females, urate was positively linked with alcohol intake and lower socioeconomic status.

Dr. Topiwala acknowledged several limitations to the study, writing that “the results from the volunteer participants may not apply to other populations; the cross-sectional serum urate measurements may not reflect chronic exposure; and Parkinson’s disease and essential tremor may have been diagnostically confounded.”
 

A novel approach that suggests further related research

Asked to comment on the study, Puja Khanna, MD, MPH, a rheumatologist and clinical associate professor of medicine at the University of Michigan, Ann Arbor, called its novel use of neuroimaging interesting.


Dr. Khanna, who was not involved in the study, said she would like to know more about the role that horizontal pleiotropy – one genetic variant having independent effects on multiple traits – plays in this disease process, and about the impact of the antioxidative properties of urate in maintaining neuroprotection.

“[The] U.K. Biobank is an excellent database to look at questions of association,” John D. FitzGerald, MD, PhD, MPH, MBA, professor and clinical chief of rheumatology at the University of California, Los Angeles, said in an interview.



“This is a fairly rigorous study,” added Dr. FitzGerald, also not involved in the study. “While it has lots of strengths,” including its large sample size and Mendelian randomization, it also has “abundant weaknesses,” he added. “It is largely cross-sectional, with single urate measurement and single brain MRI.”

“Causation is the big question,” Dr. FitzGerald noted. “Does treating gout (or urate) help prevent dementia or neurodegenerative decline?”


 

Early diagnosis benefits patients

Dr. Khanna and Dr. FitzGerald joined the authors in advising doctors to monitor their gout patients for cognitive and motor symptoms of neurodegenerative disease.

“It is clearly important to pay close attention to the neurologic exam and history in gout, especially because it is a disease of the aging population,” Dr. Khanna advised. “Addressing dementia when gout is diagnosed can lead to prompt mitigation strategies that can hugely impact patients.”

Dr. Topiwala and her colleagues would like to investigate why the dementia risk was time-dependent. “Is this because of the acute inflammatory response in gout, or could it just be that patients with gout visit their doctors more frequently, so any cognitive problems are picked up sooner?” she asked.

The authors, and Dr. Khanna and Dr. FitzGerald, report no relevant financial relationships. The Wellcome Trust; the U.K. Medical Research Council; the European Commission Horizon 2020 research and innovation program; the British Heart Foundation; the U.S. National Institutes of Health; the Engineering and Physical Sciences Research Council; and the National Institute for Health and Care Research funded the study.


FROM NATURE COMMUNICATIONS


Game-changing Alzheimer’s research: The latest on biomarkers


The field of neurodegenerative dementias, particularly Alzheimer’s disease (AD), has been revolutionized by the development of imaging and cerebrospinal fluid biomarkers and is on the brink of a new development: emerging plasma biomarkers. Research now recognizes the relationship between the cognitive-behavioral syndromic diagnosis (that is, the illness) and the etiologic diagnosis (the disease) – and the need to consider each separately when developing a diagnostic formulation. The National Institute on Aging and Alzheimer’s Association Research Framework uses the amyloid, tau, and neurodegeneration system to define AD biologically in living patients. Here is an overview of the framework, which requires biomarker evidence of amyloid plaques (amyloid positivity) and neurofibrillary tangles (tau positivity), with evidence of neurodegeneration (neurodegeneration positivity) to support the diagnosis.

The diagnostic approach for symptomatic patients

The differential diagnosis in symptomatic patients with mild cognitive impairment (MCI), mild behavioral impairment, or dementia is broad and includes multiple neurodegenerative diseases (for example, AD, frontotemporal lobar degeneration, dementia with Lewy bodies, argyrophilic grain disease, hippocampal sclerosis); vascular ischemic brain injury (for example, stroke); tumors; infectious, inflammatory, paraneoplastic, or demyelinating diseases; trauma; hydrocephalus; toxic/metabolic insults; and other rare diseases. The patient’s clinical syndrome narrows the differential diagnosis.

Once the clinician has a prioritized differential diagnosis of the brain disease or condition that is probably causing or contributing to the patient’s signs and symptoms, they can then select appropriate assessments and tests, typically starting with a laboratory panel and brain MRI. Strong evidence backed by practice recommendations also supports the use of fluorodeoxyglucose PET as a marker of functional brain abnormalities associated with dementia. Although molecular biomarkers are typically considered at the later stage of the clinical workup, the anticipated future availability of plasma biomarkers will probably change the timing of molecular biomarker assessment in patients with suspected cognitive impairment owing to AD.
 

Molecular PET biomarkers

Three PET tracers approved by the U.S. Food and Drug Administration for the detection of cerebral amyloid plaques have high sensitivity (89%-98%) and specificity (88%-100%), compared with autopsy, the gold standard diagnostic tool. However, these scans are costly and are not reimbursed by Medicare and Medicaid. Because all amyloid PET scans are covered by the Veterans Administration, this test is more readily accessible for patients receiving VA benefits.
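Sensitivity and specificity alone do not tell a clinician the probability that a positive scan reflects true amyloid pathology; that also depends on pretest prevalence. A quick Bayes-style sketch, using the midpoints of the reported ranges and an assumed 40% prevalence (an illustrative figure, not one from the article):

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Post-test probabilities from test characteristics and pretest prevalence."""
    tp = sensitivity * prevalence
    fp = (1 - specificity) * (1 - prevalence)
    fn = (1 - sensitivity) * prevalence
    tn = specificity * (1 - prevalence)
    ppv = tp / (tp + fp)  # probability a positive scan is a true positive
    npv = tn / (tn + fn)  # probability a negative scan is a true negative
    return ppv, npv

# Midpoints of the reported sensitivity/specificity ranges; the 40% prevalence
# is an assumed value for a memory-clinic population, not a number from the text.
ppv, npv = predictive_values(sensitivity=0.935, specificity=0.94, prevalence=0.40)
print(f"PPV = {ppv:.1%}, NPV = {npv:.1%}")
```

The same test characteristics yield much lower positive predictive value in a low-prevalence screening population, which is one reason appropriate-use criteria restrict the test to selected patients.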

The appropriate-use criteria developed by the Amyloid Imaging Task Force recommend amyloid PET for patients with persistent or progressive MCI or dementia. In such patients, a negative amyloid PET scan would strongly weigh against AD, supporting a differential diagnosis of other etiologies. Although a positive amyloid PET scan in patients with MCI or dementia indicates the presence of amyloid plaques, it does not necessarily confirm AD as the cause. Cerebral amyloid plaques may coexist with other pathologies and increase with age, even in cognitively normal individuals.

The IDEAS study looked at the clinical utility of amyloid PET in a real-world dementia specialist setting. In the study, dementia subspecialists documented their presumed etiologic diagnosis (and level of confidence) before and after amyloid PET. Of the 11,409 patients who completed the study, the etiologic diagnosis changed from AD to non-AD in just over 25% of cases and from non-AD to AD in 10.5%. Clinical management changed in about 60% of patients with MCI and 63.5% of patients with dementia.

In May 2020, the FDA approved flortaucipir F-18, the first diagnostic tau radiotracer for use with PET to estimate the density and distribution of aggregated tau neurofibrillary tangles in adults with cognitive impairment undergoing evaluation for AD. Regulatory approval of flortaucipir F-18 was based on findings from two clinical trials of terminally ill patients who were followed to autopsy. The studies included patients with a spectrum of clinically diagnosed dementias and those with normal cognition. The primary outcome of the studies was accurate visual interpretation of the images in detecting advanced AD tau neurofibrillary tangle pathology (Braak stage V or VI tau pathology). Sensitivity of five trained readers ranged from 68% to 86%, and specificity ranged from 63% to 100%; interrater agreement was 0.87. Tau PET is not yet reimbursed and is therefore not yet readily available in the clinical setting. Moreover, appropriate-use criteria have not yet been published.

Molecular fluid biomarkers

Cerebrospinal fluid (CSF) analysis is currently the most readily available and reimbursed test to aid in diagnosing AD, with appropriate-use criteria for patients with suspected AD. CSF biomarkers for AD are useful in cognitively impaired patients when the etiologic diagnosis is equivocal, there is only an intermediate level of diagnostic confidence, or there is very high confidence in the etiologic diagnosis. Testing for CSF biomarkers is also recommended for patients at very early clinical stages (for example, early MCI) or with atypical clinical presentations.

A decreased concentration of amyloid-beta 42 in CSF is a marker of amyloid neuritic plaques in the brain. An increased concentration of total tau in CSF reflects injury to neurons, and an increased concentration of specific isoforms of hyperphosphorylated tau reflects neurofibrillary tangles. Presently, the ratios of t-tau to amyloid-beta 42, amyloid-beta 42 to amyloid-beta 40, and phosphorylated-tau 181 to amyloid-beta 42 are the best-performing markers of AD neuropathologic changes and are more accurate than assessing individual biomarkers. These CSF biomarkers of AD have been validated against autopsy, and ratio values of CSF amyloid-beta 42 have been further validated against amyloid PET, with overall sensitivity and specificity of approximately 90% and 84%, respectively.
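The ratio markers described above are simple to compute once the assay concentrations are in hand. In the sketch below, the patient values and cutoffs are placeholders for illustration; real thresholds are assay- and laboratory-specific and must come from a validated protocol.

```python
def csf_ratios(abeta42, abeta40, ttau, ptau181):
    """Compute the three ratio markers discussed above from CSF
    concentrations (all values in the same units, e.g., pg/mL)."""
    return {
        "ttau/abeta42": ttau / abeta42,
        "abeta42/abeta40": abeta42 / abeta40,
        "ptau181/abeta42": ptau181 / abeta42,
    }

# Hypothetical patient values and placeholder cutoffs -- NOT validated thresholds.
r = csf_ratios(abeta42=600, abeta40=11000, ttau=350, ptau181=25)
amyloid_positive = r["abeta42/abeta40"] < 0.065  # low ratio suggests plaques
tau_positive = r["ptau181/abeta42"] > 0.024      # high ratio suggests tangles
print(r, amyloid_positive, tau_positive)
```

Note that the amyloid marker is flagged when the ratio is *low* (amyloid-beta 42 is sequestered in plaques), whereas the tau markers are flagged when *high*, mirroring the direction of the concentration changes described above.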

Some of the most exciting recent advances in AD center around the measurement of these proteins and others in plasma. Appropriate-use criteria for plasma biomarkers in the evaluation of patients with cognitive impairment were published in 2022. In addition to their use in clinical trials, these criteria cautiously recommend using these biomarkers in specialized memory clinics in the diagnostic workup of patients with cognitive symptoms, along with confirmatory CSF markers or PET. Additional data are needed before plasma biomarkers of AD are used as standalone diagnostic markers or considered in the primary care setting.

We have made remarkable progress toward more precise molecular diagnosis of brain diseases underlying cognitive impairment and dementia. Ongoing efforts to evaluate the utility of these measures in clinical practice include the need to increase diversity of patients and providers. Ultimately, the tremendous progress in molecular biomarkers for the diseases causing dementia will help the field work toward our common goal of early and accurate diagnosis, better management, and hope for people living with these diseases.

Bradford C. Dickerson, MD, MMSc, is a professor, department of neurology, Harvard Medical School, and director, Frontotemporal Disorders Unit, department of neurology, at Massachusetts General Hospital, both in Boston.

A version of this article first appeared on Medscape.com.


Flavanol supplement improves memory in adults with poor diets

Taking a daily flavanol supplement improves hippocampal-dependent memory in older adults who have a relatively poor diet, results of a large new study suggest.

There’s increasing evidence that certain nutrients are important for the aging body and brain, study investigator Scott Small, MD, the Boris and Rose Katz Professor of Neurology, Columbia University Vagelos College of Physicians and Surgeons, New York, told this news organization.

“With this new study, I think we can begin to say flavanols might be the first one that really is a nutrient for the aging brain.”

These findings, said Dr. Small, represent “the beginning of a new era” that will eventually lead to formal recommendations related to ideal intake of flavanols to reduce cognitive aging.

The findings were published online in the Proceedings of the National Academy of Sciences.
 

Better cognitive aging

Cognitive aging refers to the decline in cognitive abilities that are not thought to be caused by neurodegenerative diseases such as Alzheimer’s disease and Parkinson’s disease. Cognitive aging targets two areas of the brain: the hippocampus, which is related to memory function, and the prefrontal cortex, which is related to attention and executive function.

Previous research has linked flavanols, which are found in foods such as apples, pears, berries, and cocoa beans, to improved cognitive aging. The evidence suggests that consuming these nutrients is associated particularly with the hippocampal-dependent memory component of cognitive aging.

The new study, known as COcoa Supplement and Multivitamin Outcomes Study-Web (COSMOS-Web), included 3,562 generally healthy men and women, mean age 71 years, who were mostly well-educated and non-Hispanic/non-Latinx White individuals.

Participants were randomly assigned to receive oral flavanol-containing cocoa extract (500 mg of cocoa flavanols, including 80 mg of epicatechin) or a placebo daily.

The primary endpoint was hippocampal-dependent memory at year 1 as assessed with the ModRey, a neuropsychological test designed to measure hippocampal function.

Results showed participants in both groups had a typical learning (practice) effect, with similar improvements (d = 0.025; P = .42).

Researchers used other tests to measure cognition: the Color/Directional Flanker Task, a measure of prefrontal cortex function, and the ModBent, a measure that’s sensitive to dentate gyrus function. The flavanol intervention did not affect ModBent results or performance on the Flanker test after 1 year.

However, it was a different story for those with a poor diet at baseline. Researchers stratified participants into tertiles on the basis of diet quality as measured by the Healthy Eating Index (HEI) scores. Those in the lowest tertile had poorer baseline hippocampal-dependent memory performance but not memory related to the prefrontal cortex.

The flavanol intervention improved performance on the ModRey test, compared with placebo in participants in the low HEI tertile (overall effect: d = 0.086; P = .011) but not among those with a medium or high HEI at baseline.

“We confirmed that the flavanol intervention only benefits people who are relatively deficient at baseline,” said Dr. Small.

The correlation with hippocampal-dependent memory was confirmed in a subset of 1,361 study participants who provided a urine sample. Researchers measured urinary 5-(3′,4′-dihydroxyphenyl)-gamma-valerolactone metabolite (gVLM) concentrations, a validated biomarker of flavanol consumption.

After stratifying these results into tertiles, researchers found performance on the ModRey was significantly improved with the dietary flavanol intervention (overall effect: d = 0.141; P = .006) in the lowest gVLM tertile.

Memory restored

When participants in the lowest tertile consumed the supplement, “their flavanol levels went back to normal, and when that happened, their memory was restored,” said Dr. Small.

It appears that there is a sort of ceiling effect to the flavanol benefits. “It seems what you need to do is normalize your flavanol levels; if you go above normal, there was no evidence that your memory keeps on getting better,” said Dr. Small.

The study included only older adults, so it’s unclear what the impact of flavanol supplementation is in younger adults. But cognitive aging “begins its slippery slide” in the 40s, said Dr. Small. “If this is truly a nutrient that is taken to prevent that slide from happening, it might be beneficial to start in our 40s.”

He recognized that the effect size is not large but said this is “very dependent” on baseline factors and most study participants had a rather healthy diet. “None of our participants were really highly deficient” in flavanols, he said.

“To see a stronger effect size, we need to do another study where we recruit people who are very low, truly deficient, in flavanols, and then see what happens.”

Showing that flavanols are linked to the hippocampal and not to the prefrontal component of cognitive aging “speaks to the mechanism,” said Dr. Small.

Though the exact mechanism linking flavanols with enhanced memory isn’t clear, there are some clues; for example, research suggests cognitive aging affects the dentate gyrus, a subregion of the hippocampus.

The flavanol supplements were well tolerated. “I can say with close to certainty that this is very safe,” said Dr. Small, adding the flavanols have now been used in numerous studies.

The findings suggest flavanol consumption might be part of future dietary guidelines. “I suspect that once there is sufficient evidence, flavanols will be part of the dietary recommendations for healthy aging,” said Dr. Small.
 

A word of caution

Heather M. Snyder, PhD, vice president of medical and scientific relations, Alzheimer’s Association, said that though science suggests a balanced diet is good for overall brain health, no single food, beverage, ingredient, vitamin, or supplement has yet been proven to prevent dementia, treat or cure Alzheimer’s, or benefit cognitive function or brain health.

Experts agree the best source of vitamins and other nutrients is from whole foods as part of a balanced diet. “We recognize that, for a variety of reasons, this may not always be possible,” said Dr. Snyder.

However, she noted, dietary supplements are not subject to the same rigorous review and regulation process as medications.

“The Alzheimer’s Association strongly encourages individuals to have conversations with their physicians about all medications and dietary supplements they are currently taking or interested in starting.” 

COSMOS is supported by an investigator-initiated grant from Mars Edge, a segment of Mars, a company engaged in flavanol research and flavanol-related commercial activities, which included infrastructure support and the donation of study pills and packaging. Dr. Small reports receiving an unrestricted research grant from Mars.

A version of this article first appeared on Medscape.com.

CMS to cover Alzheimer’s drugs after traditional FDA okay

The Centers for Medicare & Medicaid Services has announced that Medicare will cover drugs designed to slow Alzheimer’s disease once they receive traditional approval by the Food and Drug Administration.

The one caveat is that CMS will require physicians to participate in registries that collect evidence about how these drugs work in the real world.

Physicians will be able to submit this evidence through a nationwide, CMS-facilitated portal that will be available when any product gains traditional approval and will collect information via an easy-to-use format.

“If the FDA grants traditional approval, then Medicare will cover it in appropriate settings that also support the collection of real-world information to study the usefulness of these drugs for people with Medicare,” the CMS says in a news release.

“CMS has always been committed to helping people obtain timely access to innovative treatments that meaningfully improve care and outcomes for this disease,” added CMS Administrator Chiquita Brooks-LaSure.

“If the FDA grants traditional approval, CMS is prepared to ensure anyone with Medicare Part B who meets the criteria is covered,” Ms. Brooks-LaSure explained.

The CMS says broader Medicare coverage for an Alzheimer’s drug would begin on the same day the FDA grants traditional approval. Under CMS’ current coverage policy, if the FDA grants traditional approval to other drugs in this class, they would also be eligible for broader coverage.

Currently, two drugs in this class – aducanumab (Aduhelm) and lecanemab (Leqembi) – have received accelerated approval from the FDA, but no product has received traditional approval.

Lecanemab might be the first to cross the line.

On June 9, the FDA Peripheral and Central Nervous System Drugs Advisory Committee will discuss results of a confirmatory trial of lecanemab, with a potential decision on traditional approval expected shortly thereafter.

A version of this article first appeared on Medscape.com.

Younger age of type 2 diabetes onset linked to dementia risk

People who develop type 2 diabetes before age 60 years are at threefold greater risk for dementia compared with those who don’t develop diabetes, new findings suggest.

Moreover, the new data from the prospective Atherosclerosis Risk in Communities (ARIC) cohort also suggest that the previously identified increased risk for dementia among people with prediabetes appears to be entirely explained by the subset who go on to develop type 2 diabetes.

“Our findings suggest that preventing prediabetes progression, especially in younger individuals, may be an important way to reduce the dementia burden,” wrote PhD student Jiaqi Hu of Johns Hopkins University, Baltimore, and colleagues. Their article was published online in Diabetologia.

The result builds on previous findings linking dysglycemia and cognitive decline, the study’s lead author, Elizabeth Selvin, PhD, of the Bloomberg School of Public Health at Johns Hopkins, said in an interview.  

“Our prior work in the ARIC study suggests that improving glucose control could help prevent dementia in later life,” she said.  

Other studies have also linked higher A1c levels and diabetes in midlife to increased rates of cognitive decline. In addition, Dr. Selvin noted, “There is growing evidence that focusing on vascular health, especially focusing on diabetes and blood pressure, in midlife can stave off dementia in later life.”

People who develop type 2 diabetes before age 60 years are at threefold greater risk for dementia compared with those who don’t develop diabetes, new findings suggest.

Moreover, new data from the prospective Atherosclerosis Risk in Communities (ARIC) cohort suggest that the previously identified increased risk for dementia among people with prediabetes appears to be entirely explained by the subset who go on to develop type 2 diabetes.

“Our findings suggest that preventing prediabetes progression, especially in younger individuals, may be an important way to reduce the dementia burden,” wrote PhD student Jiaqi Hu of Johns Hopkins University, Baltimore, and colleagues. Their article was published online in Diabetologia.

The result builds on previous findings linking dysglycemia and cognitive decline, the study’s lead author, Elizabeth Selvin, PhD, of the Bloomberg School of Public Health at Johns Hopkins, said in an interview.  

“Our prior work in the ARIC study suggests that improving glucose control could help prevent dementia in later life,” she said.  


Other studies have also linked higher A1c levels and diabetes in midlife to increased rates of cognitive decline. In addition, Dr. Selvin noted, “There is growing evidence that focusing on vascular health, especially focusing on diabetes and blood pressure, in midlife can stave off dementia in later life.”

This new study is the first to examine the role of incident diabetes in the relationship between prediabetes and dementia, as well as the effect of age at diabetes onset on subsequent dementia risk.
 

Prediabetes linked to dementia via diabetes development

Of the 11,656 ARIC participants without diabetes at baseline during 1990-1992 (age 46-70 years), 20.0% had prediabetes (defined as A1c 5.7%-6.4% or 39-46 mmol/mol). During a median follow-up of 15.9 years, 3,143 participants developed diabetes: 44.6% of those with prediabetes at baseline versus 22.5% of those without.

Dementia developed in 2,247 participants over a median follow-up of 24.7 years. The cumulative incidence of dementia was 23.9% among those who developed diabetes versus 20.5% among those who did not.

After adjustment for demographics and for the Alzheimer’s disease–linked apolipoprotein E (APOE) gene, prediabetes was significantly associated with incident dementia (hazard ratio [HR], 1.19). However, significance disappeared after adjustment for incident diabetes (HR, 1.09), the researchers reported.  
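This attenuation, an association that weakens once incident diabetes is accounted for, is the pattern expected when diabetes mediates the prediabetes-dementia link. A toy cohort simulation can illustrate the logic; all rates and effect sizes below are invented placeholders for illustration, not ARIC estimates:

```python
# Toy sketch (not the authors' analysis): simulate a cohort in which
# prediabetes raises dementia risk ONLY by increasing the chance of
# progressing to diabetes, then compare the prediabetes-dementia
# association before and after conditioning on diabetes status.
# All probabilities are illustrative assumptions.
import random

random.seed(0)

N = 200_000
P_PREDIABETES = 0.20      # assumed baseline prevalence of prediabetes
DEMENTIA_BASE = 0.20      # assumed dementia risk without diabetes
DEMENTIA_DIABETES = 0.30  # assumed dementia risk once diabetes develops


def simulate(has_prediabetes):
    """Return (developed_diabetes, developed_dementia) for one person."""
    p_diabetes = 0.45 if has_prediabetes else 0.22
    diabetes = random.random() < p_diabetes
    p_dementia = DEMENTIA_DIABETES if diabetes else DEMENTIA_BASE
    return diabetes, random.random() < p_dementia


cohort = [(pre, *simulate(pre))
          for pre in (random.random() < P_PREDIABETES for _ in range(N))]


def risk(rows):
    return sum(dem for _, _, dem in rows) / len(rows)


pre_rows = [r for r in cohort if r[0]]
no_pre_rows = [r for r in cohort if not r[0]]

# Crude association: prediabetes looks harmful ...
rr_crude = risk(pre_rows) / risk(no_pre_rows)

# ... but within the stratum of people who developed diabetes the
# association vanishes, because diabetes fully mediates it in this toy.
rr_in_diabetics = (risk([r for r in pre_rows if r[1]]) /
                   risk([r for r in no_pre_rows if r[1]]))

print(f"crude RR: {rr_crude:.2f}, RR within diabetics: {rr_in_diabetics:.2f}")
```

In this toy model the crude relative risk exceeds 1, while the within-diabetes relative risk hovers near 1, mirroring the reported shift from HR 1.19 to HR 1.09 after adjustment for incident diabetes.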
 

Younger age at diabetes diagnosis raises dementia risk  

Age at diabetes diagnosis made a difference in dementia risk. With adjustments for lifestyle, demographic, and clinical factors, those diagnosed with diabetes before age 60 years had a nearly threefold increased risk for dementia compared with those who never developed diabetes (HR, 2.92; P < .001).

The dementia risk was also significantly increased, although to a lesser degree, among those aged 60-69 years at diabetes diagnosis (HR, 1.73; P < .001) and age 70-79 years at diabetes diagnosis (HR, 1.23; P < .001). The relationship was not significant for those aged 80 years and older (HR, 1.13).  

“Prevention efforts in people with diabetes diagnosed younger than 65 years should be a high priority,” the authors urged.

Taken together, the data suggest that prolonged exposure to hyperglycemia plays a major role in dementia development.

“Putative mechanisms include acute and chronic hyperglycemia, glucose toxicity, insulin resistance, and microvascular dysfunction of the central nervous system. ... Glucose toxicity and microvascular dysfunction are associated with increased inflammatory and oxidative stress, leading to increased blood–brain permeability,” the researchers wrote.

Dr. Selvin said that her group is pursuing further work in this area using continuous glucose monitoring. “We plan to look at ... how glycemic control and different patterns of glucose in older adults may be linked to cognitive decline and other neurocognitive outcomes.”

The researchers reported no relevant financial relationships. Dr. Selvin has reported being on the advisory board for Diabetologia; she had no role in peer review of the manuscript.

A version of this article first appeared on Medscape.com.


FROM DIABETOLOGIA



Which interventions could lessen the burden of dementia?

Article Type
Changed

Using a microsimulation algorithm that accounts for the effect on mortality, a team from Marseille, France, has shown that interventions targeting the three main vascular risk factors for dementia – hypertension, diabetes, and physical inactivity – could significantly reduce the burden of dementia by 2040.

Among the three modifiable risk factors, the prevention of hypertension would be the most efficient, with by far the biggest impact on dementia.

Although these modeling results could appear too optimistic, since total disappearance of the risk factors was assumed, the authors say the results do show that targeted interventions for these factors could be effective in reducing the future burden of dementia.
 

Increasing prevalence

According to the World Alzheimer Report 2018, 50 million people around the world were living with dementia, a population roughly the size of South Korea or Spain. That number is likely to rise to about 152 million by 2050, similar to the population of Russia or Bangladesh, as a result of population aging.

Among modifiable risk factors, many studies support a deleterious effect of hypertension, diabetes, and physical inactivity on the risk of dementia. However, since these risk factors also have a direct impact on mortality, reducing them should increase life expectancy and could thereby increase the number of dementia cases.

The team, headed by Hélène Jacqmin-Gadda, PhD, research director at the University of Bordeaux (France), has developed a microsimulation model capable of predicting the burden of dementia while accounting for the impact on mortality. The team used this approach to assess the impact of interventions targeting these three main risk factors on the burden of dementia in France by 2040.
 

Removing risk factors

The researchers estimated the incidence of dementia for men and women using data from the 2020 PAQUID cohort, and these data were combined with the projections forecast by the French National Institute of Statistics and Economic Studies to account for mortality with and without dementia.

Without intervention, the prevalence rate of dementia in 2040 would be 9.6% among men and 14% among women older than 65 years.

These figures would decrease to 6.4% (−33%) and 10.4% (−26%), respectively, under the intervention scenario whereby the three modifiable vascular risk factors (hypertension, diabetes, and physical inactivity) would be removed simultaneously beginning in 2020. The prevalence rates are significantly reduced for men and women from age 75 years. In this scenario, life expectancy without dementia would increase by 3.4 years in men and 2.6 years in women, the result of men being more exposed to these three risk factors.

Other scenarios have estimated dementia prevalence with the disappearance of just one of these risk factors. For example, the disappearance of hypertension alone from 2020 could decrease dementia prevalence by 21% in men and 16% in women (because this risk factor is less common in women than in men) by 2040. This reduction would be associated with a decrease in the lifelong probability of dementia among men and women and a gain in life expectancy without dementia of 2 years in men and 1.4 years in women.

Among the three factors, hypertension has the largest impact on dementia burden in the French population, since this is, by far, the most prevalent (69% in men and 49% in women), while intervention targeting only diabetes or physical inactivity would lead to a reduction in dementia prevalence of only 4%-7%.
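The competing-risk logic the authors describe, in which removing a risk factor lowers dementia incidence but also extends survival, can be sketched with a miniature microsimulation. The annual hazards and effect sizes below are invented placeholders, not the PAQUID/INSEE inputs used in the paper:

```python
# Toy microsimulation sketch of the paper's approach (illustrative
# rates only): each simulated person ages year by year and may die or
# develop dementia; hypertension multiplies both hazards. Removing
# hypertension changes dementia incidence AND survival, so its net
# effect on prevalence must be simulated rather than read off a ratio.
import random

random.seed(1)


def simulate_cohort(n, hypertension_prevalence):
    """Return dementia prevalence among survivors after 20 simulated years."""
    survivors = survivors_with_dementia = 0
    for _ in range(n):
        ht = random.random() < hypertension_prevalence
        dem = dead = False
        for _year in range(20):
            # assumed annual hazards; hypertension multiplies both,
            # and dementia itself raises mortality
            p_dem = 0.010 * (1.6 if ht else 1.0)
            p_die = 0.020 * (1.4 if ht else 1.0) + (0.03 if dem else 0.0)
            if not dem and random.random() < p_dem:
                dem = True
            if random.random() < p_die:
                dead = True
                break
        if not dead:
            survivors += 1
            survivors_with_dementia += dem
    return survivors_with_dementia / survivors


baseline = simulate_cohort(100_000, hypertension_prevalence=0.6)
intervention = simulate_cohort(100_000, hypertension_prevalence=0.0)

print(f"prevalence: baseline {baseline:.3f}, no hypertension {intervention:.3f}")
```

Even though the hypertension-free cohort lives longer (more person-years at risk), dementia prevalence among survivors still falls, which is the direction of effect the authors report for France.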

The authors reported no conflicts of interest.

This article was translated from Univadis France. A version appeared on Medscape.com.


FROM THE EUROPEAN JOURNAL OF EPIDEMIOLOGY


Internet use a modifiable dementia risk factor in older adults?


Self-reported, regular Internet use, but not overuse, in older adults is linked to a lower dementia risk, new research suggests.

Investigators followed more than 18,000 older individuals and found that regular Internet use was associated with about a 50% reduction in dementia risk, compared with their counterparts who did not use the Internet regularly.

They also found that longer duration of regular Internet use was associated with a reduced risk of dementia, although excessive daily Internet usage appeared to adversely affect dementia risk.

“Online engagement can develop and maintain cognitive reserve – resiliency against physiological damage to the brain – and increased cognitive reserve can, in turn, compensate for brain aging and reduce the risk of dementia,” study investigator Gawon Cho, a doctoral candidate at New York University School of Global Public Health, said in an interview.

The study was published online in the Journal of the American Geriatrics Society.
 

Unexamined benefits

Prior research has shown that older adult Internet users have “better overall cognitive performance, verbal reasoning, and memory,” compared with nonusers, the authors note.

However, because this body of research consists of cross-sectional analyses and longitudinal studies with brief follow-up periods, the long-term cognitive benefits of Internet usage remain “unexamined.”

In addition, despite “extensive evidence of a disproportionately high burden of dementia in people of color, individuals without higher education, and adults who experienced other socioeconomic hardships, little is known about whether the Internet has exacerbated population-level disparities in cognitive health,” the investigators add.

Another question concerns whether excessive Internet usage may actually be detrimental to neurocognitive outcomes. However, “existing evidence on the adverse effects of Internet usage is concentrated in younger populations whose brains are still undergoing maturation.”

Ms. Cho said the motivation for the study was the lack of longitudinal studies on this topic, especially those with sufficient follow-up periods. In addition, she said, there is insufficient evidence about how changes in Internet usage in older age are associated with prospective dementia risk.

For the study, investigators turned to participants in the Health and Retirement Study, an ongoing longitudinal survey of a nationally representative sample of U.S.-based older adults (aged ≥ 50 years).

All participants (n = 18,154; 47.36% male; median age, 55.17 years) were dementia-free, community-dwelling older adults who completed a 2002 baseline cognitive assessment and were asked about Internet usage every 2 years thereafter.

Participants were followed from 2002 to 2018 for a maximum of 17.1 years (median, 7.9 years), which is the longest follow-up period to date. Of the total sample, 64.76% were regular Internet users.

The study’s primary outcome was incident dementia, based on performance on the Modified Telephone Interview for Cognitive Status (TICS-M), which was administered every 2 years.

The exposure examined in the study was cumulative Internet usage in late adulthood, defined as “the number of biennial waves where participants used the Internet regularly during the first three waves.”

In addition, participants were asked how many hours they spent using the Internet during the past week for activities other than viewing television shows or movies.

The researchers also investigated whether the link between Internet usage and dementia risk varied by educational attainment, race-ethnicity, sex, and generational cohort.

Covariates included baseline TICS-M score, health, age, household income, marital status, and region of residence.
 

U-shaped curve

More than half of the sample (52.96%) showed no changes in Internet use from baseline during the study period, while one-fifth (20.54%) did show changes in use.

Investigators found a robust link between Internet usage and lower dementia risk (cause-specific hazard ratio, 0.57 [95% CI, 0.46-0.71]) – a finding that remained even after adjusting for self-selection into baseline usage (csHR, 0.54 [0.41-0.72]) and signs of cognitive decline at baseline (csHR, 0.62 [0.46-0.85]).

Each additional wave of regular Internet usage was associated with a 21% decrease in the risk of dementia (95% CI, 13%-29%); each additional regular period was associated with a correspondingly reduced dementia risk (csHR, 0.80 [95% CI, 0.68-0.95]).

“The difference in risk between regular and nonregular users did not vary by educational attainment, race-ethnicity, sex, and generation,” the investigators note.

A U-shaped association was found between daily hours of online engagement and dementia risk, with the lowest risk observed in those with 0.1-2 hours of usage (compared with 0 hours of usage). The risk increased in a “monotonic fashion” after 2 hours, with 6.1-8 hours of usage showing the highest risk.

This finding was not considered statistically significant, but the “consistent U-shaped trend offers a preliminary suggestion that excessive online engagement may have adverse cognitive effects on older adults,” the investigators note.
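One simple way to formalize such a U-shaped check is to fit a quadratic to risk estimates across usage bins and inspect the curvature. The bin midpoints and relative risks below are invented placeholders chosen only to mimic the described pattern (lowest risk at light use, highest at heavy use), not values from the study:

```python
# Illustrative U-shape check: fit RR ~ a*hours^2 + b*hours + c.
# A positive quadratic coefficient (a > 0) indicates a U shape, and
# the vertex gives the usage level with the lowest fitted risk.
# Inputs are placeholder values, not the study's estimates.
import numpy as np

hours = np.array([0.0, 1.0, 3.0, 5.0, 7.0])      # bin midpoints (assumed)
rel_risk = np.array([1.0, 0.6, 0.9, 1.2, 1.6])   # placeholder relative risks

a, b, c = np.polyfit(hours, rel_risk, deg=2)     # least-squares quadratic
vertex = -b / (2 * a)                            # hours/day at minimum fitted risk

print(f"curvature a = {a:.3f}, fitted minimum at {vertex:.1f} h/day")
```

With these placeholder inputs the fitted curvature is positive and the minimum falls in the light-use range, consistent with the qualitative pattern the investigators describe.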

“Among older adults, regular Internet users may experience a lower risk of dementia compared to nonregular users, and longer periods of regular Internet usage in late adulthood may help reduce the risks of subsequent dementia incidence,” said Ms. Cho. “Nonetheless, using the Internet excessively daily may negatively affect the risk of dementia in older adults.”
 

Bidirectional relationship?

Commenting for this article, Claire Sexton, DPhil, Alzheimer’s Association senior director of scientific programs and outreach, noted that some risk factors for Alzheimer’s or other dementias can’t be changed, while others are modifiable, “either at a personal or a population level.”

She called the current research “important” because it “identifies a potentially modifiable factor that may influence dementia risk.”

However, cautioned Dr. Sexton, who was not involved with the study, the findings cannot establish cause and effect. In fact, the relationship may be bidirectional.

“It may be that regular Internet usage is associated with increased cognitive stimulation, and in turn reduced risk of dementia; or it may be that individuals with lower risk of dementia are more likely to engage in regular Internet usage,” she said. Thus, “interventional studies are able to shed more light on causation.”

The Health and Retirement Study is sponsored by the National Institute on Aging and is conducted by the University of Michigan, Ann Arbor. Ms. Cho, her coauthors, and Dr. Sexton have disclosed no relevant financial relationships.

A version of this article originally appeared on Medscape.com.


This finding was not considered statistically significant, but the “consistent U-shaped trend offers a preliminary suggestion that excessive online engagement may have adverse cognitive effects on older adults,” the investigators note.

“Among older adults, regular Internet users may experience a lower risk of dementia compared to nonregular users, and longer periods of regular Internet usage in late adulthood may help reduce the risks of subsequent dementia incidence,” said Ms. Cho. “Nonetheless, using the Internet excessively daily may negatively affect the risk of dementia in older adults.”
 

Bidirectional relationship?

Commenting for this article, Claire Sexton, DPhil, Alzheimer’s Association senior director of scientific programs and outreach, noted that some risk factors for Alzheimer’s or other dementias can’t be changed, while others are modifiable, “either at a personal or a population level.”

She called the current research “important” because it “identifies a potentially modifiable factor that may influence dementia risk.”

However, cautioned Dr. Sexton, who was not involved with the study, the findings cannot establish cause and effect. In fact, the relationship may be bidirectional.

“It may be that regular Internet usage is associated with increased cognitive stimulation, and in turn reduced risk of dementia; or it may be that individuals with lower risk of dementia are more likely to engage in regular Internet usage,” she said. Thus, “interventional studies are able to shed more light on causation.”

The Health and Retirement Study is sponsored by the National Institute on Aging and is conducted by the University of Michigan, Ann Arbor. Ms. Cho, her coauthors, and Dr. Sexton have disclosed no relevant financial relationships.

A version of this article originally appeared on Medscape.com.

Regular self-reported Internet use in older adults, but not overuse, is linked to a lower dementia risk, new research suggests.

Investigators followed more than 18,000 older individuals and found that regular Internet use was associated with about a 50% reduction in dementia risk, compared with their counterparts who did not use the Internet regularly.

They also found that longer duration of regular Internet use was associated with a reduced risk of dementia, although excessive daily Internet usage appeared to adversely affect dementia risk.

“Online engagement can develop and maintain cognitive reserve – resiliency against physiological damage to the brain – and increased cognitive reserve can, in turn, compensate for brain aging and reduce the risk of dementia,” study investigator Gawon Cho, a doctoral candidate at New York University School of Global Public Health, said in an interview.

The study was published online in the Journal of the American Geriatrics Society.
 

Unexamined benefits

Prior research has shown that older adult Internet users have “better overall cognitive performance, verbal reasoning, and memory,” compared with nonusers, the authors note.

However, because this body of research consists of cross-sectional analyses and longitudinal studies with brief follow-up periods, the long-term cognitive benefits of Internet usage remain “unexamined.”

In addition, despite “extensive evidence of a disproportionately high burden of dementia in people of color, individuals without higher education, and adults who experienced other socioeconomic hardships, little is known about whether the Internet has exacerbated population-level disparities in cognitive health,” the investigators add.

Another question concerns whether excessive Internet usage may actually be detrimental to neurocognitive outcomes. However, “existing evidence on the adverse effects of Internet usage is concentrated in younger populations whose brains are still undergoing maturation.”

Ms. Cho said the motivation for the study was the lack of longitudinal studies on this topic, especially those with sufficient follow-up periods. In addition, she said, there is insufficient evidence about how changes in Internet usage in older age are associated with prospective dementia risk.

For the study, investigators turned to participants in the Health and Retirement Study, an ongoing longitudinal survey of a nationally representative sample of U.S.-based older adults (aged ≥ 50 years).

All participants (n = 18,154; 47.36% male; median age, 55.17 years) were dementia-free, community-dwelling older adults who completed a 2002 baseline cognitive assessment and were asked about Internet usage every 2 years thereafter.

Participants were followed from 2002 to 2018 for a maximum of 17.1 years (median, 7.9 years), which is the longest follow-up period to date. Of the total sample, 64.76% were regular Internet users.

The study’s primary outcome was incident dementia, based on performance on the Modified Telephone Interview for Cognitive Status (TICS-M), which was administered every 2 years.

The exposure examined in the study was cumulative Internet usage in late adulthood, defined as “the number of biennial waves where participants used the Internet regularly during the first three waves.”

In addition, participants were asked how many hours they spent using the Internet during the past week for activities other than viewing television shows or movies.

The researchers also investigated whether the link between Internet usage and dementia risk varied by educational attainment, race-ethnicity, sex, and generational cohort.

Covariates included baseline TICS-M score, health, age, household income, marital status, and region of residence.

U-shaped curve

More than half of the sample (52.96%) showed no change in Internet use from baseline during the study period, while about one-fifth (20.54%) did change their use.

Investigators found a robust link between Internet usage and lower dementia risk (cause-specific hazard ratio, 0.57 [95% CI, 0.46-0.71]) – a finding that remained even after adjusting for self-selection into baseline usage (csHR, 0.54 [0.41-0.72]) and signs of cognitive decline at baseline (csHR, 0.62 [0.46-0.85]).

Each additional wave of regular Internet usage was associated with a 21% decrease in the risk of dementia (95% CI, 13%-29%); that is, each additional period of regular use was associated with reduced dementia risk (csHR, 0.80 [95% CI, 0.68-0.95]).

“The difference in risk between regular and nonregular users did not vary by educational attainment, race-ethnicity, sex, and generation,” the investigators note.

A U-shaped association was found between daily hours of online engagement and dementia risk, wherein the lowest risk was observed in those with 0.1-2 hours of usage (compared with 0 hours of usage). The risk increased in a “monotonic fashion” after 2 hours, with 6.1-8 hours of usage carrying the highest risk.

This finding was not considered statistically significant, but the “consistent U-shaped trend offers a preliminary suggestion that excessive online engagement may have adverse cognitive effects on older adults,” the investigators note.

“Among older adults, regular Internet users may experience a lower risk of dementia compared to nonregular users, and longer periods of regular Internet usage in late adulthood may help reduce the risks of subsequent dementia incidence,” said Ms. Cho. “Nonetheless, using the Internet excessively daily may negatively affect the risk of dementia in older adults.”
 

Bidirectional relationship?

Commenting for this article, Claire Sexton, DPhil, Alzheimer’s Association senior director of scientific programs and outreach, noted that some risk factors for Alzheimer’s or other dementias can’t be changed, while others are modifiable, “either at a personal or a population level.”

She called the current research “important” because it “identifies a potentially modifiable factor that may influence dementia risk.”

However, cautioned Dr. Sexton, who was not involved with the study, the findings cannot establish cause and effect. In fact, the relationship may be bidirectional.

“It may be that regular Internet usage is associated with increased cognitive stimulation, and in turn reduced risk of dementia; or it may be that individuals with lower risk of dementia are more likely to engage in regular Internet usage,” she said. Thus, “interventional studies are able to shed more light on causation.”

The Health and Retirement Study is sponsored by the National Institute on Aging and is conducted by the University of Michigan, Ann Arbor. Ms. Cho, her coauthors, and Dr. Sexton have disclosed no relevant financial relationships.

A version of this article originally appeared on Medscape.com.

FROM THE JOURNAL OF THE AMERICAN GERIATRICS SOCIETY

Worsening cognitive impairments

The history and findings in this case are suggestive of Alzheimer's disease (AD). 

AD is the most common type of dementia. It is characterized by cognitive and behavioral impairment that significantly interferes with a patient's social and occupational functioning. The predominant pathogenesis hypothesis holds that AD is largely caused by the accumulation of insoluble amyloid beta deposits and neurofibrillary tangles composed of hyperphosphorylated tau proteins in the neocortex, hippocampus, and amygdala, along with significant loss of neurons and synapses, which leads to brain atrophy. Estimates suggest that approximately 6.2 million Americans ≥ 65 years of age have AD and that by 2060 this number may increase to 13.8 million, the result of an aging population and the lack of effective prevention and treatment strategies. AD is a chronic disease that confers tremendous emotional and economic burdens on individuals, families, and society.

Insidiously progressive memory loss is commonly seen in patients presenting with AD. As the disease progresses over the course of several years, other areas of cognition are impaired. Patients may develop language disorders (eg, anomic aphasia or anomia) and impairment in visuospatial skills and executive functions. Slowly progressive behavioral changes are also observed in many individuals with AD.

Criteria for the clinical diagnosis of AD (eg, insidious onset of cognitive impairment, clear history of worsening symptoms) have been developed and are frequently employed. Among individuals who meet the core clinical criteria for probable AD dementia, biomarker evidence can increase the certainty that AD underlies the clinical dementia syndrome. Several cerebrospinal fluid and blood biomarkers have shown excellent diagnostic ability for AD by identifying tau pathology and cerebral amyloid beta. Neuroimaging is becoming increasingly important for identifying the underlying causes of cognitive impairment. MRI is currently considered the preferred neuroimaging modality for AD because it enables accurate measurement of the three-dimensional volume of brain structures, particularly the hippocampus and related regions. CT may be used when MRI is not possible, such as in a patient with a pacemaker.

PET is increasingly being used as a noninvasive method for depicting tau pathology deposition and distribution in patients with cognitive impairment. In 2020, the US Food and Drug Administration approved the first tau PET tracer, 18F-flortaucipir, a significant achievement in improving AD diagnosis. 

Until recently, the only available therapies for AD were symptomatic. Cholinesterase inhibitors and an N-methyl-D-aspartate (NMDA) receptor antagonist remain the standard medical treatment for AD. Newly approved antiamyloid therapies are also available for patients with mild cognitive impairment or mild dementia. These include aducanumab, a first-in-class amyloid beta–directed antibody approved in 2021, and lecanemab, another amyloid beta–directed antibody approved in 2023. Both are recommended for patients in the mild cognitive impairment or mild dementia stage of disease, the population in which the safety and efficacy of these agents were demonstrated in clinical trials.

Psychotropic agents are often used to treat the secondary symptoms of AD, such as depression, agitation, aggression, hallucinations, delusions, and/or sleep disorders. Behavioral interventions, including patient-centered approaches and caregiver training, may also be beneficial for managing the cognitive and behavioral manifestations of AD. These modalities are often used in combination with pharmacologic interventions, such as anxiolytics for anxiety and agitation, neuroleptics for delusions or hallucinations, and antidepressants or mood stabilizers for mood disorders and specific manifestations (eg, episodes of anger or rage). Regular physical activity and exercise are also emerging as a potential strategy for delaying AD progression and possibly conferring a protective effect on brain health.

 

Jasvinder Chawla, MD, Professor of Neurology, Loyola University Medical Center, Maywood; Director, Clinical Neurophysiology Lab, Department of Neurology, Hines VA Hospital, Hines, IL.

Jasvinder Chawla, MD, has disclosed no relevant financial relationships.


Image Quizzes are fictional or fictionalized clinical scenarios intended to provide evidence-based educational takeaways.

Reviewed by Jasvinder Chawla, MD



A 73-year-old male restaurant manager presents with concerns of progressively worsening cognitive impairment. The patient's symptoms began approximately 2 years ago. At that time, he attributed them to normal aging. Recently, however, he has begun to have increasing difficulties at work. On several occasions, he has forgotten to place important supply orders and has made errors with staff scheduling. His wife reports that he frequently misplaces items at home, such as his cell phone and car keys, and has been experiencing noticeable deficits with his short-term memory. In addition, he has been "unlike himself" for quite some time, with uncharacteristic episodes of depression, anxiety, and emotional lability. The patient's past medical history is significant for mild obesity, hypertension, and dyslipidemia. There is no history of neurotoxic exposure, head injuries, strokes, or seizures. His family history is negative for dementia. Current medications include rosuvastatin 40 mg/d and metoprolol 100 mg/d. His current height and weight are 5 ft 11 in and 223 lb (BMI 31.1).

No abnormalities are noted on physical exam; the patient's blood pressure, pulse oximetry, and heart rate are within normal ranges. Laboratory tests are unremarkable except for elevated fasting blood glucose (119 mg/dL) and A1c (6.3%). The patient scores 19 on the Montreal Cognitive Assessment test. His clinician orders MRI scanning, which reveals generalized atrophy of brain tissue and an accentuated loss of tissue involving the temporal lobes.
