Cognitive decline risk in adult childhood cancer survivors
Among more than 2,300 adult survivors of childhood cancer and their siblings, who served as controls, new-onset memory impairment emerged more often in survivors decades later.
The increased risk was associated with the cancer treatment that was provided as well as modifiable health behaviors and chronic health conditions.
Even 35 years after being diagnosed, cancer survivors who never received chemotherapies or radiation therapies known to damage the brain reported far greater memory impairment than did their siblings, first author Nicholas Phillips, MD, told this news organization.
What the findings suggest is that “we need to educate oncologists and primary care providers on the risks our survivors face long after completion of therapy,” said Dr. Phillips, of the epidemiology and cancer control department at St. Jude Children’s Research Hospital, Memphis, Tenn.
The study was published online in JAMA Network Open.
Cancer survivors face an elevated risk for severe neurocognitive effects that can emerge 5-10 years following their diagnosis and treatment. However, it’s unclear whether new-onset neurocognitive problems can still develop a decade or more following diagnosis.
Over a long-term follow-up, Dr. Phillips and colleagues explored this question in 2,375 adult survivors of childhood cancer from the Childhood Cancer Survivor Study and 232 of their siblings.
Among the cancer cohort, 1,316 patients were survivors of acute lymphoblastic leukemia (ALL), 488 were survivors of central nervous system (CNS) tumors, and 571 had survived Hodgkin lymphoma.
The researchers determined the prevalence of new-onset neurocognitive impairment between baseline (23 years after diagnosis) and follow-up (35 years after diagnosis). New-onset neurocognitive impairment – present at follow-up but not at baseline – was defined as having a score in the worst 10% of the sibling cohort.
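For readers who want to see that scoring rule concretely, here is a minimal sketch of how a sibling-referenced 10th-percentile cutoff could flag new-onset impairment. The data and column names are hypothetical stand-ins, not the study's actual instruments, and it assumes lower scores mean worse memory.

```python
import numpy as np
import pandas as pd

# Hypothetical cohorts: one row per person, with memory scores at baseline
# (23 years after diagnosis) and follow-up (35 years). Synthetic data only.
rng = np.random.default_rng(0)
siblings = pd.DataFrame({"mem_base": rng.normal(50, 10, 232),
                         "mem_fu": rng.normal(49, 10, 232)})
survivors = pd.DataFrame({"mem_base": rng.normal(48, 10, 2375),
                          "mem_fu": rng.normal(44, 12, 2375)})

# Impairment threshold: the worst 10% of the sibling distribution
# (lower = worse is an assumption of this illustration).
cut_base = np.percentile(siblings["mem_base"], 10)
cut_fu = np.percentile(siblings["mem_fu"], 10)

# New-onset impairment: impaired at follow-up but not at baseline.
new_onset = (survivors["mem_fu"] <= cut_fu) & (survivors["mem_base"] > cut_base)
print(f"New-onset memory impairment: {new_onset.mean():.1%}")
```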
A higher proportion of survivors had new-onset memory impairment at follow-up compared with siblings. Specifically, about 8% of siblings had new-onset memory trouble, compared with 14% of ALL survivors treated with chemotherapy only, 26% of ALL survivors treated with cranial radiation, 35% of CNS tumor survivors, and 17% of Hodgkin lymphoma survivors.
New-onset memory impairment was associated with cranial radiation among CNS tumor survivors (relative risk [RR], 1.97) and alkylator chemotherapy at or above 8,000 mg/m² among survivors of ALL who were treated without cranial radiation (RR, 2.80). The authors also found that smoking, low educational attainment, and low physical activity were associated with an elevated risk for new-onset memory impairment.
Dr. Phillips noted that current guidelines emphasize the importance of short-term monitoring of a survivor’s neurocognitive status on the basis of that person’s chemotherapy and radiation exposures.
However, “our study suggests that all survivors, regardless of their therapy, should be screened regularly for new-onset neurocognitive problems. And this screening should be done regularly for decades after diagnosis,” he said in an interview.
Dr. Phillips also noted the importance of communicating lifestyle modifications, such as not smoking and maintaining an active lifestyle.
“We need to start early and use the power of repetition when communicating with our survivors and their families,” Dr. Phillips said. “When our families and survivors hear the word ‘exercise,’ they think of gym memberships, lifting weights, and running on treadmills. But what we really want our survivors to do is stay active.”
In practice, that means about 2.5 hours a week of activities such as ballet, basketball, volleyball, bicycling, or swimming.
“And if our kids want to quit after 3 months, let them know that this is okay. They just need to replace that activity with another activity,” said Dr. Phillips. “We want them to find a fun hobby that they will enjoy that will keep them active.”
The study was supported by the National Cancer Institute. Dr. Phillips has disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM JAMA NETWORK OPEN
Muscle fat: A new risk factor for cognitive decline?
Investigators assessed muscle fat in more than 1,600 adults in their 70s and evaluated their cognitive function over a 10-year period. They found that increases in muscle adiposity from year 1 to year 6 were associated with greater cognitive decline over time, independent of total weight, other fat deposits, muscle characteristics, and traditional dementia risk factors.
The findings were similar between Black and White people and between men and women.
“Increasing adiposity – or fat deposition – in skeletal muscles predicted faster cognitive decline, irrespective of demographics or other disease, and this effect was distinct from that of other types of fat or other muscle characteristics, such as strength or mass,” study investigator Caterina Rosano, MD, MPH, professor of epidemiology at the University of Pittsburgh, said in an interview.
The study was published in the Journal of the American Geriatrics Society.
Biologically plausible
“There has been a growing recognition that overall adiposity and muscle measures, such as strength and mass, are individual indicators of future dementia risk and both strengthen the algorithms to predict cognitive decline,” said Dr. Rosano, associate director for clinical translation at the University of Pittsburgh’s Aging Institute. “However, adiposity in the muscle has not been examined.”
Some evidence supports a “biologically plausible link” between muscle adiposity and dementia risk. For example, muscle adiposity increases the risk for type 2 diabetes and hypertension, both of which are dementia risk factors.
Skeletal muscle adiposity increases with older age, even in older adults who lose weight, and is “highly prevalent” among older adults of African ancestry.
The researchers examined a large, biracial sample of older adults participating in the Health, Aging and Body Composition study, which enrolled men and women aged between 70 and 79 years. Participants were followed for an average of 9.0 ± 1.8 years.
During years 1 and 6, participants’ body composition was analyzed, including intermuscular adipose tissue (IMAT), visceral and subcutaneous adiposity, total fat mass, and muscle area.
In years 1, 3, 5, 8, and 10, participants’ cognition was measured using the modified Mini-Mental State (3MS) exam.
The main independent variable was 5-year change in thigh IMAT (year 6 minus year 1), and the main dependent variable was 3MS decline (from year 5 to year 10).
The researchers adjusted all models for traditional dementia risk factors at baseline – 3MS score, education, apo E4 allele, diabetes, hypertension, and physical activity – and also tested interactions of IMAT change with race and sex.
These models also accounted for change in muscle strength, muscle area, body weight, abdominal subcutaneous and visceral adiposity, and total body fat mass as well as cytokines related to adiposity.
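As a rough illustration of an adjusted model in that spirit, the sketch below regresses 3MS change on IMAT change with baseline covariates. All variable names and the synthetic data are illustrative assumptions; the published analysis may have used a different estimator and covariate coding.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1634  # final sample size reported in the study; data below are synthetic
df = pd.DataFrame({
    "delta_imat": rng.normal(4.85, 2.0, n),   # year 6 minus year 1 thigh IMAT, cm²
    "delta_3ms": rng.normal(-3.0, 4.0, n),    # year 5 to year 10 change in 3MS
    "baseline_3ms": rng.normal(91.6, 5.0, n),
    "age": rng.normal(73.4, 2.9, n),
    "education": rng.integers(8, 21, n),
    "apoe4": rng.integers(0, 2, n),
    "diabetes": rng.integers(0, 2, n),
    "hypertension": rng.integers(0, 2, n),
    "physical_activity": rng.normal(0.0, 1.0, n),
})

# Adjusted linear model: does IMAT change predict 3MS change?
model = smf.ols(
    "delta_3ms ~ delta_imat + baseline_3ms + age + education + apoe4"
    " + diabetes + hypertension + physical_activity",
    data=df,
).fit()
print(model.params["delta_imat"])  # negative coefficient = more IMAT, more decline
```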
‘Rich and engaging crosstalk’
The final sample included 1,634 participants (mean age, 73.38 years at baseline; 48% female; 35% Black; mean baseline 3MS score, 91.6).
Thigh IMAT increased by 39.0% in all participants from year 1 to year 6, corresponding to an increase of 4.85 cm², or 0.97 cm²/year. Over the same period, muscle strength decreased by 14.0% (P < .05), although thigh muscle area remained stable, decreasing less than 0.5%.
Abdominal subcutaneous and visceral adiposity decreased by 3.92% and 6.43%, respectively (P < .05), and 3MS scores declined by 3.3% from year 5 to year 10.
Several variables were associated with 3MS decline, independent of any change in thigh IMAT: older age, less education, and having at least one copy of the apo E4 allele. These variables were included in the model of IMAT change predicting 3MS change.
A statistically significant association of IMAT increase with 3MS decline was found. The IMAT increase of 4.85 cm² corresponded to a 3MS decline of an additional 3.6 points (P < .0001) from year 5 to year 10, “indicating a clinically important change.”
The association between increasing thigh IMAT and declining 3MS “remained statistically significant” after adjusting for race, age, education, and apo E4 (P < .0001) and was independent of changes in thigh muscle area, muscle strength, and other adiposity measures.
In participants whose IMAT increased from year 1 to year 6, the mean 3MS score fell to approximately 87 points at year 10, compared with approximately 89 points in those without an IMAT increase.
Interactions by race and sex were not statistically significant (P > .08).
“Our results suggest that adiposity in muscles can predict cognitive decline, in addition to (not instead of) other traditional dementia risk factors,” said Dr. Rosano.
There is “a rich and engaging crosstalk between muscle, adipose tissue, and the brain all throughout our lives, happening through factors released in the bloodstream that can reach the brain, however, the specific identity of the factors responsible for the crosstalk of muscle adiposity and brain in older adults has not yet been discovered,” she noted.
Although muscle adiposity is “not yet routinely measured in clinical settings, it is being measured opportunistically on clinical CT scans obtained as part of routine patient care,” she added. “These CT measurements have already been validated in many studies of older adults; thus, clinicians could have access to this novel information without additional cost, time, or radiation exposure.”
Causality not proven
In a comment, Bruce Albala, PhD, professor, department of environmental and occupational health, University of California, Irvine, noted that the 3MS assessment is scored on a 100-point scale, with a score less than 78 “generally regarded as indicating cognitive impairment or approaching a dementia condition.” In the current study, the mean 3MS score of participants with increased IMAT was still “well above the dementia cut-off.”
Moreover, “even if there is a relationship or correlation between IMAT and cognition, this does not prove or even suggest causality, especially from a biological mechanistic approach,” said Dr. Albala, an adjunct professor of neurology, who was not involved in the study. “Clearly, more research is needed even to understand the relationship between these two factors.”
The study was supported by the National Institute on Aging. Dr. Rosano and coauthors and Dr. Albala declared no relevant financial relationships.
A version of this article originally appeared on Medscape.com.
FROM THE JOURNAL OF THE AMERICAN GERIATRICS SOCIETY
Blood biomarker may help predict who will develop Alzheimer’s
A blood biomarker that measures astrocyte reactivity may help determine who, among cognitively unimpaired older adults with amyloid-beta, will go on to develop Alzheimer’s disease (AD), new research suggests.
Investigators tested the blood of 1,000 cognitively healthy individuals with and without amyloid-beta pathology and found that only those with a combination of amyloid-beta burden and abnormal astrocyte activation subsequently progressed to AD.
“Our study argues that testing for the presence of brain amyloid along with blood biomarkers of astrocyte reactivity is the optimal screening to identify patients who are most at risk for progressing to Alzheimer’s disease,” senior investigator Tharick A. Pascoal, MD, PhD, associate professor of psychiatry and neurology, University of Pittsburgh, said in a release.
At this point, the biomarker is a research tool, but its application in clinical practice “is not very far away,” Dr. Pascoal told this news organization.
The study was published online in Nature Medicine.
Multicenter study
In AD, accumulation of amyloid-beta in the brain precedes tau pathology, but not everyone with amyloid-beta develops tau pathology and, consequently, clinical symptoms. Approximately 30% of older adults have brain amyloid, but many never progress to AD, said Dr. Pascoal.
This suggests other biological processes may trigger the deleterious effects of amyloid-beta in the early stages of AD.
Finding predictive markers of early amyloid-beta–related tau pathology would help identify cognitively normal individuals who are more likely to develop AD.
Post-mortem studies show astrocyte reactivity – changes in these glial cells of the brain and spinal cord in response to a brain insult – is an early AD abnormality. Other research suggests a close link between amyloid-beta, astrocyte reactivity, and tau.
In addition, evidence suggests plasma measures of glial fibrillary acidic protein (GFAP) could be a strong proxy of astrocyte reactivity in the brain. Dr. Pascoal explained that when astrocytes are changed or become bigger, more GFAP is released.
The study included 1,016 cognitively normal individuals from three centers; some had amyloid pathology, some did not. Participants’ mean age was 69.6 years, and all were deemed negative or positive for astrocyte reactivity based on plasma GFAP levels.
Results showed amyloid-beta is associated with increased plasma phosphorylated tau only in individuals positive for astrocyte reactivity. In addition, analyses using PET scans showed an AD-like pattern of tau tangle accumulation as a function of amyloid-beta exclusively in those same individuals.
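The pattern reported here – amyloid relating to plasma p-tau only in GFAP-positive individuals – is what a statistical interaction term captures. Below is a minimal, hypothetical sketch of such a test; the variable names and the simulated effect are assumptions for illustration, not the study's data or code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 1016  # matches the reported sample size; data are synthetic
gfap_pos = rng.integers(0, 2, n)    # astrocyte reactivity status from plasma GFAP
amyloid = rng.normal(1.2, 0.3, n)   # e.g., amyloid PET burden (illustrative scale)
# Simulate the reported pattern: amyloid raises p-tau only when GFAP-positive.
ptau = 1.0 + 0.8 * amyloid * gfap_pos + rng.normal(0, 0.3, n)
df = pd.DataFrame({"ptau": ptau, "amyloid": amyloid, "gfap_pos": gfap_pos})

fit = smf.ols("ptau ~ amyloid * gfap_pos", data=df).fit()
# The main effect of amyloid should be near zero; the interaction carries the effect.
print(fit.params[["amyloid", "amyloid:gfap_pos"]])
```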
Early upstream event
The findings suggest abnormal astrocyte reactivity is an early upstream event that likely occurs before tau pathology, which in turn is closely related to the development of neurodegeneration and cognitive decline.
It’s likely many types of insults or processes can lead to astrocyte reactivity, possibly including COVID, but more research in this area is needed, said Dr. Pascoal.
“Our study only looked at the consequence of having both amyloid and astrocyte reactivity; it did not elucidate what is causing either of them,” he said.
Although “we were able to have very good results” in the current study, additional studies are needed to better establish the cut-off for GFAP levels that signal progression, said Dr. Pascoal.
The effect of astrocyte reactivity on the association between amyloid-beta and tau phosphorylation was greater in men than women. Dr. Pascoal noted anti-amyloid therapies, which might be modifying the amyloid-beta-astrocyte-tau pathway, tend to have a much larger effect in men than women.
Further studies that measure amyloid-beta, tau, and GFAP biomarkers at multiple timepoints, and with long follow-up, are needed, the investigators note.
The results may have implications for clinical trials, which have increasingly focused on individuals in the earliest preclinical phases of AD. Future studies should include cognitively normal patients who are positive for both amyloid pathology and astrocyte reactivity but have no overt p-tau abnormality, said Dr. Pascoal.
This may provide a time window for interventions very early in the disease process in those at increased risk for AD-related progression.
The study did not determine whether participants with both amyloid and astrocyte reactivity will inevitably develop AD; doing so would require a longer follow-up. “Our outcome was correlation to tau in the brain, which is something we know will lead to AD,” he said.
Although the cohort represents significant socioeconomic diversity, a main limitation of the study was that subjects were mainly White, which limits the generalizability of the findings to a more diverse population.
The study received support from the National Institute on Aging; National Heart, Lung, and Blood Institute; Alzheimer’s Association; Fonds de Recherche du Québec-Santé; Canadian Consortium of Neurodegeneration in Aging; Weston Brain Institute; Colin Adair Charitable Foundation; Swedish Research Council; Wallenberg Scholar; BrightFocus Foundation; Swedish Alzheimer Foundation; Swedish Brain Foundation; Agneta Prytz-Folkes & Gösta Folkes Foundation; European Union; Swedish State Support for Clinical Research; Alzheimer Drug Discovery Foundation; Bluefield Project; the Olav Thon Foundation; the Erling-Persson Family Foundation; Stiftelsen för Gamla Tjänarinnor; Hjärnfonden, Sweden; the UK Dementia Research Institute at UCL; National Academy of Neuropsychology; Fundação de Amparo à Pesquisa do Rio Grande do Sul; Instituto Serrapilheira; and Hjärnfonden.
Dr. Pascoal reports no relevant financial relationships.
A version of this article first appeared on Medscape.com.
Gout linked to smaller brain volume, higher likelihood of neurodegenerative diseases
Patients with gout may have smaller brain volumes and higher brain iron markers than people without gout, and also be more likely to develop Parkinson’s disease, probable essential tremor, and dementia, researchers in the United Kingdom report.
“We were surprised about the regions of the brain affected by gout, several of which are important for motor function. The other intriguing finding was that the risk of dementia amongst gout patients was strongly time-dependent: highest in the first 3 years after their gout diagnosis,” lead study author Anya Topiwala, BMBCh, DPhil, said in an interview.
“Our combination of traditional and genetic approaches increases the confidence that gout is causing the brain findings,” said Dr. Topiwala, a clinical research fellow and consultant psychiatrist in the Nuffield Department of Population Health at the University of Oxford, England.
“We suggest that clinicians be vigilant for cognitive and motor problems after gout diagnosis, particularly in the early stages,” she added.
Links between gout and neurodegenerative diseases debated in earlier studies
Gout, the most common inflammatory arthritis, affects around 1%-4% of people, the authors wrote, with monosodium urate crystal deposits causing acute flares of pain and swelling in joints and periarticular tissues.
Whether and how gout may affect the brain has been debated in the literature. Gout and hyperuricemia have been linked with elevated stroke risk, and although observational studies have linked hyperuricemia with lower dementia risk, especially Alzheimer’s disease, Mendelian randomization studies have had conflicting results for Alzheimer’s disease.
A novel approach that analyzes brain structure and genetics
In a study published in Nature Communications, Dr. Topiwala and her colleagues combined observational and Mendelian randomization techniques to explore relationships between gout and neurodegenerative diseases. They analyzed data from over 303,000 volunteer participants between 40 and 69 years of age recruited between 2006 and 2010 to contribute their detailed genetic and health information to the U.K. Biobank, a large-scale biomedical database and research resource.
Patients with gout tended to be older and male. At baseline, all participants’ serum urate levels were measured, and 30.8% of patients with gout reported that they currently used urate-lowering therapy.
MRI shows brain changes in patients with gout
In what the authors said is the first investigation of neuroimaging markers in patients with gout, they compared gray matter volumes between the 1,165 participants with gout and the 32,202 controls without gout who had MRI data.
They found no marked sex differences in associations. Urate was inversely linked with global brain volume and with gray and white matter volumes, and gout appeared to age global gray matter by 2 years.
Patients with gout and higher urate showed significant differences in regional gray matter volumes, especially in the cerebellum, pons, and midbrain, as well as subcortical differences in the nucleus accumbens, putamen, and caudate. They also showed significant differences in white matter tract microstructure in the fornix.
Patients with gout were more likely to develop dementia (average hazard ratio [HR] over the study = 1.60), especially in the first 3 years after gout diagnosis (HR = 7.40). Their risk was higher for vascular dementia (average HR = 2.41) than for all-cause dementia, but gout was not significantly associated with Alzheimer’s disease (average HR = 1.62).
In asymptomatic participants, though, urate and dementia were inversely linked (HR = 0.85), with no time dependence.
Gout was linked with higher incidence of Parkinson’s disease (HR = 1.43) and probable essential tremor (HR = 6.75). In asymptomatic participants, urate and Parkinson’s disease (HR = 0.89), but not probable essential tremor, were inversely linked.
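For readers unfamiliar with how such hazard ratios are obtained, here is a minimal Cox regression sketch on synthetic data, including one simple way to probe time dependence by restricting follow-up to the first 3 years. Nothing below reproduces the study's actual data or modeling choices.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 5000
gout = rng.integers(0, 2, n)
# Synthetic follow-up: people with gout get shorter times to dementia on average.
time = rng.exponential(20.0 / (1.0 + gout), n).clip(0.1, 12.0)
event = (time < 12.0).astype(int)  # dementia observed before censoring at 12 years
df = pd.DataFrame({"gout": gout, "time": time, "dementia": event})

cph = CoxPHFitter().fit(df, duration_col="time", event_col="dementia")
print(np.exp(cph.params_["gout"]))  # hazard ratio for gout over full follow-up

# One crude probe of time dependence: censor everyone at 3 years and refit,
# so the estimate reflects only the first 3 years after diagnosis.
early = df.assign(dementia=((df["time"] <= 3.0) & (df["dementia"] == 1)).astype(int),
                  time=df["time"].clip(upper=3.0))
cph_early = CoxPHFitter().fit(early, duration_col="time", event_col="dementia")
print(np.exp(cph_early.params_["gout"]))
```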
Genetic analyses reinforce MRI results
Using Mendelian randomization estimates, the authors found that genetic links generally reflected their observational findings. Both genetically predicted gout and serum urate were significantly linked with regional gray matter volumes, including cerebellar, midbrain, pons, and brainstem.
They also found significant links with higher magnetic susceptibility in the putamen and caudate, markers of higher iron. But while genetically predicted gout was significantly linked with global gray matter volume, urate was not.
In males, but not in females, urate was positively linked with alcohol intake and lower socioeconomic status.
Dr. Topiwala acknowledged several limitations to the study, writing that “the results from the volunteer participants may not apply to other populations; the cross-sectional serum urate measurements may not reflect chronic exposure; and Parkinson’s disease and essential tremor may have been diagnostically confounded.”
A novel approach that suggests further related research
Asked to comment on the study, Puja Khanna, MD, MPH, a rheumatologist and clinical associate professor of medicine at the University of Michigan, Ann Arbor, called its novel use of neuroimaging interesting.
Dr. Khanna, who was not involved in the study, said she would like to know more about the role that horizontal pleiotropy – one genetic variant having independent effects on multiple traits – plays in this disease process, and about the impact of the antioxidative properties of urate in maintaining neuroprotection.
“[The] U.K. Biobank is an excellent database to look at questions of association,” John D. FitzGerald, MD, PhD, MPH, MBA, professor and clinical chief of rheumatology at the University of California, Los Angeles, said in an interview.
“This is a fairly rigorous study,” added Dr. FitzGerald, also not involved in the study. “While it has lots of strengths,” including its large sample size and Mendelian randomization, it also has “abundant weaknesses,” he added. “It is largely cross-sectional, with single urate measurement and single brain MRI.”
“Causation is the big question,” Dr. FitzGerald noted. “Does treating gout (or urate) help prevent dementia or neurodegenerative decline?”
Early diagnosis benefits patients
Dr. Khanna and Dr. FitzGerald joined the authors in advising doctors to monitor their gout patients for cognitive and motor symptoms of neurodegenerative disease.
“It is clearly important to pay close attention to the neurologic exam and history in gout, especially because it is a disease of the aging population,” Dr. Khanna advised. “Addressing dementia when gout is diagnosed can lead to prompt mitigation strategies that can hugely impact patients.”
Dr. Topiwala and her colleagues would like to investigate why the dementia risk was time-dependent. “Is this because of the acute inflammatory response in gout, or could it just be that patients with gout visit their doctors more frequently, so any cognitive problems are picked up sooner?” she asked.
The authors, and Dr. Khanna and Dr. FitzGerald, report no relevant financial relationships. The Wellcome Trust; the U.K. Medical Research Council; the European Commission Horizon 2020 research and innovation program; the British Heart Foundation; the U.S. National Institutes of Health; the Engineering and Physical Sciences Research Council; and the National Institute for Health and Care Research funded the study.
FROM NATURE COMMUNICATIONS
Game-changing Alzheimer’s research: The latest on biomarkers
The field of neurodegenerative dementias, particularly Alzheimer’s disease (AD), has been revolutionized by the development of imaging and cerebrospinal fluid biomarkers and is on the brink of a new development: emerging plasma biomarkers. Research now recognizes the relationship between the cognitive-behavioral syndromic diagnosis (that is, the illness) and the etiologic diagnosis (the disease) – and the need to consider each separately when developing a diagnostic formulation. The National Institute on Aging and Alzheimer’s Association Research Framework uses the amyloid/tau/neurodegeneration (AT[N]) system to define AD biologically in living patients. Here is an overview of the framework, which requires biomarker evidence of amyloid plaques (amyloid positivity) and neurofibrillary tangles (tau positivity), with evidence of neurodegeneration (neurodegeneration positivity) to support the diagnosis.
The diagnostic approach for symptomatic patients
The differential diagnosis in symptomatic patients with mild cognitive impairment (MCI), mild behavioral impairment, or dementia is broad and includes multiple neurodegenerative diseases (for example, AD, frontotemporal lobar degeneration, dementia with Lewy bodies, argyrophilic grain disease, hippocampal sclerosis); vascular ischemic brain injury (for example, stroke); tumors; infectious, inflammatory, paraneoplastic, or demyelinating diseases; trauma; hydrocephalus; toxic/metabolic insults; and other rare diseases. The patient’s clinical syndrome narrows the differential diagnosis.
Once the clinician has a prioritized differential diagnosis of the brain disease or condition that is probably causing or contributing to the patient’s signs and symptoms, they can then select appropriate assessments and tests, typically starting with a laboratory panel and brain MRI. Strong evidence backed by practice recommendations also supports the use of fluorodeoxyglucose PET as a marker of functional brain abnormalities associated with dementia. Although molecular biomarkers are typically considered at a later stage of the clinical workup, the anticipated future availability of plasma biomarkers will probably change the timing of molecular biomarker assessment in patients with suspected cognitive impairment owing to AD.
Molecular PET biomarkers
Three PET tracers approved by the U.S. Food and Drug Administration for the detection of cerebral amyloid plaques have high sensitivity (89%-98%) and specificity (88%-100%), compared with autopsy, the gold standard diagnostic tool. However, these scans are costly and are not reimbursed by Medicare and Medicaid. Because all amyloid PET scans are covered by the Veterans Administration, this test is more readily accessible for patients receiving VA benefits.
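Sensitivity and specificity alone do not tell a clinician how likely a positive scan is to be a true positive; that depends on the prevalence of amyloid positivity in the population being tested. The short Python sketch below works through the standard arithmetic. The mid-range sensitivity and specificity are taken from the figures above, but the 40% prevalence is an assumption chosen only to illustrate the calculation, not a number from this article.

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Return (PPV, NPV) for a binary test at a given disease prevalence."""
    tp = sensitivity * prevalence               # true positives
    fp = (1 - specificity) * (1 - prevalence)   # false positives
    fn = (1 - sensitivity) * prevalence         # false negatives
    tn = specificity * (1 - prevalence)         # true negatives
    return tp / (tp + fp), tn / (tn + fn)

# Mid-range values from the published sensitivity (89%-98%) and
# specificity (88%-100%); the 40% prevalence is a hypothetical
# memory-clinic figure used only for illustration.
ppv, npv = predictive_values(sensitivity=0.93, specificity=0.94, prevalence=0.40)
print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")
```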
The appropriate-use criteria developed by the Amyloid Imaging Task Force recommend amyloid PET for patients with persistent or progressive MCI or dementia. In such patients, a negative amyloid PET scan would strongly weigh against AD, supporting a differential diagnosis of other etiologies. Although a positive amyloid PET scan in patients with MCI or dementia indicates the presence of amyloid plaques, it does not necessarily confirm AD as the cause. Cerebral amyloid plaques may coexist with other pathologies and increase with age, even in cognitively normal individuals.
The IDEAS study looked at the clinical utility of amyloid PET in a real-world dementia specialist setting. In the study, dementia subspecialists documented their presumed etiologic diagnosis (and level of confidence) before and after amyloid PET. Of the 11,409 patients who completed the study, the etiologic diagnosis changed from AD to non-AD in just over 25% of cases and from non-AD to AD in 10.5%. Clinical management changed in about 60% of patients with MCI and 63.5% of patients with dementia.
In May 2020, the FDA approved flortaucipir F-18, the first diagnostic tau radiotracer for use with PET to estimate the density and distribution of aggregated tau neurofibrillary tangles in adults with cognitive impairment undergoing evaluation for AD. Regulatory approval of flortaucipir F-18 was based on findings from two clinical trials of terminally ill patients who were followed to autopsy. The studies included patients with a spectrum of clinically diagnosed dementias and those with normal cognition. The primary outcome of the studies was accurate visual interpretation of the images in detecting advanced AD tau neurofibrillary tangle pathology (Braak stage V or VI tau pathology). Sensitivity of five trained readers ranged from 68% to 86%, and specificity ranged from 63% to 100%; interrater agreement was 0.87. Tau PET is not yet reimbursed and is therefore not yet readily available in the clinical setting. Moreover, appropriate-use criteria have not yet been published.
Molecular fluid biomarkers
Cerebrospinal fluid (CSF) analysis is currently the most readily available and reimbursed test to aid in diagnosing AD, with appropriate-use criteria for patients with suspected AD. CSF biomarkers for AD are useful in cognitively impaired patients when the etiologic diagnosis is equivocal, there is only an intermediate level of diagnostic confidence, or there is very high confidence in the etiologic diagnosis. Testing for CSF biomarkers is also recommended for patients at very early clinical stages (for example, early MCI) or with atypical clinical presentations.
A decreased concentration of amyloid-beta 42 in CSF is a marker of amyloid neuritic plaques in the brain. An increased concentration of total tau in CSF reflects injury to neurons, and an increased concentration of specific isoforms of hyperphosphorylated tau reflects neurofibrillary tangles. Presently, the ratios of t-tau to amyloid-beta 42, amyloid-beta 42 to amyloid-beta 40, and phosphorylated-tau 181 to amyloid-beta 42 are the best-performing markers of AD neuropathologic changes and are more accurate than assessing individual biomarkers. These CSF biomarkers of AD have been validated against autopsy, and ratio values of CSF amyloid-beta 42 have been further validated against amyloid PET, with overall sensitivity and specificity of approximately 90% and 84%, respectively.
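To make the ratio logic concrete, here is a small, purely illustrative Python sketch. The concentrations and the cutoff are hypothetical; real assays use platform-specific, clinically validated thresholds, and none are given in this article.

```python
# Purely illustrative: computing a p-tau181/amyloid-beta-42 ratio.
# Both input concentrations and the cutoff below are hypothetical;
# real laboratories apply assay-specific, validated thresholds.
HYPOTHETICAL_CUTOFF = 0.025

def ptau_abeta42_ratio(ptau181_pg_ml: float, abeta42_pg_ml: float) -> float:
    """Ratio of phosphorylated-tau 181 to amyloid-beta 42 (both in pg/mL)."""
    return ptau181_pg_ml / abeta42_pg_ml

ratio = ptau_abeta42_ratio(ptau181_pg_ml=22.0, abeta42_pg_ml=550.0)
flagged = ratio > HYPOTHETICAL_CUTOFF
print(f"p-tau181/Abeta42 = {ratio:.3f}; above illustrative cutoff: {flagged}")
```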
Some of the most exciting recent advances in AD center around the measurement of these proteins and others in plasma. Appropriate-use criteria for plasma biomarkers in the evaluation of patients with cognitive impairment were published in 2022. In addition to their use in clinical trials, these criteria cautiously recommend using these biomarkers in specialized memory clinics in the diagnostic workup of patients with cognitive symptoms, along with confirmatory CSF markers or PET. Additional data are needed before plasma biomarkers of AD are used as standalone diagnostic markers or considered in the primary care setting.
We have made remarkable progress toward more precise molecular diagnosis of brain diseases underlying cognitive impairment and dementia. Ongoing efforts to evaluate the utility of these measures in clinical practice must also address the need to increase the diversity of patients and providers. Ultimately, the tremendous progress in molecular biomarkers for the diseases causing dementia will help the field work toward our common goal of early and accurate diagnosis, better management, and hope for people living with these diseases.
Bradford C. Dickerson, MD, MMSc, is a professor, department of neurology, Harvard Medical School, and director, Frontotemporal Disorders Unit, department of neurology, at Massachusetts General Hospital, both in Boston.
A version of this article first appeared on Medscape.com.
Flavanol supplement improves memory in adults with poor diets
Taking a daily flavanol supplement improves hippocampal-dependent memory in older adults who have a relatively poor diet, results of a large new study suggest.
There’s increasing evidence that certain nutrients are important for the aging body and brain, study investigator Scott Small, MD, the Boris and Rose Katz Professor of Neurology, Columbia University Vagelos College of Physicians and Surgeons, New York, told this news organization.
“With this new study, I think we can begin to say flavanols might be the first one that really is a nutrient for the aging brain.”
These findings, said Dr. Small, represent “the beginning of a new era” that will eventually lead to formal recommendations related to ideal intake of flavanols to reduce cognitive aging.
The findings were published online in the Proceedings of the National Academy of Sciences.
Better cognitive aging
Cognitive aging refers to the decline in cognitive abilities that are not thought to be caused by neurodegenerative diseases such as Alzheimer’s disease and Parkinson’s disease. Cognitive aging targets two areas of the brain: the hippocampus, which is related to memory function, and the prefrontal cortex, which is related to attention and executive function.
Previous research has linked flavanols, which are found in foods like apples, pears, berries, and cocoa beans, to improved cognitive aging. The evidence shows that consuming these nutrients might be associated with the hippocampal-dependent memory component of cognitive aging.
The new study, known as COcoa Supplement and Multivitamin Outcomes Study-Web (COSMOS-Web), included 3,562 generally healthy men and women, mean age 71 years, who were mostly well-educated and non-Hispanic/non-Latinx White individuals.
Participants were randomly assigned to receive oral flavanol-containing cocoa extract (500 mg of cocoa flavanols, including 80 mg of epicatechin) or a placebo daily.
The primary endpoint was hippocampal-dependent memory at year 1 as assessed with the ModRey, a neuropsychological test designed to measure hippocampal function.
Results showed participants in both groups had a typical learning (practice) effect, with similar improvements (d = 0.025; P = .42).
Researchers used other tests to measure cognition: the Color/Directional Flanker Task, a measure of prefrontal cortex function, and the ModBent, a measure that’s sensitive to dentate gyrus function. The flavanol intervention did not affect ModBent results or performance on the Flanker test after 1 year.
However, it was a different story for those with a poor diet at baseline. Researchers stratified participants into tertiles on the basis of diet quality as measured by the Healthy Eating Index (HEI) scores. Those in the lowest tertile had poorer baseline hippocampal-dependent memory performance but not memory related to the prefrontal cortex.
The flavanol intervention improved performance on the ModRey test, compared with placebo in participants in the low HEI tertile (overall effect: d = 0.086; P = .011) but not among those with a medium or high HEI at baseline.
“We confirmed that the flavanol intervention only benefits people who are relatively deficient at baseline,” said Dr. Small.
The correlation with hippocampal-dependent memory was confirmed in a subset of 1,361 study participants who provided a urine sample. Researchers measured urinary 5-(3′,4′-dihydroxyphenyl)-gamma-valerolactone metabolite (gVLM) concentrations, a validated biomarker of flavanol consumption.
After stratifying these results into tertiles, researchers found performance on the ModRey was significantly improved with the dietary flavanol intervention (overall effect: d = 0.141; P = .006) in the lowest gVLM tertile.
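The effect sizes reported here are Cohen’s d values: a difference in group means scaled by a pooled standard deviation. The Python sketch below shows the standard computation on simulated scores; the simulated data are chosen only to land near the reported low-tertile effect and are not COSMOS-Web data.

```python
import numpy as np

def cohens_d(treated, control):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    nt, nc = len(treated), len(control)
    pooled_var = ((nt - 1) * np.var(treated, ddof=1)
                  + (nc - 1) * np.var(control, ddof=1)) / (nt + nc - 2)
    return (np.mean(treated) - np.mean(control)) / np.sqrt(pooled_var)

# Simulated 1-year memory-score changes in standard-deviation units;
# the 0.141 shift mirrors the reported low-tertile effect, but the
# samples themselves are synthetic.
rng = np.random.default_rng(42)
flavanol = rng.normal(loc=0.141, scale=1.0, size=500)
placebo = rng.normal(loc=0.0, scale=1.0, size=500)
print(f"Cohen's d = {cohens_d(flavanol, placebo):.3f}")
```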
Memory restored
When participants in the lowest tertile consumed the supplement, “their flavanol levels went back to normal, and when that happened, their memory was restored,” said Dr. Small.
It appears that there is a sort of ceiling effect to the flavanol benefits. “It seems what you need to do is normalize your flavanol levels; if you go above normal, there was no evidence that your memory keeps on getting better,” said Dr. Small.
The study included only older adults, so it’s unclear what the impact of flavanol supplementation is in younger adults. But cognitive aging “begins its slippery slide” in the 40s, said Dr. Small. “If this is truly a nutrient that is taken to prevent that slide from happening, it might be beneficial to start in our 40s.”
He recognized that the effect size is not large but said this is “very dependent” on baseline factors and most study participants had a rather healthy diet. “None of our participants were really highly deficient” in flavanols, he said.
“To see a stronger effect size, we need to do another study where we recruit people who are very low, truly deficient, in flavanols, and then see what happens.”
Showing that flavanols are linked to the hippocampal and not to the prefrontal component of cognitive aging “speaks to the mechanism,” said Dr. Small.
Though the exact mechanism linking flavanols with enhanced memory isn’t clear, there are some clues; for example, research suggests cognitive aging affects the dentate gyrus, a subregion of the hippocampus.
The flavanol supplements were well tolerated. “I can say with close to certainty that this is very safe,” said Dr. Small, adding the flavanols have now been used in numerous studies.
The findings suggest flavanol consumption might be part of future dietary guidelines. “I suspect that once there is sufficient evidence, flavanols will be part of the dietary recommendations for healthy aging,” said Dr. Small.
A word of caution
Heather M. Snyder, PhD, vice president of medical and scientific relations, Alzheimer’s Association, said that though science suggests a balanced diet is good for overall brain health, no single food, beverage, ingredient, vitamin, or supplement has yet been proven to prevent dementia, treat or cure Alzheimer’s, or benefit cognitive function or brain health.
Experts agree the best source of vitamins and other nutrients is from whole foods as part of a balanced diet. “We recognize that, for a variety of reasons, this may not always be possible,” said Dr. Snyder.
However, she noted, dietary supplements are not subject to the same rigorous review and regulation process as medications.
“The Alzheimer’s Association strongly encourages individuals to have conversations with their physicians about all medications and dietary supplements they are currently taking or interested in starting.”
COSMOS is supported by an investigator-initiated grant from Mars Edge, a segment of Mars, a company engaged in flavanol research and flavanol-related commercial activities, which included infrastructure support and the donation of study pills and packaging. Dr. Small reports receiving an unrestricted research grant from Mars.
A version of this article first appeared on Medscape.com.
Taking a daily flavanol supplement improves hippocampal-dependent memory in older adults who have a relatively poor diet, results of a large new study suggest.
There’s increasing evidence that certain nutrients are important for the aging body and brain, study investigator Scott Small, MD, the Boris and Rose Katz Professor of Neurology, Columbia University Vagelos College of Physicians and Surgeons, New York, told this news organization.
“With this new study, I think we can begin to say flavanols might be the first one that really is a nutrient for the aging brain.”
These findings, said Dr. Small, represent “the beginning of a new era” that will eventually lead to formal recommendations” related to ideal intake of flavanols to reduce cognitive aging.
The findings were published online in the Proceedings of the National Academy of Science.
Better cognitive aging
Cognitive aging refers to the decline in cognitive abilities that are not thought to be caused by neurodegenerative diseases such as Alzheimer’s disease and Parkinson’s disease. Cognitive aging targets two areas of the brain: the hippocampus, which is related to memory function, and the prefrontal cortex, which is related to attention and executive function.
Previous research has linked flavanols, which are found in foods like apples, pears, berries, and cocoa beans, to improved cognitive aging. The evidence shows that consuming these nutrients might be associated with the hippocampal-dependent memory component of cognitive aging.
The new study, known as COcoa Supplement and Multivitamin Outcomes Study-Web (COSMOS-Web), included 3,562 generally healthy men and women, mean age 71 years, who were mostly well-educated and non-Hispanic/non-Latinx White individuals.
Participants were randomly assigned to receive oral flavanol-containing cocoa extract (500 mg of cocoa flavanols, including 80 mg of epicatechin) or a placebo daily.
The primary endpoint was hippocampal-dependent memory at year 1 as assessed with the ModRey, a neuropsychological test designed to measure hippocampal function.
Results showed participants in both groups had a typical learning (practice) effect, with similar improvements (d = 0.025; P = .42).
Researchers used other tests to measure cognition: the Color/Directional Flanker Task, a measure of prefrontal cortex function, and the ModBent, a measure that’s sensitive to dentate gyrus function. The flavanol intervention did not affect ModBent results or performance on the Flanker test after 1 year.
However, it was a different story for those with a poor diet at baseline. Researchers stratified participants into tertiles on the basis of diet quality as measured by the Healthy Eating Index (HEI) scores. Those in the lowest tertile had poorer baseline hippocampal-dependent memory performance but not memory related to the prefrontal cortex.
The flavanol intervention improved performance on the ModRey test, compared with placebo in participants in the low HEI tertile (overall effect: d = 0.086; P = .011) but not among those with a medium or high HEI at baseline.
“We confirmed that the flavanol intervention only benefits people who are relatively deficient at baseline,” said Dr. Small.
The correlation with hippocampal-dependent memory was confirmed in a subset of 1,361 study participants who provided a urine sample. Researchers measured urinary 5-(3′,4′-dihydroxyphenyl)-gamma-valerolactone metabolite (gVLM) concentrations, a validated biomarker of flavanol consumption.
After stratifying these results into tertiles, researchers found performance on the ModRey was significantly improved with the dietary flavanol intervention (overall effect: d = 0.141; P = .006) in the lowest gVLM tertile.
Memory restored
When participants in the lowest tertile consumed the supplement, “their flavanol levels went back to normal, and when that happened, their memory was restored,” said Dr. Small.
It appears that there is a sort of ceiling effect to the flavanol benefits. “It seems what you need to do is normalize your flavanol levels; if you go above normal, there was no evidence that your memory keeps on getting better,” said Dr. Small.
The study included only older adults, so it’s unclear what the impact of flavanol supplementation is in younger adults. But cognitive aging “begins its slippery side” in the 40s, said Dr. Small. “If this is truly a nutrient that is taken to prevent that slide from happening, it might be beneficial to start in our 40s.”
He recognized that the effect size is not large but said this is “very dependent” on baseline factors and most study participants had a rather healthy diet. “None of our participants were really highly deficient” in flavanols, he said.
“To see a stronger effect size, we need to do another study where we recruit people who are very low, truly deficient, in flavanols, and then see what happens.”
Showing that flavanols are linked to the hippocampal and not to the prefrontal component of cognitive aging “speaks to the mechanism,” said Dr. Small.
Though the exact mechanism linking flavanols with enhanced memory isn’t clear, there are some clues; for example, research suggests cognitive aging affects the dentate gyrus, a subregion of the hippocampus.
The flavanol supplements were well tolerated. “I can say with close to certainty that this is very safe,” said Dr. Small, adding the flavanols have now been used in numerous studies.
The findings suggest flavanol consumption might be part of future dietary guidelines. “I suspect that once there is sufficient evidence, flavanols will be part of the dietary recommendations for healthy aging,” said Dr. Small.
A word of caution
Heather M. Snyder, PhD, vice president of medical and scientific relations, Alzheimer’s Association, said that though science suggests a balanced diet is good for overall brain health, no single food, beverage, ingredient, vitamin, or supplement has yet been proven to prevent dementia, treat or cure Alzheimer’s, or benefit cognitive function or brain health.
Experts agree the best source of vitamins and other nutrients is from whole foods as part of a balanced diet. “We recognize that, for a variety of reasons, this may not always be possible,” said Dr. Snyder.
However, she noted, dietary supplements are not subject to the same rigorous review and regulation process as medications.
“The Alzheimer’s Association strongly encourages individuals to have conversations with their physicians about all medications and dietary supplements they are currently taking or interested in starting.”
COSMOS is supported by an investigator-initiated grant from Mars Edge, a segment of Mars, company engaged in flavanol research and flavanol-related commercial activities, which included infrastructure support and the donation of study pills and packaging. Small reports receiving an unrestricted research grant from Mars.
A version of this article first appeared on Medscape.com.
Taking a daily flavanol supplement improves hippocampal-dependent memory in older adults who have a relatively poor diet, results of a large new study suggest.
There’s increasing evidence that certain nutrients are important for the aging body and brain, study investigator Scott Small, MD, the Boris and Rose Katz Professor of Neurology, Columbia University Vagelos College of Physicians and Surgeons, New York, told this news organization.
“With this new study, I think we can begin to say flavanols might be the first one that really is a nutrient for the aging brain.”
These findings, said Dr. Small, represent “the beginning of a new era” that will eventually lead to formal recommendations” related to ideal intake of flavanols to reduce cognitive aging.
The findings were published online in the Proceedings of the National Academy of Science.
Better cognitive aging
Cognitive aging refers to the decline in cognitive abilities that are not thought to be caused by neurodegenerative diseases such as Alzheimer’s disease and Parkinson’s disease. Cognitive aging targets two areas of the brain: the hippocampus, which is related to memory function, and the prefrontal cortex, which is related to attention and executive function.
Previous research has linked flavanols, which are found in foods like apples, pears, berries, and cocoa beans, to improved cognitive aging. The evidence shows that consuming these nutrients might be associated with the hippocampal-dependent memory component of cognitive aging.
The new study, known as COcoa Supplement and Multivitamin Outcomes Study-Web (COSMOS-Web), included 3,562 generally healthy men and women, mean age 71 years, who were mostly well-educated and non-Hispanic/non-Latinx White individuals.
Participants were randomly assigned to receive oral flavanol-containing cocoa extract (500 mg of cocoa flavanols, including 80 mg of epicatechin) or a placebo daily.
The primary endpoint was hippocampal-dependent memory at year 1 as assessed with the ModRey, a neuropsychological test designed to measure hippocampal function.
Results showed participants in both groups had a typical learning (practice) effect, with similar improvements (d = 0.025; P = .42).
Researchers used other tests to measure cognition: the Color/Directional Flanker Task, a measure of prefrontal cortex function, and the ModBent, a measure that’s sensitive to dentate gyrus function. The flavanol intervention did not affect ModBent results or performance on the Flanker test after 1 year.
However, it was a different story for those with a poor diet at baseline. Researchers stratified participants into tertiles on the basis of diet quality as measured by the Healthy Eating Index (HEI) scores. Those in the lowest tertile had poorer baseline hippocampal-dependent memory performance but not memory related to the prefrontal cortex.
The flavanol intervention improved performance on the ModRey test, compared with placebo in participants in the low HEI tertile (overall effect: d = 0.086; P = .011) but not among those with a medium or high HEI at baseline.
“We confirmed that the flavanol intervention only benefits people who are relatively deficient at baseline,” said Dr. Small.
The correlation with hippocampal-dependent memory was confirmed in a subset of 1,361 study participants who provided a urine sample. Researchers measured urinary 5-(3′,4′-dihydroxyphenyl)-gamma-valerolactone metabolite (gVLM) concentrations, a validated biomarker of flavanol consumption.
After stratifying these results into tertiles, researchers found performance on the ModRey was significantly improved with the dietary flavanol intervention (overall effect: d = 0.141; P = .006) in the lowest gVLM tertile.
Memory restored
When participants in the lowest tertile consumed the supplement, “their flavanol levels went back to normal, and when that happened, their memory was restored,” said Dr. Small.
It appears that there is a sort of ceiling effect to the flavanol benefits. “It seems what you need to do is normalize your flavanol levels; if you go above normal, there was no evidence that your memory keeps on getting better,” said Dr. Small.
The study included only older adults, so it’s unclear what the impact of flavanol supplementation is in younger adults. But cognitive aging “begins its slippery side” in the 40s, said Dr. Small. “If this is truly a nutrient that is taken to prevent that slide from happening, it might be beneficial to start in our 40s.”
He recognized that the effect size is not large but said this is “very dependent” on baseline factors and most study participants had a rather healthy diet. “None of our participants were really highly deficient” in flavanols, he said.
“To see a stronger effect size, we need to do another study where we recruit people who are very low, truly deficient, in flavanols, and then see what happens.”
Showing that flavanols are linked to the hippocampal and not to the prefrontal component of cognitive aging “speaks to the mechanism,” said Dr. Small.
Though the exact mechanism linking flavanols with enhanced memory isn’t clear, there are some clues; for example, research suggests cognitive aging affects the dentate gyrus, a subregion of the hippocampus.
The flavanol supplements were well tolerated. “I can say with close to certainty that this is very safe,” said Dr. Small, adding the flavanols have now been used in numerous studies.
The findings suggest flavanol consumption might be part of future dietary guidelines. “I suspect that once there is sufficient evidence, flavanols will be part of the dietary recommendations for healthy aging,” said Dr. Small.
A word of caution
Heather M. Snyder, PhD, vice president of medical and scientific relations, Alzheimer’s Association, said that though science suggests a balanced diet is good for overall brain health, no single food, beverage, ingredient, vitamin, or supplement has yet been proven to prevent dementia, treat or cure Alzheimer’s, or benefit cognitive function or brain health.
Experts agree the best source of vitamins and other nutrients is from whole foods as part of a balanced diet. “We recognize that, for a variety of reasons, this may not always be possible,” said Dr. Snyder.
However, she noted, dietary supplements are not subject to the same rigorous review and regulation process as medications.
“The Alzheimer’s Association strongly encourages individuals to have conversations with their physicians about all medications and dietary supplements they are currently taking or interested in starting.”
COSMOS is supported by an investigator-initiated grant from Mars Edge, a segment of Mars, a company engaged in flavanol research and flavanol-related commercial activities; the support included infrastructure and the donation of study pills and packaging. Dr. Small reports receiving an unrestricted research grant from Mars.
A version of this article first appeared on Medscape.com.
CMS to cover Alzheimer’s drugs after traditional FDA okay
The Centers for Medicare & Medicaid Services has announced that Medicare will cover new Alzheimer’s disease drugs once they receive traditional approval from the Food and Drug Administration.
The one caveat is that CMS will require physicians to participate in registries that collect evidence about how these drugs work in the real world.
Physicians will be able to submit this evidence through a nationwide, CMS-facilitated portal that will be available when any product gains traditional approval and will collect information via an easy-to-use format.
“If the FDA grants traditional approval, then Medicare will cover it in appropriate settings that also support the collection of real-world information to study the usefulness of these drugs for people with Medicare,” the CMS says in a news release.
“CMS has always been committed to helping people obtain timely access to innovative treatments that meaningfully improve care and outcomes for this disease,” added CMS Administrator Chiquita Brooks-LaSure.
“If the FDA grants traditional approval, CMS is prepared to ensure anyone with Medicare Part B who meets the criteria is covered,” Ms. Brooks-LaSure explained.
The CMS says broader Medicare coverage for an Alzheimer’s drug would begin on the same day the FDA grants traditional approval. Under CMS’ current coverage policy, if the FDA grants traditional approval to other drugs in this class, they would also be eligible for broader coverage.
Currently two drugs in this class – aducanumab (Aduhelm) and lecanemab (Leqembi) – have received accelerated approval from the FDA, but no product has received traditional approval.
Lecanemab might be the first to cross the line.
On June 9, the FDA Peripheral and Central Nervous System Drugs Advisory Committee will discuss results of a confirmatory trial of lecanemab, with a potential decision on traditional approval expected shortly thereafter.
A version of this article first appeared on Medscape.com.
Younger age of type 2 diabetes onset linked to dementia risk
A younger age at the onset of type 2 diabetes is associated with an increased risk of subsequent dementia, new findings suggest.
Moreover, the new data from the prospective Atherosclerosis Risk in Communities (ARIC) cohort also suggest that the previously identified increased risk for dementia among people with prediabetes appears to be entirely explained by the subset who go on to develop type 2 diabetes.
“Our findings suggest that preventing prediabetes progression, especially in younger individuals, may be an important way to reduce the dementia burden,” wrote PhD student Jiaqi Hu of Johns Hopkins University, Baltimore, and colleagues. Their article was published online in Diabetologia.
The result builds on previous findings linking dysglycemia and cognitive decline, the study’s lead author, Elizabeth Selvin, PhD, of the Bloomberg School of Public Health at Johns Hopkins, said in an interview.
“Our prior work in the ARIC study suggests that improving glucose control could help prevent dementia in later life,” she said.
Other studies have also linked higher A1c levels and diabetes in midlife to increased rates of cognitive decline. In addition, Dr. Selvin noted, “There is growing evidence that focusing on vascular health, especially focusing on diabetes and blood pressure, in midlife can stave off dementia in later life.”
This new study is the first to examine the role of incident diabetes in the relationship between prediabetes and dementia, as well as the effect of age at diabetes onset on subsequent dementia risk.
Prediabetes linked to dementia via diabetes development
Of the 11,656 ARIC participants without diabetes at baseline during 1990-1992 (age 46-70 years), 20.0% had prediabetes (defined as A1c 5.7%-6.4% or 39-46 mmol/mol). During a median follow-up of 15.9 years, 3,143 participants developed diabetes. The proportions of patients who developed diabetes were 44.6% among those with prediabetes at baseline versus 22.5% of those without.
Dementia developed in 2,247 participants over a median follow-up of 24.7 years. The cumulative incidence of dementia was 23.9% among those who developed diabetes versus 20.5% among those who did not.
After adjustment for demographics and for the Alzheimer’s disease–linked apolipoprotein E (APOE) gene, prediabetes was significantly associated with incident dementia (hazard ratio [HR], 1.19). However, significance disappeared after adjustment for incident diabetes (HR, 1.09), the researchers reported.
Younger age at diabetes diagnosis raises dementia risk
Age at diabetes diagnosis made a difference in dementia risk. With adjustments for lifestyle, demographic, and clinical factors, those diagnosed with diabetes before age 60 years had a nearly threefold increased risk for dementia compared with those who never developed diabetes (HR, 2.92; P < .001).
The dementia risk was also significantly increased, although to a lesser degree, among those aged 60-69 years at diabetes diagnosis (HR, 1.73; P < .001) and those aged 70-79 years at diagnosis (HR, 1.23; P < .001). The relationship was not significant for those aged 80 years and older (HR, 1.13).
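For readers curious about the shape of this kind of analysis, the sketch below fits a Cox proportional hazards model for incident dementia in Python with the lifelines library. It uses simulated data and a single hypothetical indicator (dx_lt60, diabetes diagnosed before age 60) in place of the study’s full set of age-at-diagnosis categories and adjustment covariates; it illustrates the method, not the authors’ code.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter  # assumes the lifelines package is installed

rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "dx_lt60": rng.binomial(1, 0.10, n),   # diabetes diagnosed before age 60
    "age": rng.normal(60, 6, n),           # baseline age (adjustment covariate)
    "sex": rng.binomial(1, 0.55, n),
})

# Simulated follow-up in which early-onset diabetes triples the dementia hazard.
hazard = 0.01 * np.exp(np.log(3.0) * df["dx_lt60"] + 0.05 * (df["age"] - 60))
event_time = rng.exponential(1.0 / hazard)
df["followup_years"] = np.minimum(event_time, 25.0)   # administrative censoring at 25 years
df["dementia"] = (event_time <= 25.0).astype(int)

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="dementia")
cph.print_summary()  # exp(coef) for dx_lt60 recovers a hazard ratio near 3
```

Adding an incident-diabetes term to such a model is also how the attenuation of the prediabetes association reported above would be examined.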
“Prevention efforts in people with diabetes diagnosed younger than 65 years should be a high priority,” the authors urged.
Taken together, the data suggest that prolonged exposure to hyperglycemia plays a major role in dementia development.
“Putative mechanisms include acute and chronic hyperglycemia, glucose toxicity, insulin resistance, and microvascular dysfunction of the central nervous system. ... Glucose toxicity and microvascular dysfunction are associated with increased inflammatory and oxidative stress, leading to increased blood–brain permeability,” the researchers wrote.
Dr. Selvin said that her group is pursuing further work in this area using continuous glucose monitoring. “We plan to look at ... how glycemic control and different patterns of glucose in older adults may be linked to cognitive decline and other neurocognitive outcomes.”
The researchers reported no relevant financial relationships. Dr. Selvin has reported being on the advisory board for Diabetologia; she had no role in peer review of the manuscript.
A version of this article first appeared on Medscape.com.
FROM DIABETOLOGIA
Which interventions could lessen the burden of dementia?
Using a microsimulation algorithm that accounts for the effect on mortality, a team from Bordeaux, France, has shown that interventions targeting the three main vascular risk factors for dementia – hypertension, diabetes, and physical inactivity – could significantly reduce the burden of dementia by 2040.
Although these modeling results could appear too optimistic, since total disappearance of the risk factors was assumed, the authors say the results do show that targeted interventions for these factors could be effective in reducing the future burden of dementia.
Increasing prevalence
According to the World Alzheimer Report 2018, 50 million people around the world were living with dementia, a population roughly the size of South Korea or Spain. As a result of population aging, that number is likely to rise to about 152 million by 2050, similar to the population of Russia or Bangladesh.
Among modifiable risk factors, many studies support a deleterious effect of hypertension, diabetes, and physical inactivity on the risk of dementia. However, because these risk factors also directly affect mortality, reducing their prevalence should increase life expectancy, which could in turn increase the number of dementia cases.
The team, headed by Hélène Jacqmin-Gadda, PhD, research director at the University of Bordeaux (France), has developed a microsimulation model capable of predicting the burden of dementia while accounting for the impact on mortality. The team used this approach to assess the impact of interventions targeting these three main risk factors on the burden of dementia in France by 2040.
Removing risk factors
The researchers estimated the incidence of dementia for men and women using data from the 2020 PAQUID cohort, and these data were combined with the projections forecast by the French National Institute of Statistics and Economic Studies to account for mortality with and without dementia.
Without intervention, the prevalence rate of dementia in 2040 would be 9.6% among men and 14% among women older than 65 years.
These figures would decrease to 6.4% (−33%) and 10.4% (−26%), respectively, under the intervention scenario whereby the three modifiable vascular risk factors (hypertension, diabetes, and physical inactivity) would be removed simultaneously beginning in 2020. The prevalence rates are significantly reduced for men and women from age 75 years. In this scenario, life expectancy without dementia would increase by 3.4 years in men and 2.6 years in women, the result of men being more exposed to these three risk factors.
Other scenarios have estimated dementia prevalence with the disappearance of just one of these risk factors. For example, the disappearance of hypertension alone from 2020 could decrease dementia prevalence by 21% in men and 16% in women (because this risk factor is less common in women than in men) by 2040. This reduction would be associated with a decrease in the lifelong probability of dementia among men and women and a gain in life expectancy without dementia of 2 years in men and 1.4 years in women.
Among the three factors, hypertension has the largest impact on dementia burden in the French population, since this is, by far, the most prevalent (69% in men and 49% in women), while intervention targeting only diabetes or physical inactivity would lead to a reduction in dementia prevalence of only 4%-7%.
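To show the competing effects such a model has to capture, here is a toy microsimulation in Python: each simulated person faces annual risks of dementia and death, and the targeted risk factor raises both, so removing it lets more people survive into the ages of highest dementia risk. All rates and relative risks are made-up illustrative values; this is a sketch of the general idea, not the Bordeaux team’s algorithm.

```python
import random

def simulate(n, years, rf_prevalence, rr_dementia, rr_death, seed=42):
    """Return dementia prevalence among survivors after `years` of follow-up."""
    rng = random.Random(seed)
    cases = survivors = 0
    for _ in range(n):
        has_rf = rng.random() < rf_prevalence
        dementia = dead = False
        for _ in range(years):
            p_dem = 0.010 * (rr_dementia if has_rf else 1.0)   # annual dementia risk
            p_die = 0.020 * (rr_death if has_rf else 1.0)      # annual death risk
            if not dementia and rng.random() < p_dem:
                dementia = True
            if rng.random() < p_die:
                dead = True
                break
        if not dead:
            survivors += 1
            cases += dementia
    return cases / survivors

baseline = simulate(200_000, 20, rf_prevalence=0.6, rr_dementia=1.5, rr_death=1.3)
removed = simulate(200_000, 20, rf_prevalence=0.0, rr_dementia=1.5, rr_death=1.3)
print(f"Prevalence among survivors: {baseline:.1%} baseline, {removed:.1%} with the risk factor removed")
```

Because removing the risk factor lowers mortality as well as dementia incidence, the net change in prevalence is smaller than the incidence reduction alone would suggest, which is why accounting for mortality matters in these projections.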
The authors reported no conflicts of interest.
This article was translated from Univadis France. A version appeared on Medscape.com.
FROM THE EUROPEAN JOURNAL OF EPIDEMIOLOGY