Vision Impairment Tied to Higher Dementia Risk in Older Adults
TOPLINE:
Worse baseline distance and near visual acuity and worse contrast sensitivity are each associated with a higher risk for dementia in older adults; a decline in contrast sensitivity over time also correlates with the risk of developing dementia.
METHODOLOGY:
- Researchers conducted a longitudinal study to analyze the association of visual function with the risk for dementia in 2159 men and women (mean age, 77.9 years; 54% women) drawn from the National Health and Aging Trends Study between 2021 and 2022.
- All participants were free from dementia at baseline and underwent visual assessment while wearing their usual glasses or contact lenses.
- Distance and near visual acuity were measured in log minimum angle of resolution (logMAR) units, with higher values indicating worse acuity; contrast sensitivity was measured in log contrast sensitivity (logCS) units, with lower values indicating worse performance.
- Dementia status was determined by a medical diagnosis, a dementia score of 2 or more, or poor performance on cognitive testing.
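For readers less familiar with these units: logMAR is the base-10 logarithm of the minimum angle of resolution in arcminutes, so 0.0 logMAR corresponds to 20/20 Snellen acuity and each 0.1-logMAR step equals one line on an ETDRS chart. A minimal illustrative conversion (not part of the study's methods):

```python
def logmar_to_snellen_denominator(logmar: float) -> float:
    """Convert a logMAR value to the denominator of a 20/x Snellen fraction.

    logMAR = log10(MAR), where MAR is the minimum angle of resolution in
    arcminutes; 20/20 vision corresponds to MAR = 1 arcminute (logMAR = 0.0).
    """
    mar = 10 ** logmar   # minimum angle of resolution, in arcminutes
    return 20 * mar      # Snellen denominator on a 20-foot chart

# 0.0 logMAR -> 20/20; a 0.1-logMAR worsening (one ETDRS line) -> ~20/25
print(round(logmar_to_snellen_denominator(0.0)))  # 20
print(round(logmar_to_snellen_denominator(0.1)))  # 25
```

So the 0.1-logMAR increments analyzed in the study correspond to one-line changes on a standard eye chart.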
TAKEAWAY:
- Over the 1-year follow-up period, 192 adults (6.6%) developed dementia.
- A 0.1-logMAR worsening in baseline distance and near visual acuity increased the risk for dementia by 8% (P = .01) and 7% (P = .02), respectively.
- Each 0.1 logCS decline in baseline contrast sensitivity increased the risk for dementia by 9% (P = .003).
- A yearly decline in contrast sensitivity by 0.1 logCS increased the likelihood of dementia by 14% (P = .007).
- Changes in distance and near visual acuity over time showed no significant association with the risk for dementia (P = .58 and P = .79, respectively).
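The per-unit risk increases above compound multiplicatively if the reported figures behave as hazard ratios in a proportional hazards model, which is the usual convention for such analyses (an assumption here, since the model is not specified in the summary). For example, a 9% higher risk per 0.1-logCS decline implies roughly a 30% higher risk for a 0.3-logCS deficit:

```python
def compounded_risk(per_unit_hr: float, units: float) -> float:
    """Compound a per-unit hazard ratio over several unit decrements,
    as in a proportional hazards model: HR_total = HR_per_unit ** units."""
    return per_unit_hr ** units

# Each 0.1-logCS decline -> HR 1.09; a 0.3-logCS deficit compounds to ~1.30
print(f"{compounded_risk(1.09, 3):.2f}")  # 1.30
```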
IN PRACTICE:
“Visual function, especially contrast sensitivity, might be a risk factor for developing dementia,” the authors wrote. “Early vision screening may help identify adults at higher risk of dementia, allowing for timely interventions.”
SOURCE:
The study was led by Louay Almidani, MD, MSc, of the Wilmer Eye Institute at the Johns Hopkins University School of Medicine, in Baltimore, and was published online in the American Journal of Ophthalmology.
LIMITATIONS:
The study had a limited follow-up period of 1 year and may not have captured the long-term association between visual impairment and the risk for dementia. Moreover, the researchers did not consider other visual function measures such as depth perception and visual field, which might have affected the results.
DISCLOSURES:
The study did not have any funding source. The authors declared no conflicts of interest.
A version of this article appeared on Medscape.com.
More and More Are Living With Type 1 Diabetes Into Old Age
TOPLINE:
Mortality and disability-adjusted life years (DALYs) among people with type 1 diabetes (T1D) aged ≥ 65 years dropped significantly from 1990 to 2019. Both were lower among women and among people living in areas with a higher sociodemographic index.
METHODOLOGY:
- Researchers conducted a population-based study of adults aged ≥ 65 years across 21 regions and 204 countries and territories, covering 1990-2019.
TAKEAWAY:
- Globally, the prevalence of T1D among people aged ≥ 65 years increased by 180% between 1990 and 2019, from 1.3 million to 3.7 million.
- The proportion of older people with T1D has consistently trended upward, from 12% of all people with T1D in 1990 to 17% in 2019.
- Age-standardized mortality from T1D among this age group significantly decreased by 25%, from 4.7/100,000 population in 1990 to 3.5/100,000 in 2019.
- Age-standardized increases in T1D prevalence occurred in both men and women worldwide, with a more rapid increase among men (average annual percent change, 1.00% vs 0.74%).
- Globally, T1D prevalence at least tripled in every age subgroup of those aged ≥ 65 years and rose fivefold to sixfold in the oldest groups (from 0.02 million to 0.11 million for ages 90-94 years; from 0.005 million to 0.03 million for ages ≥ 95 years).
- No decreases occurred in T1D prevalence among those aged ≥ 65 years in any of the 21 global regions.
- Three primary risk factors associated with DALYs for T1D among people aged ≥ 65 years were high fasting plasma glucose levels, low temperature, and high temperature, accounting for 103 DALYs per 100,000 people, 3/100,000 people, and 1/100,000 people, respectively, in 2019.
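The headline percentages follow directly from the absolute counts; the rise from 1.3 million to 3.7 million is a (3.7 − 1.3)/1.3 ≈ 185% increase, consistent with the reported ~180% after rounding of the underlying figures. A small sketch of that arithmetic:

```python
def percent_increase(old: float, new: float) -> float:
    """Relative increase from old to new, expressed as a percentage."""
    return (new - old) / old * 100

# 1.3 million -> 3.7 million older adults with T1D, 1990 -> 2019
print(round(percent_increase(1.3, 3.7)))  # 185
```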
IN PRACTICE:
“The results suggest that T1DM is no longer a contributory factor in decreased life expectancy owing to improvements in medical care over the three decades,” the authors wrote. “Management of high fasting plasma glucose levels remains a major challenge for older people with T1D, and targeted clinical guidelines are needed.”
SOURCE:
The study was conducted by Kaijie Yang, the Department of Endocrinology and Metabolism, the Institute of Endocrinology, NHC Key Laboratory of Diagnosis and Treatment of Thyroid Diseases, First Hospital of China Medical University, Shenyang, China, and colleagues. The study was published online in the BMJ.
LIMITATIONS:
Data were extrapolated from countries that have epidemiologic data. Health information systems and reporting mechanisms vary across countries and regions. Disease burden data include a time lag. Diagnosing T1D in older people can be challenging.
DISCLOSURES:
The study was supported by the National Natural Science Foundation of China and the China Postdoctoral Science Foundation. The authors reported no additional financial relationships.
A version of this article first appeared on Medscape.com.
Intensive Lifestyle Changes May Counter Early Alzheimer’s Symptoms
Intensive lifestyle changes may improve cognition and function in patients with mild cognitive impairment (MCI) or early dementia due to Alzheimer’s disease, in what authors said is the first randomized controlled trial of intensive lifestyle modification for patients diagnosed with Alzheimer’s disease. Results could help physicians address patients at risk of Alzheimer’s disease who reject relevant testing because they believe nothing can forestall development of the disease, the authors added. The study was published online in Alzheimer’s Research & Therapy.
Although technology allows probable Alzheimer’s disease diagnosis years before clinical symptoms appear, wrote investigators led by Dean Ornish, MD, of the Preventive Medicine Research Institute in Sausalito, California, “many people do not want to know if they are likely to get Alzheimer’s disease if they do not believe they can do anything about it. If intensive lifestyle changes may cause improvement in cognition and function in MCI or early dementia due to Alzheimer’s disease, then it is reasonable to think that these lifestyle changes may also help to prevent MCI or early dementia due to Alzheimer’s disease.” As with cardiovascular disease, the authors added, preventing Alzheimer’s disease might require less intensive lifestyle modifications than treating it.
Study Methodology
Investigators randomized 26 patients with Montréal Cognitive Assessment scores of 18 or higher to an intensive intervention involving nutrition, exercise, and stress management techniques. To improve adherence, the protocol included participants’ spouses or caregivers.
Two patients, both in the treatment group, withdrew over logistical concerns.
After 20 weeks, treated patients exhibited statistically significant differences in several key measures versus a 25-patient usual-care control group. Scores that improved in the intervention group and worsened among controls included the following:
- Clinical Global Impression of Change (CGIC, P = .001)
- Clinical Dementia Rating-Global (CDR-Global, -0.04, P = .037)
- Clinical Dementia Rating Sum of Boxes (CDR-SB, +0.08, P = .032)
- Alzheimer’s Disease Assessment Scale (ADAS-Cog, -1.01, P = .053)
The validity of these changes in cognition and function, and possible biological mechanisms of improvement, were supported by statistically significant improvements in several clinically relevant biomarkers versus controls, the investigators wrote. These biomarkers included Abeta42/40 ratio, HbA1c, insulin, and glycoprotein acetylation. “This information may also help in predicting which patients are more likely to show improvements in cognition and function by making these intensive lifestyle changes,” the authors added.
In the primary analysis, the degree of lifestyle change required to stop progression of MCI ranged from 71.4% (ADAS-Cog) to 120.6% (CDR-SB). “This helps to explain why other studies of less intensive lifestyle interventions may not have been sufficient to stop deterioration or improve cognition and function,” the authors wrote. Moreover, they added, variable adherence might explain why 10 patients in the intervention group improved their CGIC scores while the rest held static or worsened.
Caveats
Alzheimer’s Association Vice President of Medical and Scientific Relations Heather M. Snyder, PhD, said, “This is an interesting paper in an important area of research and adds to the growing body of literature on how behavior or lifestyle may be related to cognitive decline. However, because this is a small phase 2 study, it is important for this or similar work to be done in larger, more diverse populations and over a longer duration of the intervention.” She was not involved with the study but was asked to comment.
Investigators chose the 20-week duration, they explained, because control-group patients likely would not refrain from trying the lifestyle intervention beyond that timeframe. Perhaps more importantly, challenges created by the COVID-19 pandemic required researchers to cut planned enrollment in half, eliminate planned MRI and amyloid PET scans, and reduce the number of cognition and function tests.
Such shortcomings limit what neurologists can glean and generalize from the study, said Dr. Snyder. “That said,” she added, “it does demonstrate the potential of an intensive behavior/lifestyle intervention, and the importance of this sort of research in Alzheimer’s and dementia.” Although the complexity of the interventions makes these studies challenging, she added, “it is important that we continue to advance larger, longer studies in more representative study populations to develop specific recommendations.”
Further Study
The Alzheimer’s Association’s U.S. POINTER study is the first large-scale study in the United States to explore the impact of comprehensive lifestyle changes on cognitive health. About 2000 older adults at risk for cognitive decline are participating, from diverse locations across the country. More than 25% of participants come from groups typically underrepresented in dementia research, said Dr. Snyder. Initial results are expected in summer 2025.
Future research also should explore reasons (beyond adherence) why some patients respond to lifestyle interventions better than others, and the potential synergy of lifestyle changes with drug therapies, wrote Dr. Ornish and colleagues.
“For now,” said Dr. Snyder, “there is an opportunity for providers to incorporate or expand messaging with their patients and families about the habits that they can incorporate into their daily lives. The Alzheimer’s Association offers 10 Healthy Habits for Your Brain — everyday actions that can make a difference for your brain health.”
Investigators received study funding from more than two dozen charitable foundations and other organizations. Dr. Snyder is a full-time employee of the Alzheimer’s Association and in this role, serves on the leadership team of the U.S. POINTER study. Her partner works for Abbott in an unrelated field.
FROM ALZHEIMER’S RESEARCH & THERAPY
‘Shockingly High’ Rate of TBI in Older Adults
TOPLINE:
Nearly 13% of older adults in a large Medicare cohort experienced traumatic brain injury (TBI) over an 18-year period, a new study showed.
METHODOLOGY:
- Researchers analyzed data from approximately 9200 Medicare enrollees aged 65 years or older who were part of the Health and Retirement Study (HRS) from 2000 to 2018.
- The baseline date was the date of the first age-eligible HRS core interview in the community in 2000 or later.
- Incident TBI cases were identified from an updated list of International Classification of Diseases, 9th and 10th edition (ICD-9/ICD-10), codes from the Defense and Veterans Brain Injury Center and the Armed Forces Health Surveillance Branch for TBI surveillance.
- Codes corresponded with emergency department visits, CT, and/or fMRI.
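As a concrete illustration of this kind of claims-based case finding, the sketch below flags a beneficiary’s first claim carrying a TBI diagnosis code. The `TBI_CODES` set and the `first_tbi_claim` helper are hypothetical, illustrative stand-ins, not the actual DVBIC/AFHSB surveillance list or the study’s code.

```python
# Hypothetical sketch of claims-based incident-TBI identification.
# TBI_CODES is a small illustrative subset, NOT the actual
# DVBIC/AFHSB surveillance list used in the study.
TBI_CODES = {"850.0", "854.0", "S06.0X0A", "S06.2X1A"}

def first_tbi_claim(claims):
    """Return the earliest claim whose diagnosis codes include a TBI code.

    `claims` is an iterable of (service_date, diagnosis_codes) tuples.
    """
    hits = [c for c in claims if TBI_CODES.intersection(c[1])]
    return min(hits, key=lambda c: c[0]) if hits else None

claims = [
    ("2011-03-02", ["I10"]),       # hypertension visit, not TBI
    ("2014-07-19", ["S06.0X0A"]),  # concussion, initial encounter
    ("2016-01-05", ["850.0"]),     # later TBI claim, not the incident case
]
incident = first_tbi_claim(claims)
```

Only the first qualifying claim is kept, matching the study’s focus on incident (first-recorded) TBI rather than repeat injuries.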
TAKEAWAY:
- Almost 13% of older individuals (n = 797) experienced TBI during the study period, highlighting its high incidence in this population.
- Older adults (mean age at baseline, 75 years) who experienced TBI during the study period were more likely to be women, to be White, and to have higher levels of education and normal cognition (P < .001), challenging previous assumptions about risk factors.
- The study underscored the need for targeted interventions and research focused on TBI prevention and postdischarge care in older adults.
IN PRACTICE:
“The number of people 65 and older with TBI is shockingly high,” senior author Raquel Gardner, MD, said in a press release. “We need evidence-based guidelines to inform postdischarge care of this very large Medicare population and more research on post-TBI dementia prevention and repeat injury prevention.”
SOURCE:
The study was led by Erica Kornblith, PhD, of the University of California, San Francisco. It was published online in JAMA Network Open.
LIMITATIONS:
The study’s reliance on ICD codes for TBI identification may not capture the full spectrum of TBI severity. Self-reported data on sociodemographic factors may have introduced bias, affecting the accuracy of associations with TBI incidence. In addition, the findings’ generalizability may be limited due to the study’s focus on Medicare enrollees, potentially excluding those from diverse socioeconomic backgrounds.
DISCLOSURES:
The study was funded by the Alzheimer’s Association, the US Department of Veterans Affairs, the National Institute on Aging, and the Department of Defense. Disclosures are noted in the original study.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication.
A version of this article appeared on Medscape.com.
Recurrent UTI Rates High Among Older Women, Diagnosing Accurately Is Complicated
TOPLINE:
Accurately diagnosing recurrent urinary tract infections (rUTIs) in older women is challenging and requires careful weighing of the risks and benefits of various treatments, according to a new clinical insight published in JAMA Internal Medicine.
METHODOLOGY:
- Women aged > 65 years have double the rUTI rates compared with younger women, but detecting the condition is more complicated due to age-related conditions, such as overactive bladder related to menopause.
- Overuse of antibiotics can increase older women’s risk of contracting antibiotic-resistant organisms and can lead to pulmonary or hepatic toxic effects in women with reduced kidney function.
- Up to 20% of older women have bacteria in their urine, which may or may not reflect an rUTI.
- Diagnosing rUTIs is complicated if women have dementia or cognitive decline, which can hinder recollection of symptoms.
TAKEAWAYS:
- Clinicians should test older female patients for rUTIs only when symptoms are present and should consider all possibilities before making a diagnosis.
- Vaginal estrogen may be an effective treatment, although the authors of the clinical review note a lack of a uniform formulation to recommend. However, oral estrogen use is not supported by evidence, and clinicians should instead consider vaginal creams or rings.
- The drug methenamine may be as effective as antibiotics but may not be safe for women with comorbidities. Evidence supports daily use at 1 g.
- Cranberry supplements and behavioral changes may be helpful, but evidence is limited, including among women living in long-term care facilities.
IN PRACTICE:
“Shared decision-making is especially important when diagnosis of an rUTI episode in older women is unclear ... in these cases, clinicians should acknowledge limitations in the evidence and invite patients or their caregivers to discuss preferences about presumptive treatment, weighing the possibility of earlier symptom relief or decreased UTI complications against the risk of adverse drug effects or multidrug resistance,” the authors wrote.
SOURCE:
The paper was led by Alison J. Huang, MD, MAS, an internal medicine specialist and researcher in the Department of Medicine at the University of California, San Francisco.
LIMITATIONS:
The authors reported no limitations.
DISCLOSURES:
Dr. Huang received grants from the National Institutes of Health. Other authors reported receiving grants from the Agency for Healthcare Research and Quality, the US Department of Veterans Affairs, the Kahn Foundation, and Nanovibronix.
A version of this article first appeared on Medscape.com.
Novel Method Able to Predict if, When, Dementia Will Develop
Novel, noninvasive testing can predict dementia onset with 80% accuracy up to 9 years before clinical diagnosis.
The results suggest resting-state functional MRI (rs-fMRI) could be used to identify a neural network signature of dementia risk early in the pathological course of the disease, an important advance as disease-modifying drugs such as those targeting amyloid beta are now becoming available.
“The brain has been changing for a long time before people get symptoms of dementia, and if we’re very precise about how we do it, we can actually, in principle, detect those changes, which could be really exciting,” study investigator Charles R. Marshall, PhD, professor of clinical neurology, Centre for Preventive Neurology, Wolfson Institute of Population Health, Queen Mary University of London, London, England, told this news organization.
“This could become a platform for screening people for risk status in the future, and it could one day make all the difference in terms of being able to prevent dementia,” he added.
The findings were published online in Nature Mental Health.
rs-fMRI measures fluctuations in blood oxygen level–dependent signals across the brain, which reflect functional connectivity.
Brain regions commonly implicated in altered functional connectivity in Alzheimer’s disease (AD) are within the default-mode network (DMN). This is the group of regions “connecting with each other and communicating with each other when someone is just lying in an MRI scanner doing nothing, which is how it came to be called the default-mode network,” explained Dr. Marshall.
The DMN encompasses the medial prefrontal cortex, posterior cingulate cortex or precuneus, and bilateral inferior parietal cortices, as well as supplementary brain regions including the medial temporal lobes and temporal poles.
This network is believed to be selectively vulnerable to AD neuropathology. “Something about that network starts to be disrupted in the very earliest stages of Alzheimer’s disease,” said Dr. Marshall.
While this has been known for some time, “what we’ve not been able to do before is build a precise enough model of how the network is connected to be able to tell whether individual participants were going to get dementia or not,” he added.
The investigators used data from the UK Biobank, a large-scale biomedical database and research resource containing genetic and health information from about half a million UK volunteer participants.
The analysis included 103 individuals with dementia (22 with prevalent dementia and 81 later diagnosed with dementia over a median of 3.7 years) and 1030 matched participants without dementia. All participants underwent MRI between 2006 and 2010.
The total sample had a mean age of 70.4 years at the time of MRI data acquisition. For each participant, researchers extracted relevant data from 10 predefined regions of interest in the brain, which together defined their DMN. This included two midline regions and four regions in each hemisphere.
Greater Predictive Power
Researchers built a model using an approach related to how brain regions communicate with each other. “The model sort of incorporates what we know about how the changes that you see on a functional MRI scan relate to changes in the firing of brain cells, in a very precise way,” said Dr. Marshall.
The researchers then used a machine learning approach to develop a model for effective connectivity, which describes the causal influence of one brain region over another. “We trained a machine learning tool to recognize what a dementia-like pattern of connectivity looks like,” said Dr. Marshall.
Investigators controlled for potential confounders, including age, sex, handedness, in-scanner head motion, and geographical location of data acquisition.
The model distinguished the brain connectivity patterns of those who would go on to develop dementia from those who would not, with an accuracy of 82%, up to 9 years before an official diagnosis was made.
When the researchers trained a model to use brain connections to predict time to diagnosis, the predicted and actual times to diagnosis agreed to within about 2 years.
This effective connectivity approach has much more predictive power than memory test scores or brain structural measures, said Dr. Marshall. “We looked at brain volumes and they performed very poorly, only just better than tossing a coin, and the same with cognitive test scores, which were only just better than chance.”
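The classification step described above can be sketched in miniature. This is a hypothetical illustration on synthetic data: it keeps only the study’s rough dimensions (10 DMN regions, 103 cases vs 1030 controls) and uses a plain class-weighted logistic regression, not the authors’ actual effective-connectivity modeling pipeline.

```python
# Hypothetical sketch: classifying future dementia from connectivity features.
# Synthetic data only; dimensions mirror the study (10 DMN regions ->
# 90 directed region-to-region links; 103 cases, 1030 controls).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_regions = 10
n_features = n_regions * (n_regions - 1)  # directed connections between regions

X_controls = rng.normal(0.0, 1.0, size=(1030, n_features))
X_cases = rng.normal(0.3, 1.0, size=(103, n_features))  # shifted "dementia-like" pattern
X = np.vstack([X_controls, X_cases])
y = np.array([0] * 1030 + [1] * 103)

# Class weighting compensates for the roughly 10:1 case-control imbalance
clf = LogisticRegression(class_weight="balanced", max_iter=1000)
auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
```

The design choice worth noting is the class weighting: with cases outnumbered ten to one, an unweighted classifier can score high accuracy by predicting “no dementia” for everyone, so balanced weights and an AUC metric give a more honest readout.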
As for markers of amyloid beta and tau in the brain, these are “very useful diagnostically” but only when someone has symptoms, said Dr. Marshall. He noted people live for years with these proteins without developing dementia symptoms.
“We wouldn’t necessarily want to expose somebody who has a brain full of amyloid but was not going to get symptoms for the next 20 years to a treatment, but if we knew that person was highly likely to develop symptoms of dementia in the next 5 years, then we probably would,” he said.
Dr. Marshall believes the predictive power of all these diagnostic tools could be boosted if they were used together.
Potential for Early Detection, Treatment
Researchers examined a number of modifiable dementia risk factors, including hearing loss, depression, hypertension, and physical inactivity. They found self-reported social isolation was the only variable that showed a significant association with effective connectivity, meaning those who were socially isolated were more likely to have a “dementia-like” pattern of DMN effective connectivity. This finding suggests social isolation may be a cause, rather than a consequence, of dementia.
The study also revealed associations between DMN effective connectivity and AD polygenic risk score, derived from meta-analysis of multiple external genome-wide association study sources.
A predictive tool that uses rs-fMRI could also help select participants at a high risk for dementia to investigate potential treatments. “There’s good reason to think that if we could go in earlier with, for example, anti-amyloid treatments, they’re more likely to be effective,” said Dr. Marshall.
The new test might eventually have value as a population screening tool, something akin to colon cancer screening, he added. “We don’t send everyone for a colonoscopy; you do a kind of pre-screening test at home, and if that’s positive, then you get called in for a colonoscopy.”
The researchers looked at all-cause dementia and not just AD because dementia subtype diagnoses in the UK Biobank “are not at all reliable,” said Dr. Marshall.
Study limitations included the fact that UK Biobank participants are healthier and less socioeconomically deprived than the general population and are predominantly White. Another study limitation was that labeling of cases and controls depended on clinician coding rather than on standardized diagnostic criteria.
Kudos, Caveats
In a release from the Science Media Center, a nonprofit organization promoting voices and views of the scientific community, Sebastian Walsh, National Institute for Health and Care Research doctoral fellow in Public Health Medicine, University of Cambridge, Cambridge, England, said the results are “potentially exciting,” and he praised the way the team conducted the study.
However, he noted some caveats, including the small sample size, with only about 100 people with dementia, and the relatively short time between the brain scan and diagnosis (an average of 3.7 years).
Dr. Walsh emphasized the importance of replicating the findings “in bigger samples with a much longer delay between scan and onset of cognitive symptoms.”
He also noted the average age of study participants was 70 years, whereas the average age at which individuals in the United Kingdom develop dementia is mid to late 80s, “so we need to see these results repeated for more diverse and older samples.”
He also noted that MRI scans are expensive, and the approach used in the study needs “a high-quality scan which requires people to keep their head still.”
Also commenting, Andrew Doig, PhD, professor, Division of Neuroscience, the University of Manchester, Manchester, England, said the MRI connectivity method used in the study might form part of a broader diagnostic approach.
“Dementia is a complex condition, and it is unlikely that we will ever find one simple test that can accurately diagnose it,” Dr. Doig noted. “Within a few years, however, there is good reason to believe that we will be routinely testing for dementia in middle-aged people, using a combination of methods, such as a blood test, followed by imaging.”
“The MRI connectivity method described here could form part of this diagnostic platform. We will then have an excellent understanding of which people are likely to benefit most from the new generation of dementia drugs,” he said.
Dr. Marshall and Dr. Walsh reported no relevant disclosures. Dr. Doig reported that he is a founder, shareholder, and consultant for PharmaKure Ltd, which is developing new diagnostics for neurodegenerative diseases using blood biomarkers.
A version of this article first appeared on Medscape.com.
He also noted that MRI scans are expensive, and the approach used in the study needs “a high-quality scan which requires people to keep their head still.”
Also commenting, Andrew Doig, PhD, professor, Division of Neuroscience, the University of Manchester, Manchester, England, said the MRI connectivity method used in the study might form part of a broader diagnostic approach.
“Dementia is a complex condition, and it is unlikely that we will ever find one simple test that can accurately diagnose it,” Dr. Doig noted. “Within a few years, however, there is good reason to believe that we will be routinely testing for dementia in middle-aged people, using a combination of methods, such as a blood test, followed by imaging.”
“The MRI connectivity method described here could form part of this diagnostic platform. We will then have an excellent understanding of which people are likely to benefit most from the new generation of dementia drugs,” he said.
Dr. Marshall and Dr. Walsh reported no relevant disclosures. Dr. Doig reported that he is a founder, shareholder, and consultant for PharmaKure Ltd, which is developing new diagnostics for neurodegenerative diseases using blood biomarkers.
A version of this article first appeared on Medscape.com.
Novel, noninvasive testing is able to predict dementia onset with 82% accuracy up to 9 years before clinical diagnosis.
The results suggest resting-state functional MRI (rs-fMRI) could be used to identify a neural network signature of dementia risk early in the pathological course of the disease, an important advance as disease-modifying drugs such as those targeting amyloid beta are now becoming available.
“The brain has been changing for a long time before people get symptoms of dementia, and if we’re very precise about how we do it, we can actually, in principle, detect those changes, which could be really exciting,” study investigator Charles R. Marshall, PhD, professor of clinical neurology, Centre for Preventive Neurology, Wolfson Institute of Population Health, Queen Mary University of London, London, England, told this news organization.
“This could become a platform for screening people for risk status in the future, and it could one day make all the difference in terms of being able to prevent dementia,” he added.
The findings were published online in Nature Mental Health.
The rs-fMRI measures fluctuations in blood oxygen level–dependent signals across the brain, which reflect functional connectivity.
Brain regions commonly implicated in altered functional connectivity in Alzheimer’s disease (AD) are within the default-mode network (DMN). This is the group of regions “connecting with each other and communicating with each other when someone is just lying in an MRI scanner doing nothing, which is how it came to be called the default-mode network,” explained Dr. Marshall.
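The functional connectivity that rs-fMRI captures can be illustrated with a toy computation: pairwise correlation of regional time series. The sketch below uses synthetic signals and illustrative DMN region labels, not the study's data or parcellation.

```python
# Sketch: functional connectivity as pairwise correlation of BOLD-like
# signals. Synthetic time series stand in for rs-fMRI data; the region
# names are illustrative DMN labels, not the study's exact parcellation.
import numpy as np

rng = np.random.default_rng(42)
regions = ["mPFC", "PCC", "L-IPC", "R-IPC"]
n_tp = 200                      # time points in the scan

shared = rng.normal(size=n_tp)  # a common slow fluctuation across the network
bold = {r: 0.7 * shared + 0.3 * rng.normal(size=n_tp) for r in regions}

signals = np.vstack([bold[r] for r in regions])
fc = np.corrcoef(signals)       # 4x4 functional-connectivity matrix

# Regions sharing the fluctuation are strongly correlated ("connected").
print(np.round(fc, 2))
```

Regions driven by the same underlying fluctuation end up strongly correlated, which is the sense in which they are said to be "communicating with each other" at rest.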
The DMN encompasses the medial prefrontal cortex, posterior cingulate cortex or precuneus, and bilateral inferior parietal cortices, as well as supplementary brain regions including the medial temporal lobes and temporal poles.
This network is believed to be selectively vulnerable to AD neuropathology. “Something about that network starts to be disrupted in the very earliest stages of Alzheimer’s disease,” said Dr. Marshall.
While this has been known for some time, “what we’ve not been able to do before is build a precise enough model of how the network is connected to be able to tell whether individual participants were going to get dementia or not,” he added.
The investigators used data from the UK Biobank, a large-scale biomedical database and research resource containing genetic and health information from about half a million UK volunteers.
The analysis included 103 individuals with dementia (22 with prevalent dementia and 81 later diagnosed with dementia over a median of 3.7 years) and 1030 matched participants without dementia. All participants underwent MRI between 2006 and 2010.
The total sample had a mean age of 70.4 years at the time of MRI data acquisition. For each participant, researchers extracted relevant data from 10 predefined regions of interest in the brain, which together defined their DMN. This included two midline regions and four regions in each hemisphere.
Greater Predictive Power
Researchers built a model using an approach related to how brain regions communicate with each other. “The model sort of incorporates what we know about how the changes that you see on a functional MRI scan relate to changes in the firing of brain cells, in a very precise way,” said Dr. Marshall.
The researchers then used a machine learning approach to develop a model for effective connectivity, which describes the causal influence of one brain region over another. “We trained a machine learning tool to recognize what a dementia-like pattern of connectivity looks like,” said Dr. Marshall.
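The training step can be caricatured in a few lines. The sketch below is not the study's pipeline (effective connectivity there was estimated with a causal model of regional interactions); it trains a generic scikit-learn classifier on synthetic, flattened 10x10 connectivity matrices, with the article's sample sizes used only to shape the toy data.

```python
# Sketch: training a classifier to recognize a "dementia-like" pattern of
# connectivity. Generic stand-in, NOT the study's method: synthetic
# flattened 10x10 connectivity matrices, logistic regression, and a
# purely illustrative perturbation of a few connections in future cases.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_regions = 10                    # 10 DMN regions of interest, as in the study
n_cases, n_controls = 103, 1030   # sample sizes reported in the article

def connectivity(n, shift):
    # One row per subject: a flattened 10x10 connectivity matrix.
    # "shift" perturbs a handful of connections in future cases.
    x = rng.normal(size=(n, n_regions * n_regions))
    x[:, :5] += shift
    return x

X = np.vstack([connectivity(n_cases, 0.8), connectivity(n_controls, 0.0)])
y = np.array([1] * n_cases + [0] * n_controls)

clf = LogisticRegression(max_iter=1000, class_weight="balanced")
auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
print(f"cross-validated AUC: {auc:.2f}")
```

Cross-validation matters here: with over a hundred connectivity features and only ~100 cases, a model scored on its own training data would look far better than it really is.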
Investigators controlled for potential confounders, including age, sex, handedness, in-scanner head motion, and geographical location of data acquisition.
The model was able to determine the difference in brain connectivity patterns between those who would go on to develop dementia and those who would not, with an accuracy of 82% up to 9 years before an official diagnosis was made.
When the researchers trained a model to use brain connections to predict time to diagnosis, the predicted and actual times to diagnosis agreed to within about 2 years.
This effective connectivity approach has much more predictive power than memory test scores or brain structural measures, said Dr. Marshall. “We looked at brain volumes and they performed very poorly, only just better than tossing a coin, and the same with cognitive test scores, which were only just better than chance.”
As for markers of amyloid beta and tau in the brain, these are “very useful diagnostically” but only when someone has symptoms, said Dr. Marshall. He noted people live for years with these proteins without developing dementia symptoms.
“We wouldn’t necessarily want to expose somebody who has a brain full of amyloid but was not going to get symptoms for the next 20 years to a treatment, but if we knew that person was highly likely to develop symptoms of dementia in the next 5 years, then we probably would,” he said.
Dr. Marshall believes the predictive power of all these diagnostic tools could be boosted if they were used together.
Potential for Early Detection, Treatment
Researchers examined a number of modifiable dementia risk factors, including hearing loss, depression, hypertension, and physical inactivity. They found self-reported social isolation was the only variable that showed a significant association with effective connectivity, meaning those who are socially isolated were more likely to have a "dementia-like" pattern of DMN effective connectivity. Because the connectivity changes preceded diagnosis, this finding suggests social isolation may be a contributing cause, rather than simply a consequence, of dementia.
The study also revealed associations between DMN effective connectivity and AD polygenic risk score, derived from meta-analysis of multiple external genome-wide association study sources.
A predictive tool that uses rs-fMRI could also help select participants at a high risk for dementia to investigate potential treatments. “There’s good reason to think that if we could go in earlier with, for example, anti-amyloid treatments, they’re more likely to be effective,” said Dr. Marshall.
The new test might eventually have value as a population screening tool, something akin to colon cancer screening, he added. “We don’t send everyone for a colonoscopy; you do a kind of pre-screening test at home, and if that’s positive, then you get called in for a colonoscopy.”
The researchers looked at all-cause dementia and not just AD because dementia subtype diagnoses in the UK Biobank “are not at all reliable,” said Dr. Marshall.
Study limitations included the fact that UK Biobank participants are healthier and less socioeconomically deprived than the general population and are predominantly White. Another study limitation was that labeling of cases and controls depended on clinician coding rather than on standardized diagnostic criteria.
Kudos, Caveats
In a release from the Science Media Center, a nonprofit organization promoting voices and views of the scientific community, Sebastian Walsh, National Institute for Health and Care Research doctoral fellow in Public Health Medicine, University of Cambridge, Cambridge, England, said the results are “potentially exciting,” and he praised the way the team conducted the study.
However, he noted some caveats, including the small sample size, with only about 100 people with dementia, and the relatively short time between the brain scan and diagnosis (an average of 3.7 years).
Dr. Walsh emphasized the importance of replicating the findings “in bigger samples with a much longer delay between scan and onset of cognitive symptoms.”
He also noted the average age of study participants was 70 years, whereas the average age at which individuals in the United Kingdom develop dementia is mid to late 80s, “so we need to see these results repeated for more diverse and older samples.”
He also noted that MRI scans are expensive, and the approach used in the study needs “a high-quality scan which requires people to keep their head still.”
Also commenting, Andrew Doig, PhD, professor, Division of Neuroscience, the University of Manchester, Manchester, England, said the MRI connectivity method used in the study might form part of a broader diagnostic approach.
“Dementia is a complex condition, and it is unlikely that we will ever find one simple test that can accurately diagnose it,” Dr. Doig noted. “Within a few years, however, there is good reason to believe that we will be routinely testing for dementia in middle-aged people, using a combination of methods, such as a blood test, followed by imaging.”
“The MRI connectivity method described here could form part of this diagnostic platform. We will then have an excellent understanding of which people are likely to benefit most from the new generation of dementia drugs,” he said.
Dr. Marshall and Dr. Walsh reported no relevant disclosures. Dr. Doig reported that he is a founder, shareholder, and consultant for PharmaKure Ltd, which is developing new diagnostics for neurodegenerative diseases using blood biomarkers.
A version of this article first appeared on Medscape.com.
Chemo May Benefit Some Older Patients With Metastatic Pancreatic Cancer
TOPLINE:
METHODOLOGY:
Pancreatic cancer is most often diagnosed in adults aged 65 years or older. Providing cancer treatment for this older, often vulnerable, population comes with significant challenges and can lead to worse survival.
To examine real-world outcomes of older adults with untreated metastatic pancreatic cancer, researchers recruited patients aged 70 years or older and performed a geriatric assessment to identify comorbidities, cognitive issues, and other geriatric abnormalities.
Those who were deemed “fit” (ie, with no geriatric abnormalities) were assigned to receive off-study standard-of-care treatment, whereas those classified as “frail” (ie, with severe abnormalities) received off-study supportive care.
The remaining 176 "vulnerable" patients with mild to moderate geriatric abnormalities completed a geriatric and quality-of-life assessment and were then randomly assigned to receive either dose-reduced 5-fluorouracil (5-FU) and leucovorin plus liposomal irinotecan (n = 88) or modified gemcitabine plus nab-paclitaxel (n = 88) every 2 weeks. Ultimately, 79 patients started the 5-FU combination and 75 received gemcitabine plus nab-paclitaxel. Patients were assessed every 8 weeks until disease progression or intolerance.
Overall, patients had a median age of 77 years; 61.9% were aged 75 years or older. About half were female, and 81.5% were White. The majority (87.5%) had a performance status of 0 or 1.
TAKEAWAY:
- Median overall survival was 4.7 months in the gemcitabine plus nab-paclitaxel arm and 4.4 months in the 5-FU combination group, with no significant survival difference observed between the two arms (P = .72).
- When the overall survival analysis was restricted to patients who received at least 4 weeks, or two cycles, of treatment (about 62% of patients), the median overall survival across the two treatment arms reached 8.0 months, in line with expectations for these regimens.
- Patient stratification revealed that those with a performance status of 2 had significantly worse overall survival than those with a status of 0: 1.4 months vs 6.9 months, respectively (hazard ratio [HR], 2.77; P < .001). A similar divide was seen when patients were stratified by physical/functional status and well-being. Age, however, did not significantly influence the results.
- Overall, more than half of patients experienced grade 3 or higher adverse events. Just over 38% of patients received only one to three cycles of therapy, whereas 26% remained on treatment for 12 or more cycles. The adverse event rates were similar between the two regimens, but the toxicity profile was slightly different — the researchers, for instance, observed more peripheral neuropathy with gemcitabine plus nab-paclitaxel and more diarrhea in the 5-FU combination arm.
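Median overall survival figures like those above are typically read off a Kaplan-Meier curve: the survival probability steps down at each death time, and the median is where the curve first crosses 0.5. As a hedged illustration (made-up follow-up times, not the trial's analysis), a minimal estimator:

```python
# Sketch: a minimal Kaplan-Meier estimator. Synthetic data only;
# this is an illustration of the method, not the trial's analysis.
def kaplan_meier(times, events):
    """times: follow-up in months; events: 1 = death, 0 = censored.
    Returns a list of (time, survival probability) at each death time."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        n_t = sum(1 for tt, _ in data if tt == t)
        if deaths:
            surv *= 1 - deaths / at_risk  # step down at each death time
            curve.append((t, surv))
        at_risk -= n_t                    # censored patients leave the risk set
        i += n_t
    return curve

def median_survival(curve):
    # First time the survival curve drops to 0.5 or below; None if not reached.
    for t, s in curve:
        if s <= 0.5:
            return t
    return None

times  = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]   # months of follow-up (toy data)
events = [1, 1, 1, 0, 1, 1, 1, 1, 0, 1]    # 0 = censored
curve = kaplan_meier(times, events)
print("median OS (months):", median_survival(curve))
```

The censoring handling is the point: patients who leave follow-up alive still contribute to the risk set up to their last assessment, which is why median survival is not simply the median of observed death times.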
IN PRACTICE:
- Overall, the “survival outcomes among vulnerable older patients were lower than expected, with high percentage of patients not able to start treatment, or complete one month of therapy due to clinical deterioration,” said study presenter Efrat Dotan, MD, chief, Division of Gastrointestinal Medical Oncology, Fox Chase Cancer Center, Philadelphia.
- “For vulnerable older adults who can tolerate treatment, these two regimens provide clinicians with options for tailoring therapy based on toxicity profile,” Dr. Dotan added. But “tools are needed to better identify patients who can benefit from treatment.”
- The results underline the need to perform geriatric assessments, as opposed to merely looking at performance status, commented David F. Chang, PhD, MS, MBBS, professor of Surgical Oncology, University of Glasgow, Scotland, who was not involved in the study.
SOURCE:
The research, presented at the 2024 annual meeting of the American Society of Clinical Oncology, was funded by the National Cancer Institute and the Eastern Cooperative Oncology Group.
LIMITATIONS:
Dr. Chang noted that the study did not reveal which treatment regimen was more effective.
DISCLOSURES:
Dr. Dotan declared relationships with Agenus, Amgen, G1 Therapeutics, Incyte, Olympus, and Taiho Pharmaceutical and institutional relationships with Dragonfly Therapeutics, Gilead Sciences, Ipsen, Kinnate Biopharma, Leap Therapeutics, Lilly, Lutris, NGM Biopharmaceuticals, Relay Therapeutics, and Zymeworks. Dr. Chang declared relationships with Immodulon Therapeutics and Mylan and institutional relationships with AstraZeneca, BMS GmbH & Co. KG, Immodulon Therapeutics, and Merck.
A version of this article appeared on Medscape.com.
Melatonin May Cut Risk for Age-Related Eye Disease
TOPLINE:
Melatonin supplementation is linked to a reduced risk for developing age-related macular degeneration (AMD) and slowing its progression, suggesting potential as a preventive therapy.
METHODOLOGY:
- Researchers analyzed data from the TriNetX database, covering electronic medical records across the United States from December 2023 to March 2024.
- The retrospective study included patients aged ≥ 50 years, divided into groups based on their history of AMD and melatonin medication codes between November 2008 and November 2023.
- Propensity score matching was used to compare melatonin users and nonusers for the risk for developing any form of AMD or the progression to exudative AMD from the nonexudative form of the condition.
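Propensity score matching of this kind proceeds in two steps: estimate each patient's probability of exposure from covariates, then pair exposed and unexposed patients with similar scores. The sketch below is a toy greedy 1:1 matcher on synthetic data; the covariates, caliper, and model are illustrative assumptions, not the study's specification.

```python
# Sketch: greedy 1:1 nearest-neighbor propensity score matching.
# Synthetic covariates and an illustrative caliper; the study's
# TriNetX matching procedure is more elaborate.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 400
age = rng.normal(65, 8, n)
comorbidity = rng.poisson(2, n).astype(float)
X = np.column_stack([age, comorbidity])
# Toy exposure model: older patients are more likely to take melatonin.
treated = (rng.random(n) < 1 / (1 + np.exp(-(age - 72) / 8))).astype(int)

ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]

t_idx = np.where(treated == 1)[0]
c_idx = list(np.where(treated == 0)[0])
pairs = []
for i in t_idx:
    if not c_idx:
        break
    j = min(c_idx, key=lambda k: abs(ps[i] - ps[k]))  # nearest control by score
    if abs(ps[i] - ps[j]) < 0.05:                     # caliper on the score
        pairs.append((i, j))
        c_idx.remove(j)                               # match without replacement

# After matching, the covariate gap between groups should shrink.
gap_before = abs(age[treated == 1].mean() - age[treated == 0].mean())
ti, ci = zip(*pairs)
gap_after = abs(age[list(ti)].mean() - age[list(ci)].mean())
print(f"age gap before: {gap_before:.2f} years, after: {gap_after:.2f} years")
```

The balance check at the end is the usual sanity test: matching on the score should leave the treated and matched-control groups similar on the covariates that fed the score, so any remaining outcome difference is less likely to be driven by those covariates.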
TAKEAWAY:
- Use of melatonin was associated with a 58% reduction in the risk for developing AMD, according to the researchers.
- In people with nonexudative AMD, use of the supplement was linked to a 56% lower risk for progression to exudative AMD.
- The findings were consistent across age groups, suggesting melatonin’s benefits may extend to older populations at higher risk for AMD, the researchers reported.
IN PRACTICE:
“In this cohort study of 121,523 patients with no history of AMD aged ≥ 50 years, taking melatonin was associated with a decreased risk of developing AMD,” the authors of the study wrote. “Likewise, among 66,253 patients with preexisting nonexudative AMD, melatonin supplementation was negatively associated with the rate of progression to exudative AMD.”
Studies in animals and humans have shown melatonin may be a potent antioxidant and anti-inflammatory agent and have both antiangiogenic and mitochondrial-preserving properties, the authors noted. The new findings “provide a rationale for expanding clinical research on the potential therapeutic efficacy of melatonin in preventing AMD development or its progression,” they added.
SOURCE:
The study was led by Hejin Jeong, Case Western Reserve University School of Medicine, Cleveland, and was published online in JAMA Ophthalmology.
LIMITATIONS:
The study’s reliance on diagnostic codes may have limited the accuracy of identifying AMD progression. Variations in coding practices and the reporting of over-the-counter medications like melatonin could have influenced the results. The study did not control for all modifiable risk factors for AMD, which may have introduced healthy user bias.
DISCLOSURES:
The authors reported various potential conflicts of interest, including receiving personal fees and grants from various pharmaceutical companies. The study was funded by grants from the National Institutes of Health and the Cleveland Eye Bank Foundation.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication.
A version of this article appeared on Medscape.com.
TOPLINE:
Melatonin supplementation is linked to a reduced risk for developing age-related macular degeneration (AMD) and slowing its progression, suggesting potential as a preventive therapy.
METHODOLOGY:
- Researchers analyzed data from the TriNetX database, covering electronic medical records across the United States from December 2023 to March 2024.
- The retrospective study included patients aged ≥ 50 years, divided into groups based on their history of AMD and melatonin medication codes between November 2008 and November 2023.
- Propensity score matching was used to compare melatonin users and nonusers for the risk for developing any form of AMD or the progression to exudative AMD from the nonexudative form of the condition.
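The methodology above hinges on propensity score matching. As a rough illustration of the general technique (not the authors' actual analysis code, and with hypothetical toy data), the core matching step pairs each melatonin user with the unmatched nonuser whose propensity score is closest, within a tolerance ("caliper"):

```python
# Minimal sketch of 1:1 greedy nearest-neighbor propensity-score matching.
# This is a generic illustration of the technique named in the methodology,
# NOT the study's code; IDs, scores, and the caliper value are hypothetical.

def match_nearest(treated, controls, caliper=0.05):
    """Pair each treated unit (id, propensity score) with the closest
    unmatched control within the caliper; units with no match drop out."""
    pairs = []
    available = dict(controls)  # control id -> propensity score
    for t_id, t_score in treated:
        if not available:
            break
        # Closest remaining control by absolute score difference
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        if abs(available[c_id] - t_score) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]  # each control is used at most once
    return pairs

# Toy example: melatonin users (treated) vs. nonusers (controls)
treated = [("u1", 0.30), ("u2", 0.62)]
controls = [("c1", 0.31), ("c2", 0.90), ("c3", 0.60)]
print(match_nearest(treated, controls))  # [('u1', 'c1'), ('u2', 'c3')]
```

In practice the propensity scores themselves would come from a regression of treatment status on the matching covariates; the greedy pairing above is only the simplest of several matching strategies.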
TAKEAWAY:
- Use of melatonin was associated with a 58% reduction in the risk for developing AMD, according to the researchers.
- In people with nonexudative AMD, use of the supplement was linked to a 56% lower risk for progression to exudative AMD.
- The findings were consistent across age groups, suggesting melatonin’s benefits may extend to older populations at higher risk for AMD, the researchers reported.
IN PRACTICE:
“In this cohort study of 121,523 patients with no history of AMD aged ≥ 50 years, taking melatonin was associated with a decreased risk of developing AMD,” the authors of the study wrote. “Likewise, among 66,253 patients with preexisting nonexudative AMD, melatonin supplementation was negatively associated with the rate of progression to exudative AMD.”
Studies in animals and humans have shown melatonin may be a potent antioxidant and anti-inflammatory agent and have both antiangiogenic and mitochondrial-preserving properties, the authors noted. The new findings “provide a rationale for expanding clinical research on the potential therapeutic efficacy of melatonin in preventing AMD development or its progression,” they added.
SOURCE:
The study was led by Hejin Jeong, of Case Western Reserve University School of Medicine, Cleveland, and was published online in JAMA Ophthalmology.
LIMITATIONS:
The study’s reliance on diagnostic codes may have limited the accuracy of identifying AMD progression. Variations in coding practices and the reporting of over-the-counter medications like melatonin could have influenced the results. The study did not control for all modifiable risk factors for AMD, which may have introduced healthy user bias.
DISCLOSURES:
The authors reported various potential conflicts of interest, including receiving personal fees and grants from various pharmaceutical companies. The study was funded by grants from the National Institutes of Health and the Cleveland Eye Bank Foundation.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication.
A version of this article appeared on Medscape.com.
Early Memory Problems Linked to Increased Tau
Reports from older adults and their partners of early memory issues are associated with higher levels of tau neurofibrillary tangles in the brain, new research suggests.
The findings show that in addition to beta-amyloid, tau is implicated in cognitive decline even in the absence of overt clinical symptoms.
“Understanding the earliest signs of Alzheimer’s disease is even more important now that new disease-modifying drugs are becoming available,” study author
Rebecca E. Amariglio, PhD, clinical neuropsychologist at Brigham and Women’s Hospital and the Massachusetts General Hospital and assistant professor in neurology at Harvard Medical School, Boston, said in a news release. “Our study found early suspicions of memory problems by both participants and the people who knew them well were linked to higher levels of tau tangles in the brain.”
The study was published online in Neurology.
Subjective Cognitive Decline
Beta-amyloid plaque accumulations and tau neurofibrillary tangles both underlie the clinical continuum of Alzheimer’s disease (AD). Previous studies have investigated beta-amyloid burden and self- and partner-reported cognitive decline, but fewer have examined regional tau.
Subjective cognitive decline may be an early sign of AD, but self-awareness declines as individuals become increasingly symptomatic. So, a report from a partner about the participant’s level of cognitive functioning is often required in studies of mild cognitive impairment and dementia. The relevance of this model during the preclinical stage is less clear.
For the multicohort, cross-sectional study, investigators studied 675 cognitively unimpaired older adults (mean age, 72 years; 59% female), including persons with nonelevated beta-amyloid levels and those with elevated beta-amyloid levels, as determined by PET.
Participants brought a spouse, adult child, or other study partner with them to answer questions about the participant’s cognitive abilities and their ability to complete daily tasks. About 65% of participants lived with their partners and both completed the Cognitive Function Index (CFI) to assess cognitive decline, with higher scores indicating greater cognitive decline.
Covariates included age, sex, education, and cohort as well as objective cognitive performance.
The Value of Partner Reporting
Investigators found that higher tau levels were associated with greater self- and partner-reported cognitive decline (P < .001 for both).
Significant associations between self- and partner-reported CFI measures were driven by elevated beta-amyloid levels, with continuous beta-amyloid levels showing an independent effect on CFI in addition to tau.
“Our findings suggest that asking older people who have elevated Alzheimer’s disease biomarkers about subjective cognitive decline may be valuable for early detection,” Dr. Amariglio said.
Limitations include the fact that most participants were White and highly educated. Future studies should include participants from more diverse racial and ethnic groups and people with diverse levels of education, researchers noted.
“Although this study was cross-sectional, findings suggest that among older CU [cognitively unimpaired] individuals who [are] at risk for AD dementia, capturing self-report and study partner report of cognitive function may be valuable for understanding the relationship between early pathophysiologic progression and the emergence of functional impairment,” the authors concluded.
The study was funded in part by the National Institute on Aging, Eli Lilly, and the Alzheimer’s Association, among others. Dr. Amariglio receives research funding from the National Institute on Aging. Complete study funding and other authors’ disclosures are listed in the original paper.
A version of this article first appeared on Medscape.com.
More Women Report First Hip Fracture in Their 60s
TOPLINE:
Women with low bone density are more likely to report their first fragility hip fracture in their 60s rather than at older ages.
METHODOLOGY:
- Researchers used hip fracture data from the National Health and Nutrition Examination Survey for 2009-2010, 2013-2014, and 2017-2018.
- They included women older than 60 years with a bone mineral density T score ≤ −1 at the femur neck, measured by dual-energy x-ray absorptiometry.
- Fragility fractures were defined as a self-reported hip fracture resulting from a fall from standing height or less.
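The inclusion and outcome criteria above rest on standard T-score cutoffs (T ≤ −1 for low bone density; the TAKEAWAY section below uses T ≤ −2.5 for osteoporosis). A minimal sketch of that classification, with the function name chosen here for illustration:

```python
# Hypothetical helper illustrating the T-score cutoffs used in the study:
# T <= -2.5 -> osteoporosis; -2.5 < T <= -1 -> low bone mass; otherwise normal.

def classify_bmd(t_score):
    """Classify a femoral neck bone mineral density T score."""
    if t_score <= -2.5:
        return "osteoporosis"
    if t_score <= -1.0:
        return "low bone mass (osteopenia)"
    return "normal"

print(classify_bmd(-2.7))  # osteoporosis
print(classify_bmd(-1.4))  # low bone mass (osteopenia)
print(classify_bmd(-0.3))  # normal
```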
TAKEAWAY:
- The number of women in their 60s who reported their first hip fracture grew by 50% from 2009 to 2018.
- The opposite was true for women in their 70s and 80s, who reported fewer first hip fractures over the study period.
- Reported fragility hip fractures in women overall decreased by half from 2009 to 2018.
- The prevalence of women with osteoporosis (T score ≤ −2.5) grew from 18.1% to 21.3% over 10 years.
IN PRACTICE:
The decrease in fractures overall and in women older than 70 years “may be due to increasing awareness and utilization of measures to decrease falls such as exercise, nutrition, health education, and environmental modifications targeted toward the elderly population,” the authors wrote. The findings also underscore the importance of earlier bone health awareness in primary care to curb the rising trend in younger women, they added.
SOURCE:
The study was led by Avica Atri, MD, of Albert Einstein Medical Center in Philadelphia. She presented the findings at ENDO 2024: The Endocrine Society Annual Meeting.
LIMITATIONS:
The study was retrospective in nature and included self-reported health data.
DISCLOSURES:
The study received no commercial funding. The authors have reported no relevant financial relationships.
A version of this article appeared on Medscape.com.