Seven hours of sleep is ideal for middle-aged and older adults
Sleep disturbances are common in older age, and previous studies have shown associations between too much or too little sleep and increased risk of cognitive decline, but the ideal amount of sleep for preserving mental health has not been well described, according to the authors of the new paper.
In the study published in Nature Aging, the team of researchers from China and the United Kingdom reviewed data from the UK Biobank, a national database of individuals in the United Kingdom that includes cognitive assessments, mental health questionnaires, and brain imaging data, as well as genetic information.
Sleep is important for physical and psychological health, and also serves a neuroprotective function by clearing waste products from the brain, lead author Yuzhu Li of Fudan University, Shanghai, China, and colleagues wrote.
The study population included 498,277 participants, aged 38-73 years, who completed touchscreen questionnaires about sleep duration between 2006 and 2010. The average age at baseline was 56.5 years, 54% were female, and the mean sleep duration was 7.15 hours.
The researchers also reviewed brain imaging data and genetic data from 39,692 participants in 2014 to examine the relationships between sleep duration and brain structure and between sleep duration and genetic risk. In addition, 156,884 participants completed an online follow-up mental health questionnaire in 2016-2017 to assess the longitudinal impact of sleep on mental health.
Both excessive and insufficient sleep were associated with impaired cognitive performance, as evidenced by the U-shaped curve the researchers found in their data analysis, which modeled quadratic associations.
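The idea behind the quadratic analysis can be illustrated with a minimal sketch on simulated data. Nothing here reproduces the study's actual model or numbers; the peak location and noise level are purely hypothetical, chosen so the fitted optimum lands near the 7 hours the article reports.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated (hypothetical) data: cognitive score peaks near 7 h of sleep.
sleep = rng.uniform(4, 10, 2000)
score = -1.5 * (sleep - 7.0) ** 2 + rng.normal(0, 1.0, sleep.size)

# Quadratic fit: score ~ a*sleep^2 + b*sleep + c.
a, b, c = np.polyfit(sleep, score, deg=2)

# A negative quadratic term means an inverted U for the score --
# equivalently a U-shape for impairment -- and the vertex -b/(2a)
# estimates the optimal sleep duration.
optimum = -b / (2 * a)
print(f"quadratic term: {a:.2f}")
print(f"estimated optimum: {optimum:.2f} h")
```

With enough data the fitted vertex recovers the simulated peak closely; in the real study the analogous optimum fell at approximately 7 hours.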
Specific cognitive functions including pair matching, trail making, prospective memory, and reaction time were significantly impaired with too much or too little sleep, the researchers said. “This demonstrated the positive association of both insufficient and excessive sleep duration with inferior performance on cognitive tasks.”
When the researchers analyzed the association between sleep duration and mental health, sleep duration also showed a U-shaped association with symptoms of anxiety, depression, mental distress, mania, and self-harm, while well-being showed an inverted U-shape. All associations between sleep duration and mental health were statistically significant after controlling for confounding variables (P < .001).
On further analysis (using two-line tests), the researchers determined that consistent sleep duration of approximately 7 hours per night was optimal for cognitive performance and for good mental health.
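A two-line test, as mentioned above, checks that an apparent U-shape is genuine rather than a merely flattening trend: fit separate straight lines on each side of a candidate breakpoint and require the slopes to have opposite signs. The sketch below is illustrative only, on simulated data with a hypothetical breakpoint at the article's reported 7-hour optimum.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical U-shaped outcome: symptom burden lowest near 7 h of sleep.
sleep = rng.uniform(4, 10, 2000)
symptoms = (sleep - 7.0) ** 2 + rng.normal(0, 1.0, sleep.size)

breakpoint_hours = 7.0  # candidate optimum to test

# Fit one straight line on each side of the breakpoint.
below = sleep < breakpoint_hours
slope_lo, _ = np.polyfit(sleep[below], symptoms[below], deg=1)
slope_hi, _ = np.polyfit(sleep[~below], symptoms[~below], deg=1)

# A negative slope below the breakpoint and a positive slope above it
# (both significant, in a full analysis) support a true U-shape.
print(f"slope below {breakpoint_hours} h: {slope_lo:.2f}")
print(f"slope above {breakpoint_hours} h: {slope_hi:.2f}")
```

A full two-line test would also report significance for each slope; this sketch shows only the sign check at the heart of the method.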
The researchers also used neuroimaging data to examine the relationship between sleep duration and brain structure. Overall, greater changes were seen in the regions of the brain involved in cognitive processing and memory.
“The most significant cortical volumes nonlinearly associated with sleep duration included the precentral cortex, the superior frontal gyrus, the lateral orbitofrontal cortex, the pars orbitalis, the frontal pole, and the middle temporal cortex,” the researchers wrote (P < .05 for all).
The association between sleep duration and cognitive function diminished among individuals older than 65 years, compared with those aged approximately 40 years, which suggests that optimal sleep duration may be more beneficial in middle age, the researchers noted. However, no similar impact of age was seen for mental health. For brain structure, the nonlinear relationship between sleep duration and cortical volumes was greatest in those aged 44-59 years, and gradually flattened with older age.
Research supports sleep discussions with patients
“Primary care physicians can use this study in their discussions with middle-aged and older patients to recommend optimal sleep duration and measures to achieve this sleep target,” Noel Deep, MD, a general internist in group practice in Antigo, Wisc., who was not involved in the study, said in an interview.
“This study is important because it demonstrated that both inadequate and excessive sleep patterns were associated with cognitive and mental health changes,” said Dr. Deep. “It supported previous observations of cognitive decline and mental health disorders being linked to disturbed sleep. But this study was unique because it provides data supporting an optimal sleep duration of 7 hours and the ill effects of both insufficient and excessive sleep duration.
“The usual thought process has been to assume that older individuals may not require as much sleep as the younger individuals, but this study supports an optimal time duration of sleep of 7 hours that benefits the older individuals. It was also interesting to note the mental health effects caused by the inadequate and excessive sleep durations,” he added.
As for additional research, “I would like to look into the quality of the sleep, in addition to the duration of sleep,” said Dr. Deep. For example, future studies could examine whether excessive sleep reflects poor-quality or fragmented sleep that leads to structural changes and subsequent cognitive decline.
Study limitations
“The current study relied on self-reporting of the sleep duration and was not observed and recorded data,” Dr. Deep noted. “It would also be beneficial to not only rely on healthy volunteers reporting the sleep duration, but also obtain sleep data from individuals with known brain disorders.”
The study findings were limited by several other factors, including the use of total sleep duration only, without other measures of sleep hygiene, the researchers noted. More research is needed to investigate the mechanisms driving the association between too much and not enough sleep and poor mental health and cognitive function.
The study was supported by the National Key R&D Program of China, the Shanghai Municipal Science and Technology Major Project, the Shanghai Center for Brain Science and Brain-Inspired Technology, the 111 Project, the National Natural Sciences Foundation of China and the Shanghai Rising Star Program.
The researchers had no financial conflicts to disclose. Dr. Deep had no financial conflicts to disclose, but serves on the editorial advisory board of Internal Medicine News.
FROM NATURE AGING
Severe COVID-19 adds 20 years of cognitive aging: Study
Severe COVID-19 may cause cognitive impairment comparable to 20 years of aging, researchers reported, adding that the impairment is “equivalent to losing 10 IQ points.”
In their study, published in eClinicalMedicine, a team of scientists from the University of Cambridge and Imperial College London said there is growing evidence that COVID-19 can cause lasting cognitive and mental health problems. Patients report fatigue, “brain fog,” problems recalling words, sleep disturbances, anxiety, and even posttraumatic stress disorder months after infection.
The researchers analyzed data from 46 individuals who received critical care for COVID-19 at Addenbrooke’s Hospital between March and July 2020 (27 females, 19 males, mean age 51 years, 16 of whom had mechanical ventilation) and were recruited to the NIHR COVID-19 BioResource project.
At an average of 6 months after acute COVID-19 illness, the study participants underwent detailed computerized cognitive tests via the Cognitron platform, comprising eight tasks deployed on an iPad measuring mental function such as memory, attention, and reasoning. Also assessed were anxiety, depression, and posttraumatic stress disorder via standard mood, anxiety, and posttraumatic stress scales – specifically the Generalized Anxiety Disorder 7 (GAD-7), the Patient Health Questionnaire 9 (PHQ-9), and the PTSD Checklist for Diagnostic and Statistical Manual of Mental Disorders 5 (PCL-5). Their data were compared against 460 controls – matched for age, sex, education, and first language – and the pattern of deficits across tasks was qualitatively compared with normal age-related decline and early-stage dementia.
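The core of the comparison described above is expressing each patient's task performance relative to demographically matched controls. A minimal sketch of that kind of scoring on simulated data follows; the group sizes match the article (46 patients, 460 controls), but the score distributions are entirely hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical task scores: 460 matched controls and 46 patients,
# with an assumed deficit in the patient group.
controls = rng.normal(100, 15, size=460)
patients = rng.normal(90, 15, size=46)

# Express each patient's score as a deviation from the matched-control
# mean, in units of the control standard deviation (a z-score).
mu = controls.mean()
sd = controls.std(ddof=1)
z = (patients - mu) / sd
print(f"mean patient deficit: {z.mean():.2f} SD below controls")
```

Scoring every task this way yields a profile of deficits whose pattern can then be compared qualitatively against normal aging or early-stage dementia, as the authors did.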
Less accurate and slower response times
The authors highlighted how this was the first time a “rigorous assessment and comparison” had been carried out in relation to the after-effects of severe COVID-19.
“Cognitive impairment is common to a wide range of neurological disorders, including dementia, and even routine aging, but the patterns we saw – the cognitive ‘fingerprint’ of COVID-19 – was distinct from all of these,” said David Menon, MD, division of anesthesia at the University of Cambridge, England, and the study’s senior author.
The scientists found that COVID-19 survivors were less accurate and had slower response times than the control population, and added that survivors scored particularly poorly on verbal analogical reasoning and showed slower processing speeds.
Critically, the scale of the cognitive deficits correlated with acute illness severity, but not fatigue or mental health status at the time of cognitive assessment, said the authors.
Recovery ‘at best gradual’
The effects were strongest for those with more severe acute illness, and who required mechanical ventilation, said the authors, who found that acute illness severity was “better at predicting the cognitive deficits.”
The authors pointed out how these deficits were still detectable when patients were followed up 6 months later, and that, although patients’ scores and reaction times began to improve over time, any recovery was “at best gradual” and likely to be influenced by factors such as illness severity and its neurological or psychological impacts.
“We followed some patients up as late as 10 months after their acute infection, so were able to see a very slow improvement,” Dr. Menon said. He explained how, while this improvement was not statistically significant, it was “at least heading in the right direction.”
However, he warned it is very possible that some of these individuals “will never fully recover.”
The cognitive deficits observed may be due to several factors in combination, said the authors, including inadequate oxygen or blood supply to the brain, blockage of large or small blood vessels due to clotting, and microscopic bleeds. They highlighted how the most important mechanism, however, may be “damage caused by the body’s own inflammatory response and immune system.”
Adam Hampshire, PhD, of the department of brain sciences at Imperial College London, one of the study’s authors, noted that around 40,000 people have been through intensive care with COVID-19 in England alone, and that many more who were very sick were never admitted to hospital. This means there is a “large number of people out there still experiencing problems with cognition many months later,” he said. “We urgently need to look at what can be done to help these people.”
A version of this article first appeared on Univadis.
FROM ECLINICAL MEDICINE
Cutting dementia risk in AFib: Does rhythm control strategy matter?
The risk for dementia goes up in patients with atrial fibrillation (AFib), but some evidence suggests that risk can be blunted with therapies that restore sinus rhythm. However, a new cohort study suggests that the treatment effect’s magnitude might depend on the rhythm control strategy. It hinted that AFib catheter ablation might be more effective than pharmacologic rhythm control alone at cutting the risk for dementia.
The case-matched study of more than 38,000 adults with AFib saw a 41% reduction (P < .0001) in risk for dementia among those who underwent catheter ablation after attempted rhythm control with antiarrhythmic drugs (AAD), compared with those managed with pharmacologic rhythm control therapy alone.
The observational study, which spans 20 years of data, comes with big limitations and can’t say for sure whether catheter ablation is better than AAD-only at cutting the dementia risk in AFib. But it and other evidence support the idea, which has yet to be explored in a randomized fashion.
In a secondary finding, the analysis showed a similar reduction in dementia risk from catheter ablation, compared with AAD, in women and in men by 40% and 45%, respectively (P < .0001 for both). The findings are particularly relevant “given the higher life-long risk of dementia among women and the lower likelihood that women will be offered ablation, which has been demonstrated repeatedly,” Emily P. Zeitler, MD, MHS, Dartmouth-Hitchcock Medical Center, Lebanon, New Hampshire, told this news organization. “I think this is another reason to try to be more generous in offering ablation to women.”
Management of AFib certainly evolved in important ways from 2000 to 2021, the period covered by the study. But a sensitivity analysis based on data from 2010 to 2021 showed “no meaningful differences” in the results, said Dr. Zeitler, who is slated to present the findings April 30 at the Heart Rhythm Society 2022 Scientific Sessions, conducted virtually and live in San Francisco.
Dr. Zeitler acknowledged that the observational study, even with its propensity-matched ablation and AAD cohorts, can only hint at a preference for ablation over AAD for lowering risk for AFib-associated dementia. “We know there’s unmeasured and unfixable confounding between those two groups, so we see this really as hypothesis-generating.”
It was “a well-done analysis,” and the conclusion that the dementia risk was lower with catheter ablation is “absolutely correct,” but only as far as the study and its limitations allow, agreed David Conen, MD, MPH, McMaster University, Hamilton, Ontario, who is not a coauthor.
“Even with propensity matching, you can get rid of some sorts of confounding, but you can never get rid of all selection bias issues.” Eliminating those, he said in an interview, takes randomized trials.
Dr. Conen, who is studying cognitive decline in AFib as a principal investigator of the SWISS-AF trial, pointed to a secondary finding of the analysis as evidence for such confounding. He said the ablation group’s nearly 50% drop (P < .0001) in competing risk for death, compared with patients managed with AAD, isn’t plausible.
The finding “strongly suggests these people were healthier and that there’s some sort of selection bias. They were at lower risk of death, they were at lower risk of dementia, and they were probably also at lower risk of stroke, myocardial infarction, thrombosis, and cancer because they were just probably a little healthier than the others,” Dr. Conen said. The ablation and AAD groups “were two very different populations from the get-go.”
The analysis was based on U.S. insurance and Medicare claims data from AFib patients who either underwent catheter ablation after at least one AAD trial or filled prescriptions for at least two different antiarrhythmic agents in the year after AFib diagnosis. Patients with history of dementia, catheter or surgical AFib ablation, or a valve procedure were excluded.
The ablation and AAD-only groups each consisted of 19,066 patients after propensity matching, and the groups were balanced with respect to age, sex, type of insurance, CHA2DS2-VASc scores, and use of renin-angiotensin-system inhibitors, oral anticoagulants, and antiplatelets.
The overall risk for dementia was 1.9% for the ablation group and 3.3% for AAD-only patients (hazard ratio, 0.59; 95% confidence interval, 0.52-0.67). Corresponding HRs by sex were 0.55 (95% CI, 0.46-0.66) for men and 0.60 (95% CI, 0.50-0.72) for women.
The competing risk for death was also significantly decreased in the ablation group (HR, 0.51; 95% CI, 0.46-0.55).
Dr. Zeitler pointed to a randomized trial now in the early stages called Neurocognition and Greater Maintenance of Sinus Rhythm in Atrial Fibrillation, or NOGGIN-AF, which will explore relationships between rhythm control therapy and dementia in patients with AFib, whether catheter ablation or AAD can mitigate that risk, and whether either strategy works better than the other, among other goals.
“I’m optimistic,” she said, “and I think it’s going to add to the growing motivations to get patients ablated more quickly and more broadly.”
The analysis was funded by Biosense-Webster. Dr. Zeitler discloses consulting for Biosense-Webster and Arena Pharmaceuticals (now Pfizer); fees for speaking from Medtronic; and receiving research support from Boston Scientific, Sanofi, and Biosense-Webster. Dr. Conen has previously reported receiving speaker fees from Servier Canada.
A version of this article first appeared on Medscape.com.
HEART RHYTHM 2022
Traumatic brain injury linked to ‘striking’ risk for CVD, diabetes, brain disorders
Mild traumatic brain injury (TBI) is linked to a significantly increased risk for a host of subsequent cardiovascular, endocrine, neurologic, and psychiatric disorders, new research shows.
Incidence of hypertension, coronary heart disease, diabetes, stroke, depression, and dementia all began to increase soon after the brain injury and persisted over a decade in both mild and moderate to severe TBI.
Researchers found the multisystem comorbidities in all age groups, including in patients as young as 18. They also found that patients who developed multiple postinjury problems had higher mortality during the decade-long follow-up.
The findings suggest patients with TBI may require longer follow-up and proactive screening for multisystem disease, regardless of age or injury severity.
“The fact that both patients with mild and moderate to severe injuries both had long-term ongoing associations with comorbidities that continued over time and that they are cardiovascular, endocrine, neurologic, and behavioral health oriented was pretty striking,” study author Ross Zafonte, DO, PhD, president of Spaulding Rehab Hospital and professor and chair of physical medicine and rehab at Harvard Medical School, both in Boston, told this news organization.
The study was published online in JAMA Network Open.
Injury severity not a factor
An estimated 2.8 million individuals in the United States experience TBI every year. Worldwide, the figure may be as high as 74 million.
Studies have long suggested a link between brain injury and subsequent neurologic disorders, but research suggesting a possible link to cardiovascular and endocrine problems has recently gained attention.
Building on a 2021 study that showed increased incidence of cardiovascular issues following a concussion, the researchers examined medical records of previously healthy patients treated for TBI between 2000 and 2015 who also had at least one follow-up visit between 6 months and 10 years after the initial injury.
Researchers analyzed data from 13,053 individuals – 4,351 with mild injury (mTBI), 4,351 with moderate to severe injury (msTBI), and 4,351 with no TBI. The most common cause of injury was a fall. Patients with sports-related injuries were excluded.
Incidence of hypertension was significantly higher among patients with mTBI (hazard ratio, 2.5; 95% confidence interval, 2.1-2.9) and msTBI (HR, 2.4; 95% CI, 2.0-2.9), compared with the unaffected group. Risk for other cardiovascular problems, including hyperlipidemia, obesity, and coronary artery disease, were also higher in the affected groups.
TBI patients also reported higher incidence of endocrine diseases, including diabetes (mTBI: HR, 1.9; 95% CI, 1.4-2.7; msTBI: HR, 1.9; 95% CI, 1.4-2.6). Elevated risk for ischemic stroke or transient ischemic attack was also increased (mTBI: HR, 2.2; 95% CI, 1.4-3.3; msTBI: HR, 3.6; 95% CI, 2.4-5.3).
Regardless of injury severity, patients with TBI had a higher risk for neurologic and psychiatric diseases, particularly depression, dementia, and psychotic disorders. “This tells us that mild TBI is not clean of events,” Dr. Zafonte said.
Surprising rate of comorbidity in youth
Investigators found increased risk for posttrauma comorbidities in all age groups, but researchers were struck by the high rates in younger patients, aged 18-40. Compared with age-matched individuals with no TBI history, hypertension risk was nearly six times higher in those with mTBI (HR, 5.9; 95% CI, 3.9-9.1) and nearly four times higher in patients with msTBI (HR, 3.9; 95% CI, 2.5-6.1).
Rates of hyperlipidemia and diabetes were also higher in younger patients in the mTBI group, and posttraumatic seizures and psychiatric disorders were elevated regardless of TBI severity.
Overall, patients with msTBI, but not those with mTBI, were at higher risk for mortality, compared with the unexposed group (432 deaths [9.9%] vs. 250 deaths [5.7%]; P < .001).
“It’s clear that what we may be dealing with is that it holds up even for the younger people,” Dr. Zafonte said. “We used to think brain injury risk is worse in the severe cases, which it is, and it’s worse later on among those who are older, which it is. But our younger folks don’t get away either.”
While the study offers associations between TBI and multisystem health problems, Dr. Zafonte said it’s impossible to say at this point whether the brain injury caused the increased risk for cardiovascular or endocrine problems. Other organ injuries sustained in the trauma may be a contributing factor.
“Further data is needed to elucidate the mechanism and the causative relationships, which we do not have here,” he said.
Many of the postinjury comorbidities emerged a median of 3.5 years after TBI, regardless of severity. But some of the cardiovascular and psychiatric conditions emerged far sooner than that.
That’s important because research suggests fewer than half of patients with TBI receive follow-up care.
“It does make sense for folks who are interacting with people who’ve had a TBI to be suspicious of medical comorbidities relatively early on, within the first couple of years,” Dr. Zafonte said.
In an invited commentary, Vijay Krishnamoorthy, MD, MPH, PhD, Duke University, Durham, N.C., and Monica S. Vavilala, MD, University of Washington, Seattle, highlight some of the study’s limitations, including a lack of information on comorbidity severity and the lack of a matched group of patients who experienced non-head trauma.
Despite those limitations, the study offers important information on how TBI may affect organs beyond the brain, they noted.
“These observations, if replicated in future studies, raise intriguing implications in the future care of patients with TBI, including heightened chronic disease-screening measures and possibly enhanced guidelines for chronic extracranial organ system care for patients who experience TBI,” Dr. Krishnamoorthy and Dr. Vavilala wrote.
The study received no specific funding. Dr. Zafonte reported receiving personal fees from Springer/Demos, serving on scientific advisory boards for Myomo and OnCare, and receiving funding from the Football Players Health Study at Harvard, which is funded in part by the National Football League Players Association. Dr. Krishnamoorthy and Dr. Vavilala disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM JAMA NETWORK OPEN
New blood biomarker to detect early dementia?
Investigators found that plasma concentrations of 2-aminoethyl dihydrogen phosphate and taurine could distinguish adults with early-stage Alzheimer’s disease from cognitively normal adults.
“Our biomarker for early-stage Alzheimer’s disease represents new thinking and is unique from the amyloid-beta and p-tau molecules that are currently being investigated to diagnose AD,” Sandra Banack, PhD, senior scientist, Brain Chemistry Labs, Jackson, Wyoming, told this news organization.
If further studies pan out, Dr. Banack said this biomarker could “easily be transformed into a test to aid clinical evaluations for Alzheimer’s disease.”
The study was published online in PLOS ONE.
New drug target?
The researchers measured concentrations of 2-aminoethyl dihydrogen phosphate and taurine in blood plasma samples in 25 patients (21 men; mean age, 71) with a clinical diagnosis of early-stage Alzheimer’s based on a Clinical Dementia Rating (CDR) score of 0.5, suggesting very mild cognitive impairment, and 25 healthy controls (20 men; mean age, 39).
The concentration of 2-aminoethyl dihydrogen phosphate, normalized by the concentration of taurine, reliably distinguished blood samples of early-stage Alzheimer’s patients from controls in a blinded analysis.
This biomarker “could lead to new understanding of [AD] and lead to new drug candidates,” Dr. Banack told this news organization.
The researchers note that 2-aminoethyl dihydrogen phosphate plays an important role in the structure and function of cellular membranes.
Physiologic effects of increased 2-aminoethyl dihydrogen phosphate concentrations in the blood are not known. However, in one study, concentrations of this molecule were found to be significantly lower in the temporal cortex, frontal cortex, and hippocampus (40%) in patients with Alzheimer’s disease, compared with controls.
“New biomarkers take time before they can be implemented in the clinic. The next step will be to repeat the experiments using a large sample size of AD patient blood samples,” Dr. Banack told this news organization.
The study team is looking to source a larger sample size of AD blood samples to replicate these findings. They are also examining this biomarker relative to other neurodegenerative diseases.
“If verified with larger sample sizes, the quantification of 2-aminoethyl dihydrogen phosphate could potentially assist in the diagnosis of early-stage Alzheimer’s disease when used in conjunction with the patient’s CDR score and other potential AD biomarkers,” Dr. Banack and colleagues say.
Caveats, cautionary notes
Commenting on the findings, Rebecca M. Edelmayer, PhD, Alzheimer’s Association senior director of scientific engagement, said the study is “interesting, though very small-scale and very preliminary.”
Dr. Edelmayer said one “major limitation” is that participants did not have their Alzheimer’s diagnosis confirmed with “gold standard biomarkers. They have been diagnosed based only on their cognitive and behavioral symptoms.”
She also cautioned that the study population is not representative – either of the general public or people living with Alzheimer’s disease.
For example, 41 of the 50 samples are from men, “though we know women are disproportionately impacted by Alzheimer’s.”
“There is a mismatch in the age of the study groups,” Dr. Edelmayer noted. The mean age of controls in the study was 39 years, and the mean age of people with dementia was 71 years. Information on race, ethnicity, and other demographic characteristics is also unclear from the article.
“There is an urgent need for simple, inexpensive, noninvasive and easily available diagnostic tools for Alzheimer’s, such as a blood test. A simple blood test for Alzheimer’s would be a great advance for individuals with – and at risk for – the disease, families, doctors, and researchers,” Dr. Edelmayer said.
“Bottom line,” Dr. Edelmayer continued, “these results need to be further tested and verified in long-term, large-scale studies with diverse populations that are representative of those living with Alzheimer’s disease.”
This research was supported by the William Stamps Farish Fund and the Josephine P. & John J. Louis Foundation. Brain Chemistry Labs has applied for a patent related to this research. Dr. Edelmayer has no relevant disclosures.
A version of this article first appeared on Medscape.com.
Investigators found that plasma concentrations of 2-aminoethyl dihydrogen phosphate and taurine could distinguish adults with early-stage Alzheimer’s disease from cognitively normal adults.
“Our biomarker for early-stage Alzheimer’s disease represents new thinking and is unique from the amyloid-beta and p-tau molecules that are currently being investigated to diagnose AD,” Sandra Banack, PhD, senior scientist, Brain Chemistry Labs, Jackson, Wyoming, told this news organization.
If further studies pan out, Dr. Banack said this biomarker could “easily be transformed into a test to aid clinical evaluations for Alzheimer’s disease.”
The study was published online in PLOS ONE.
New drug target?
The researchers measured concentrations of 2-aminoethyl dihydrogen phosphate and taurine in blood plasma samples in 25 patients (21 men; mean age, 71) with a clinical diagnosis of early-stage Alzheimer’s based on a Clinical Dementia Rating (CDR) score of 0.5, suggesting very mild cognitive impairment, and 25 healthy controls (20 men; mean age, 39).
The concentration of 2-aminoethyl dihydrogen phosphate, normalized by the concentration of taurine, reliably distinguished blood samples of early-stage Alzheimer’s patients from controls in a blinded analysis.
This biomarker “could lead to new understanding of [AD] and lead to new drug candidates,” Dr. Banack told this news organization.
The researchers note that 2-aminoethyl dihydrogen phosphate plays an important role in the structure and function of cellular membranes.
Physiologic effects of increased 2-aminoethyl dihydrogen phosphate concentrations in the blood are not known. However, in one study, concentrations of this molecule were found to be significantly lower in the temporal cortex, frontal cortex, and hippocampus (40%) in patients with Alzheimer’s disease, compared with controls.
“New biomarkers take time before they can be implemented in the clinic. The next step will be to repeat the experiments using a large sample size of AD patient blood samples,” Dr. Banack told this news organization.
The study team is looking to source a larger sample size of AD blood samples to replicate these findings. They are also examining this biomarker relative to other neurodegenerative diseases.
“If verified with larger sample sizes, the quantification of 2-aminoethyl dihydrogen phosphate could potentially assist in the diagnosis of early-stage Alzheimer’s disease when used in conjunction with the patient’s CDR score and other potential AD biomarkers,” Dr. Banack and colleagues say.
Caveats, cautionary notes
Commenting on the findings, Rebecca M. Edelmayer, PhD, Alzheimer’s Association senior director of scientific engagement, said the study is “interesting, though very small-scale and very preliminary.”
Dr. Edelmayer said one “major limitation” is that participants did not have their Alzheimer’s diagnosis confirmed with “gold standard biomarkers. They have been diagnosed based only on their cognitive and behavioral symptoms.”
She also cautioned that the study population is not representative – either of the general public or people living with Alzheimer’s disease.
For example, 41 out of all 50 samples are from men, “though we know women are disproportionately impacted by Alzheimer’s.”
“There is a mismatch in the age of the study groups,” Dr. Edelmayer noted. The mean age of controls in the study was 39 and the mean age of people with dementia was 71. Race or ethnicity and other demographic information is also unclear from the article.
“There is an urgent need for simple, inexpensive, noninvasive and easily available diagnostic tools for Alzheimer’s, such as a blood test. A simple blood test for Alzheimer’s would be a great advance for individuals with – and at risk for – the disease, families, doctors, and researchers,” Dr. Edelmayer said.
“Bottom line,” Dr. Edelmayer continued, “these results need to be further tested and verified in long-term, large-scale studies with diverse populations that are representative of those living with Alzheimer’s disease.”
This research was supported by the William Stamps Farish Fund and the Josephine P. & John J. Louis Foundation. Brain Chemistry Labs has applied for a patent related to this research. Dr. Edelmayer has no relevant disclosures.
A version of this article first appeared on Medscape.com.
Investigators found that plasma concentrations of 2-aminoethyl dihydrogen phosphate and taurine could distinguish adults with early-stage Alzheimer’s disease from cognitively normal adults.
“Our biomarker for early-stage Alzheimer’s disease represents new thinking and is unique from the amyloid-beta and p-tau molecules that are currently being investigated to diagnose AD,” Sandra Banack, PhD, senior scientist, Brain Chemistry Labs, Jackson, Wyoming, told this news organization.
If further studies pan out, Dr. Banack said this biomarker could “easily be transformed into a test to aid clinical evaluations for Alzheimer’s disease.”
The study was published online in PLOS ONE.
New drug target?
The researchers measured concentrations of 2-aminoethyl dihydrogen phosphate and taurine in blood plasma samples in 25 patients (21 men; mean age, 71) with a clinical diagnosis of early-stage Alzheimer’s based on a Clinical Dementia Rating (CDR) score of 0.5, suggesting very mild cognitive impairment, and 25 healthy controls (20 men; mean age, 39).
The concentration of 2-aminoethyl dihydrogen phosphate, normalized by the concentration of taurine, reliably distinguished blood samples of early-stage Alzheimer’s patients from controls in a blinded analysis.
This biomarker “could lead to new understanding of [AD] and lead to new drug candidates,” Dr. Banack told this news organization.
The researchers note that 2-aminoethyl dihydrogen phosphate plays an important role in the structure and function of cellular membranes.
Physiologic effects of increased 2-aminoethyl dihydrogen phosphate concentrations in the blood are not known. However, in one study, concentrations of this molecule were found to be significantly lower in the temporal cortex, frontal cortex, and hippocampus (40%) in patients with Alzheimer’s disease, compared with controls.
“New biomarkers take time before they can be implemented in the clinic. The next step will be to repeat the experiments using a large sample size of AD patient blood samples,” Dr. Banack told this news organization.
The study team is looking to source a larger sample size of AD blood samples to replicate these findings. They are also examining this biomarker relative to other neurodegenerative diseases.
“If verified with larger sample sizes, the quantification of 2-aminoethyl dihydrogen phosphate could potentially assist in the diagnosis of early-stage Alzheimer’s disease when used in conjunction with the patient’s CDR score and other potential AD biomarkers,” Dr. Banack and colleagues say.
Caveats, cautionary notes
Commenting on the findings, Rebecca M. Edelmayer, PhD, Alzheimer’s Association senior director of scientific engagement, said the study is “interesting, though very small-scale and very preliminary.”
Dr. Edelmayer said one “major limitation” is that participants did not have their Alzheimer’s diagnosis confirmed with “gold standard biomarkers. They have been diagnosed based only on their cognitive and behavioral symptoms.”
She also cautioned that the study population is not representative – either of the general public or people living with Alzheimer’s disease.
For example, 41 of the 50 samples are from men, “though we know women are disproportionately impacted by Alzheimer’s.”
“There is a mismatch in the age of the study groups,” Dr. Edelmayer noted. The mean age of controls in the study was 39 and the mean age of people with dementia was 71. Race or ethnicity and other demographic information is also unclear from the article.
“There is an urgent need for simple, inexpensive, noninvasive and easily available diagnostic tools for Alzheimer’s, such as a blood test. A simple blood test for Alzheimer’s would be a great advance for individuals with – and at risk for – the disease, families, doctors, and researchers,” Dr. Edelmayer said.
“Bottom line,” Dr. Edelmayer continued, “these results need to be further tested and verified in long-term, large-scale studies with diverse populations that are representative of those living with Alzheimer’s disease.”
This research was supported by the William Stamps Farish Fund and the Josephine P. & John J. Louis Foundation. Brain Chemistry Labs has applied for a patent related to this research. Dr. Edelmayer has no relevant disclosures.
A version of this article first appeared on Medscape.com.
Impaired vision an overlooked dementia risk factor
Investigators analyzed estimated population attributable fractions (PAFs) associated with dementia in more than 16,000 older adults. A PAF represents the proportion of dementia cases that could be prevented if a given risk factor were eliminated.
Results showed the PAF of vision impairment was 1.8%, suggesting that healthy vision had the potential to prevent more than 100,000 cases of dementia in the United States.
“Vision impairment and blindness disproportionately impact older adults, yet vision impairment is often preventable or even correctable,” study investigator Joshua Ehrlich MD, assistant professor of ophthalmology and visual sciences, University of Michigan, Ann Arbor, said in an interview.
Poor vision affects not only how individuals see the world, but also their systemic health and well-being, Dr. Ehrlich said.
“Accordingly, ensuring that older adults receive appropriate eye care is vital to promoting health, independence, and optimal aging,” he added.
The findings were published online in JAMA Neurology.
A surprising omission
There is an “urgent need to identify modifiable risk factors for dementia that can be targeted with interventions to slow cognitive decline and prevent dementia,” the investigators wrote.
In 2020, the Lancet Commission report on dementia prevention, intervention, and care proposed a life-course model of 12 potentially modifiable dementia risk factors. This included lower educational level, hearing loss, traumatic brain injury, hypertension, excessive alcohol consumption, obesity, smoking, depression, social isolation, physical inactivity, diabetes, and air pollution.
Together, these factors are associated with about 40% of dementia cases worldwide, the report notes.
Vision impairment was not included in this model, “despite considerable evidence that it is associated with an elevated risk of incident dementia and that it may operate through the same pathways as hearing loss,” the current researchers wrote.
“We have known for some time that vision impairment is a risk factor for dementia [and] we also know that a very large fraction of vision impairment, possibly in excess of 80%, is avoidable or has simply yet to be addressed,” Dr. Ehrlich said.
He and his colleagues found it “surprising that vision impairment had been ignored in key models of modifiable dementia risk factors that are used to shape health policy and resource allocation.” They set out to demonstrate that, “in fact, vision impairment is just as influential as a number of other long accepted modifiable dementia risk factors.”
The investigators assessed data from the Health and Retirement Study (HRS), a panel study that surveys more than 20,000 U.S. adults aged 50 years or older every 2 years.
The investigators applied the same methods used by the Lancet Commission to the HRS dataset and added vision impairment to the Lancet life-course model. Air pollution was excluded in their model “because those data were not readily available in the HRS,” the researchers wrote.
They noted the PAF is “based on the population prevalence and relative risk of dementia for each risk factor” and is “weighted, based on a principal components analysis, to account for communality (clustering of risk factors).”
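The weighted, communality-adjusted approach the Lancet Commission uses is more involved, but the core unadjusted calculation rests on Levin's classic formula, which combines exactly the two inputs named above: population prevalence and relative risk. A minimal sketch, using purely illustrative prevalence and relative-risk values that are not figures from the study:

```python
def paf(prevalence: float, relative_risk: float) -> float:
    """Unadjusted population attributable fraction (Levin's formula):
    PAF = P * (RR - 1) / (1 + P * (RR - 1)),
    where P is the prevalence of the risk factor in the population
    and RR is the relative risk of disease given exposure."""
    excess = prevalence * (relative_risk - 1)
    return excess / (1 + excess)

# Illustrative numbers only: if 10% of older adults had vision impairment
# and it carried a relative risk of 1.2 for dementia, the unadjusted PAF
# would be about 2% of dementia cases.
print(f"{paf(0.10, 1.2):.1%}")
```

Multiplying a PAF by the total case count gives the potentially preventable cases: for example, a 1.8% PAF applied to an assumed 6 million U.S. dementia cases yields roughly 100,000, the order of magnitude reported above.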
A missed prevention opportunity
The sample included 16,690 participants (54% were women, 51.5% were at least age 65, 80.2% were White, 10.6% were Black, 9.2% were other).
In total, the 12 potentially modifiable risk factors used in the researchers’ model were associated with an estimated 62.4% of dementia cases in the United States, with hypertension as the most prevalent risk factor with the highest weighted PAF.
A new focus for prevention
Commenting for this article, Suzann Pershing, MD, associate professor of ophthalmology, Stanford (Calif.) University, called the study “particularly important because, despite growing recognition of its importance in relation to cognition, visual impairment is often an underrecognized risk factor.”
The current research “builds on increasingly robust medical literature linking visual impairment and dementia, applying analogous methods to those used for the life course model recently presented by the Lancet Commission to evaluate potentially modifiable dementia risk factors,” said Dr. Pershing, who was not involved with the study.
The investigators “make a compelling argument for inclusion of visual impairment as one of the potentially modifiable risk factors; practicing clinicians and health care systems may consider screening and targeted therapies to address visual impairment, with a goal of population health and contributing to a reduction in future dementia disease burden,” she added.
In an accompanying editorial, Jennifer Deal, PhD, department of epidemiology and Cochlear Center for Hearing and Public Health, Baltimore, and Julio Rojas, MD, PhD, Memory and Aging Center, department of neurology, Weill Institute for Neurosciences, University of California, San Francisco, call the findings “an important reminder that dementia is a social problem in which potentially treatable risk factors, including visual impairment, are highly prevalent in disadvantaged populations.”
The editorialists noted that 90% of cases of vision impairment are “preventable or have yet to be treated.” The two “highly cost-effective interventions” of eyeglasses and cataract surgery “remain underused both in the U.S. and globally, especially in disadvantaged communities,” they wrote.
They added that more research is needed to “test the effectiveness of interventions to preserve cognitive health by promoting healthy vision.”
The study was supported by grants from the National Institute on Aging, the National Institutes of Health, and Research to Prevent Blindness. The investigators reported no relevant financial relationships. Dr. Deal reported having received grants from the National Institute on Aging. Dr. Rojas reported serving as site principal investigator on clinical trials for Eli Lilly and Eisai and receiving grants from the National Institute on Aging. Dr. Pershing is a consultant for Acumen and Verana Health (as DigiSight Technologies).
A version of this article first appeared on Medscape.com.
Virtual reality an ‘exciting opportunity’ for geriatric psychiatry
Researchers are increasingly turning their attention to virtual reality (VR) for the treatment of psychiatric disorders in older adults.
Recent studies have highlighted the usefulness of VR in treating depression and loneliness in older patients who may be socially isolated because of their age, comorbidities, or the COVID-19 pandemic.
“The unique capability of virtual reality to create an immersive and engaging setting is an exciting opportunity for geriatric psychiatry,” Harmehr Sekhon, PhD, postdoctoral research fellow, Lady Davis Institute/Jewish General Hospital, McGill University, Montreal, and McLean Hospital, Harvard Medical School, Boston, told this news organization.
One novel approach involves using VR to administer a mindfulness intervention in older adults. Dr. Sekhon shared information on her own mindfulness study and on other developments in VR and telemedicine at the American Association for Geriatric Psychiatry annual meeting.
Potential bridging tool
As the population ages, the prevalence of mental health disorders increases. Telemedicine has proved to be a potential “bridge” to address the health care needs of older adults, Dr. Sekhon noted.
She cited her systematic review of telemedicine for older adults with dementia during COVID-19. Results showed that telemedicine was a “beneficial approach” to assisting these individuals and that it increased accessibility, said Dr. Sekhon.
In addition, a survey published last year showed that 87% of Americans in general want to continue using telehealth services after the pandemic. Most respondents agreed that telehealth had made it easier to get the care they needed. They also reported having received the same level of care via telehealth as with in-person care.
A growing body of research shows that VR has “positive influences on mood and well-being, cognition, pain management, [and] treatment of phobias in younger adults,” Dr. Sekhon said. She added that there is evidence that VR is feasible for older adults, with applications in cognitive disorders.
She cited a recent systematic review of 55 studies that assessed the impact of different types of VR on mental health in older adults. The results showed that VR could be helpful in screening for cognitive impairment – and was comparable to some paper-based assessments. It was also useful as a training tool for those with cognitive impairment.
Examples of VR interventions that can be used to treat cognitive impairment include “virtual cities, kitchens, supermarkets,” Dr. Sekhon noted.
The technology is increasingly being used as a tool to deliver psychotherapy, in which patient engagement is “a key determinant” of outcomes, she added. “Virtual reality is a cutting-edge, engaging, and immersive technique to administer psychotherapy,” she said.
Such VR approaches are proving successful in older patients. Dr. Sekhon highlighted the case of an 85-year-old woman who engaged in ten sessions of psychodynamic psychotherapy that targeted persistent dysthymia and negativistic mood. The case was part of a proof-of-concept study published in the May issue of the American Journal of Geriatric Psychiatry.
Dr. Sekhon noted the intervention was well tolerated and was associated with minimal side effects.
VR-based meditation
Dr. Sekhon and her colleagues are now conducting a randomized controlled trial of VR meditation in older adults. VR-based meditation has been shown to increase relaxation and to decrease anxiety, sadness, and anger in younger adults. However, it has not been studied in the geriatric population.
The pilot study is assessing the feasibility and tolerability of VR meditation for older adults and its effects on stress, anxiety, depression, sleep, and quality of life. The study involves 30 adults aged 60 years and older.
Participants receive either 15-minute VR mindfulness meditation sessions twice a week for 4 weeks or are on a control wait list. The meditation sessions are user friendly and focus on breath meditation and body scans, Dr. Sekhon reported.
Because participants are older and balance is a concern, safety steps are incorporated into the sessions. “We ensure they’re doing this in a seated position, in a chair with arm rests, so that they’re very stable and there’s no risk of falls,” said Dr. Sekhon.
Another concern with VR is motion sickness, she noted. “It’s pretty minimal, but the best way we found so far is giving older adults time to adapt and feel comfortable with the VR,” she said. From the first session, participants learn how to put on the device and are checked to make sure they are comfortable with the process. To help them get used to everything, video and audio are not included during the first session.
Dr. Sekhon noted that results from the study are expected later this year.
In addition to mindfulness, researchers are using VR to deliver other established interventions, such as exposure therapy – and are implementing these approaches in varied environments, including long-term and palliative care settings.
VR-related technology is constantly improving and is becoming easier to use and more affordable, said Dr. Sekhon. She noted that the simplest devices that rely on smartphones cost as little as $15.
Although VR in older adults is promising, there are barriers to its adoption and use in research, she noted. For example, older adults may have cognitive, visual, or hearing impairments. They may have limited digital literacy, and/or they may not have access to the required technology.
These barriers can be overcome through workarounds, including providing instructional videos and digital literacy assistance via Zoom and working with community partners to facilitate study recruitment of older patients, Dr. Sekhon said.
Dr. Sekhon’s research is funded by the Canadian Institutes of Health Research and the Fonds de recherche du Quebec Sante.
A version of this article first appeared on Medscape.com.
Researchers are increasingly turning their attention to virtual reality (VR) for the treatment of psychiatric disorders in older adults.
Recent studies have highlighted the usefulness of VR in treating depression and loneliness in older patients who may be socially isolated because of their age, comorbidities, or the COVID-19 pandemic.
“The unique capability of virtual reality to create an immersive and engaging setting is an exciting opportunity for geriatric psychiatry,” Harmehr Sekhon, PhD, postdoctoral research fellow, Lady Davis Institute/Jewish General Hospital, McGill University, Montreal, and McLean Hospital, Harvard Medical School, Boston, told this news organization.
, Dr. Sekhon said.
One novel approach involves using VR to administer a mindfulness intervention in older adults. Dr. Sekhon shared information on her own mindfulness study and on other developments in VR and telemedicine at the American Association for Geriatric Psychiatry annual meeting.
Potential bridging tool
As the population ages, the prevalence of mental health disorders increases. Telemedicine has proved to be a potential “bridge” to address the health care needs of older adults, Dr. Sekhon noted.
She cited her systematic review of telemedicine for older adults with dementia during COVID-19. Results showed that telemedicine was a “beneficial approach” to assisting these individuals and that it increased accessibility, said Dr. Sekhon.
In addition, a survey published last year showed that 87% of Americans in general want to continue using telehealth services after the pandemic. Most respondents agreed that telehealth had made it easier to get the care they needed. They also reported having received the same level of care via telehealth as with in-person care.
A growing body of research shows that VR has “positive influences on mood and well-being, cognition, pain management, [and] treatment of phobias in younger adults,” Dr. Sekhon said. She added that there is evidence that VR is feasible for older adults, with applications in cognitive disorders.
She cited a recent systematic review of 55 studies that assessed the impact of different types of VR on mental health in older adults. The results showed that VR could be helpful in screening for cognitive impairment – and it was comparable to some paper-based assessment. It was also useful as a training tool for those with cognitive impairment.
Examples of VR interventions that can be used to treat cognitive impairment include “virtual cities, kitchens, supermarkets,” Dr. Sekhon noted.
The technology is increasingly being used as a tool to deliver psychotherapy, in which patient engagement is “a key determinant” of outcomes, she added. “Virtual reality is a cutting-edge, engaging, and immersive technique to administer psychotherapy,” she said.
Such VR approaches are proving successful in older patients. Dr. Sekhon highlighted the case of an 85-year-old woman who engaged in ten sessions of psychodynamic psychotherapy that targeted persistent dysthymia and negativistic mood. The case was part of a proof-of-concept study published in the May issue of the American Journal of Geriatric Psychiatry.
Dr. Sekhon noted the intervention was well tolerated and was associated with minimal side effects.
VR-based meditation
Dr. Sekhon and her colleagues are now conducting a randomized controlled trial of VR meditation in older adults. VR-based meditation has been shown to increase relaxation and to decrease anxiety, sadness, and anger in younger adults. However, it has not been studied in the geriatric population.
The pilot study is assessing the feasibility and tolerability of VR meditation for older adults and its effects on stress, anxiety, depression, sleep, and quality of life. The study involves 30 adults aged 60 years and older.
Participants receive either 15-minute VR mindfulness meditation sessions twice a week for 4 weeks or are on a control wait list. The meditation sessions are user friendly and focus on breath meditation and body scans, Dr. Sekhon reported.
Because participants are older and balance is a concern, safety steps are incorporated into the sessions. “We ensure they’re doing this in a seated position, in a chair with arm rests, so that they’re very stable and there’s no risk of falls,” said Dr. Sekhon.
Another concern with VR is motion sickness, she noted. “It’s pretty minimal, but the best way we found so far is giving older adults time to adapt and feel comfortable with the VR,” she said. From the first session, participants learn how to put on the device and are checked to make sure they are comfortable with the process. To help them get used to everything, video and audio are not included during the first session.
Dr. Sekhon noted that results from the study are expected later this year.
In addition to mindfulness, researchers are using VR to deliver other established interventions, such as exposure therapy – and are implementing these approaches in varied environments, including long-term and palliative care settings.
VR-related technology is constantly improving and is becoming easier to use and more affordable, said Dr. Sekhon. She noted that the simplest devices that rely on smartphones cost as little as $15.
Although VR in older adults is promising, there are barriers to its adoption and use in research, she noted. For example, older adults may have cognitive, visual, or hearing impairments. They may have limited digital literacy, and/or they may not have access to the required technology.
These barriers can be overcome through workarounds, including providing instructional videos and digital literacy assistance via Zoom and working with community partners to facilitate study recruitment of older patients, Dr. Sekhon said.
Dr. Sekhon’s research is funded by the Canadian Institutes of Health Research and the Fonds de recherche du Quebec Sante.
A version of this article first appeared on Medscape.com.
Researchers are increasingly turning their attention to virtual reality (VR) for the treatment of psychiatric disorders in older adults.
Recent studies have highlighted the usefulness of VR in treating depression and loneliness in older patients who may be socially isolated because of their age, comorbidities, or the COVID-19 pandemic.
“The unique capability of virtual reality to create an immersive and engaging setting is an exciting opportunity for geriatric psychiatry,” Harmehr Sekhon, PhD, postdoctoral research fellow, Lady Davis Institute/Jewish General Hospital, McGill University, Montreal, and McLean Hospital, Harvard Medical School, Boston, told this news organization.
One novel approach involves using VR to administer a mindfulness intervention in older adults. Dr. Sekhon shared information on her own mindfulness study and on other developments in VR and telemedicine at the American Association for Geriatric Psychiatry annual meeting.
Potential bridging tool
As the population ages, the prevalence of mental health disorders increases. Telemedicine has proved to be a potential “bridge” to address the health care needs of older adults, Dr. Sekhon noted.
She cited her systematic review of telemedicine for older adults with dementia during COVID-19. Results showed that telemedicine was a “beneficial approach” to assisting these individuals and that it increased accessibility, said Dr. Sekhon.
In addition, a survey published last year showed that 87% of Americans in general want to continue using telehealth services after the pandemic. Most respondents agreed that telehealth had made it easier to get the care they needed. They also reported having received the same level of care via telehealth as with in-person care.
A growing body of research shows that VR has “positive influences on mood and well-being, cognition, pain management, [and] treatment of phobias in younger adults,” Dr. Sekhon said. She added that there is evidence that VR is feasible for older adults, with applications in cognitive disorders.
She cited a recent systematic review of 55 studies that assessed the impact of different types of VR on mental health in older adults. The results showed that VR could be helpful in screening for cognitive impairment – and it was comparable to some paper-based assessments. It was also useful as a training tool for those with cognitive impairment.
Examples of VR interventions that can be used to treat cognitive impairment include “virtual cities, kitchens, supermarkets,” Dr. Sekhon noted.
The technology is increasingly being used as a tool to deliver psychotherapy, in which patient engagement is “a key determinant” of outcomes, she added. “Virtual reality is a cutting-edge, engaging, and immersive technique to administer psychotherapy,” she said.
Such VR approaches are proving successful in older patients. Dr. Sekhon highlighted the case of an 85-year-old woman who engaged in ten sessions of psychodynamic psychotherapy that targeted persistent dysthymia and negativistic mood. The case was part of a proof-of-concept study published in the May issue of the American Journal of Geriatric Psychiatry.
Dr. Sekhon noted the intervention was well tolerated and was associated with minimal side effects.
VR-based meditation
Dr. Sekhon and her colleagues are now conducting a randomized controlled trial of VR meditation in older adults. VR-based meditation has been shown to increase relaxation and to decrease anxiety, sadness, and anger in younger adults. However, it has not been studied in the geriatric population.
The pilot study is assessing the feasibility and tolerability of VR meditation for older adults and its effects on stress, anxiety, depression, sleep, and quality of life. The study involves 30 adults aged 60 years and older.
Participants receive either 15-minute VR mindfulness meditation sessions twice a week for 4 weeks or are on a control wait list. The meditation sessions are user friendly and focus on breath meditation and body scans, Dr. Sekhon reported.
Because participants are older and balance is a concern, safety steps are incorporated into the sessions. “We ensure they’re doing this in a seated position, in a chair with arm rests, so that they’re very stable and there’s no risk of falls,” said Dr. Sekhon.
Another concern with VR is motion sickness, she noted. “It’s pretty minimal, but the best way we found so far is giving older adults time to adapt and feel comfortable with the VR,” she said. From the first session, participants learn how to put on the device and are checked to make sure they are comfortable with the process. To help them get used to everything, video and audio are not included during the first session.
Dr. Sekhon noted that results from the study are expected later this year.
In addition to mindfulness, researchers are using VR to deliver other established interventions, such as exposure therapy – and are implementing these approaches in varied environments, including long-term and palliative care settings.
VR-related technology is constantly improving and is becoming easier to use and more affordable, said Dr. Sekhon. She noted that the simplest devices that rely on smartphones cost as little as $15.
Although VR in older adults is promising, there are barriers to its adoption and use in research, she noted. For example, older adults may have cognitive, visual, or hearing impairments. They may have limited digital literacy, and/or they may not have access to the required technology.
These barriers can be overcome through workarounds, including providing instructional videos and digital literacy assistance via Zoom and working with community partners to facilitate study recruitment of older patients, Dr. Sekhon said.
Dr. Sekhon’s research is funded by the Canadian Institutes of Health Research and the Fonds de recherche du Quebec Sante.
A version of this article first appeared on Medscape.com.
FROM AAGP 2022
Nap length linked to cognitive changes
No wonder we feel worse after naps
Some of us have hectic schedules that may make a nap feel more necessary. It’s common knowledge that naps shouldn’t be too long – maybe 20 minutes or so – but if you frequently take 3-hour naps and wake up thinking you’re late for school even though you’re 47 and have your PhD, this LOTME is for you.
Studies have shown that there is a link between napping during the day and Alzheimer’s/cognitive decline, but now we’ve got a double whammy for you: Longer and more frequent napping is linked to worse cognition after a year, and in turn, those with cognitive decline and Alzheimer’s are known to nap longer and more frequently during the day.
“We now know that the pathology related to cognitive decline can cause other changes in function,” coauthor Aron Buchman, MD, said in a statement from Rush University Medical Center. “It’s really a multisystem disorder, also including difficulty sleeping, changes in movement, changes in body composition, depression symptoms, behavioral changes, etc.”
The investigators monitored 1,400 patients over the course of 14 years with wrist bracelets that recorded periods of daytime inactivity, which were counted as naps.
At the beginning of the study, 75% of the subjects had no cognitive impairment, 19.5% had some cognitive impairment, and approximately 4% had Alzheimer’s. Daytime napping increased by only about 11 minutes per year for those with no signs of cognitive impairment, but those who showed significantly more signs of cognitive decline doubled their nap time, and those diagnosed with Alzheimer’s tripled theirs.
The investigators did not imply that napping causes Alzheimer’s, but they noted that older people who nap more than an hour a day are 40% more likely to develop the disease. It is something to consider and monitor.
Sometimes, after all, a nap seems like the best idea ever, but more often than not we wake up feeling 10 times worse. Our bodies may be giving us a heads up.
Pokemon Go away depression
The summer of 2016 was a great time if you happened to be a fan of Pokemon. Which is quite a lot of people. For almost 20 years millions have enjoyed the games and animated series, but Pokemon Go brought the thrill of catching Pokemon to life in a whole new way. For the first time, you could go out into the world and pretend you were a real Pokemon trainer, and everywhere you went, there would be others like you.
The ability to chase after Pikachu and Charizard in real life (well, augmented reality, but close enough) seemed to bring people a lot of joy, but seemed is never good enough for science. Can’t have anecdotes, we need data! So researchers at the London School of Economics and Political Science conducted a study into how Pokemon Go affected local Internet search rates of depression as the game was released slowly around the world.
Through analyzing Google Trends data for words like “depression,” “anxiety,” and “stress,” the researchers found that the release of Pokemon Go was significantly associated with a noticeable, though short-term, drop in depression-related Internet searches. Location-based augmented reality games may alleviate symptoms of mild depression, the researchers said, as they encourage physical activity, face-to-face socialization, and exposure to nature, though they added that simply going outside is likely not enough to combat clinical cases of severe depression.
Still, augmented reality games represent a viable target for public health investment, since they’re easy to use and inexpensive to make. That said, we’re not sure we want the FDA or CDC making a new Pokemon Go game. They’d probably end up filling the streets with Mr. Mime. And no one would leave their house for that.
And now a word from our sponsor
How many times has this happened to you? You need to repair a jet engine, inspect a nuclear reactor cooling system, AND perform bowel surgery, but you can’t carry around all the heavy, old-fashioned tools needed for those jobs.
Well, we’ve got one tool that can do it all! And that tool is a snake. No, it’s a robot.
It’s both! It’s the COntinuum roBot for Remote Applications. COBRA is the robot that looks like a snake! A snake that’s 5 meters long but only as thick as a pencil (about 9 mm in diameter). A robot with “extraordinary manoeuvrability and responsiveness due to … a compliant-joint structure and multiple continuous sections that enable it to bend at around 90 degrees,” according to the team at the University of Nottingham (England) that developed it.
COBRA comes equipped with a stereovision camera and a miniature cutting tool to perform complex industrial repair, but other devices can be interchanged for possible medical use.
COBRA and its joystick-like controller were designed to be easy to use. Dr. Oladejo Olaleye, an ear, nose, and throat and robotic surgeon at University Hospitals of Leicester who is directing its surgical development, was able to use COBRA on a dummy after just 5 minutes of training. He called it “the future of diagnostic endoscopy and therapeutic surgery.”
Don’t be the last aircraft engineer/nuclear technician/surgeon on your block to have this ultraslender, ultramaneuverable reptilian repair robot. Get your COBRA now! Operators are standing by.
Disclaimer: Robot is still under development and not yet on sale.
Rule, (worm) Britannia!
As long as there have been people, there have been parasitic worms living in their guts. Helminth infection is a continuing and largely ignored crisis in poor, tropical nations, though worm-based diseases have been basically eliminated from wealthier countries.
This wasn’t always the case, however, as a study published in PLOS Neglected Tropical Diseases (now there’s a specific topic) has found. The researchers detail the glorious history of helminth infestation in the United Kingdom from the Victorian era all the way back to prehistory, scouring hundreds of skeletons found in 17 sites across the country for eggs, which can remain intact for thousands of years.
The researchers found that two eras in particular had very high rates of infection. Unsurprisingly, the late medieval era was one of them, but the other is less obvious. The Romans were famous for their hygiene, their baths, and their plumbing, but maybe they also should be famous for the abundance of worms in their bellies. That doesn’t make sense at first: Shouldn’t good hygiene lower infection? The benefits of a good sewer system, however, are lessened when the waste containing said infectious organisms is used to fertilize crops. Recycling is generally a good thing, but less so when you’re recycling parasitic worms.
Curiously, of the three sites from the industrial age, only the one in London had high levels of worm infestation. Considering how dirty and cramped 19th-century British cities were, one might expect disease to run rampant (tuberculosis certainly did), but the sites in Oxford and Birmingham were almost devoid of worms. The researchers theorized that this was because of access to clean well water. Or maybe worms just have a thing for London. [Editor’s note: It’s probably not that.]
Do personality traits predict cognitive decline?
Certain personality traits may help predict who will develop mild cognitive impairment in later life, new research shows.
Investigators analyzed data from almost 2,000 individuals enrolled in the Rush Memory and Aging Project (MAP) – a longitudinal study of older adults living in the greater Chicago metropolitan region and northeastern Illinois – with recruitment that began in 1997 and continues through today. Participants received a personality assessment as well as annual assessments of their cognitive abilities.
Those with high scores on measures of conscientiousness were significantly less likely to progress from normal cognition to mild cognitive impairment (MCI) during the study. In fact, scoring an extra 1 standard deviation on the conscientiousness scale was associated with a 22% lower risk of transitioning from no cognitive impairment (NCI) to MCI. On the other hand, scoring an additional 1 SD on a neuroticism scale was associated with a 12% increased risk of transitioning to MCI.
Participants who scored high on extraversion, as well as those who scored high on conscientiousness or low on neuroticism, tended to maintain normal cognitive functioning longer than other participants.
“Personality traits reflect relatively enduring patterns of thinking and behaving, which may cumulatively affect engagement in healthy and unhealthy behaviors and thought patterns across the lifespan,” lead author Tomiko Yoneda, PhD, a postdoctoral researcher in the department of medical social sciences, Northwestern University, Chicago, said in an interview.
“The accumulation of lifelong experiences may then contribute to susceptibility of particular diseases or disorders, such as mild cognitive impairment, or contribute to individual differences in the ability to withstand age-related neurological changes,” she added.
The study was published online in the Journal of Personality and Social Psychology.
Competing risk factors
Personality traits “reflect an individual’s persistent patterns of thinking, feeling, and behaving,” Dr. Yoneda said.
“For example, conscientiousness is characterized by competence, dutifulness, and self-discipline, while neuroticism is characterized by anxiety, depressive symptoms, and emotional instability. Likewise, individuals high in extraversion tend to be enthusiastic, gregarious, talkative, and assertive,” she added.
Previous research “suggests that low conscientiousness and high neuroticism are associated with an increased risk of cognitive impairment,” she continued. However, “there is also an increased risk of death in older adulthood – in other words, these outcomes are ‘competing risk factors.’”
Dr. Yoneda said her team wanted to “examine the impact of personality traits on the simultaneous risk of transitioning to mild cognitive impairment, dementia, and death.”
For the study, the researchers analyzed data from 1,954 participants in MAP (mean age at baseline 80 years, 73.7% female, 86.8% White), who received a personality assessment and annual assessments of their cognitive abilities.
To assess personality traits – in particular, conscientiousness, neuroticism, and extraversion – the researchers used the NEO Five Factor Inventory (NEO-FFI). They also used multistate survival modeling to examine the potential association between these traits and transitions from one cognitive status category to another (NCI, MCI, and dementia) and to death.
Cognitive healthspan
By the end of the study, over half of the sample (54%) had died.
Most transitions showed “relative stability in cognitive status across measurement occasions.”
- NCI to NCI (n = 7,368)
- MCI to MCI (n = 1,244)
- Dementia to dementia (n = 876)
There were 725 “backward transitions” from MCI to NCI, “which may reflect improvement or within-person variability in cognitive functioning, or learning effects,” the authors note.
There were only 114 “backward transitions” from dementia to MCI and only 12 from dementia to NCI, “suggesting that improvement in cognitive status was relatively rare, particularly once an individual progresses to dementia.”
After adjusting for demographics, depressive symptoms, and apolipoprotein E (APOE) ε4 allele status, the researchers found that personality traits were the most important factors in the transition from NCI to MCI.
Higher conscientiousness was associated with a decreased risk of transitioning from NCI to MCI (hazard ratio, 0.78; 95% confidence interval, 0.72-0.85). Conversely, higher neuroticism was associated with an increased risk of transitioning from NCI to MCI (HR, 1.12; 95% CI, 1.04-1.21) and a significantly decreased likelihood of transition back from MCI to NCI (HR, 0.90; 95% CI, 0.81-1.00).
Scoring ~6 points on a conscientiousness scale ranging from 0-48 (that is, 1 SD on the scale) was significantly associated with ~22% lower risk of transitioning forward from NCI to MCI, while scoring ~7 more points on a neuroticism scale (1 SD) was significantly associated with ~12% higher risk of transitioning from NCI to MCI.
Higher extraversion was associated with an increased likelihood of transitioning from MCI back to NCI (HR, 1.12; 95% CI, 1.03-1.22), and although extraversion was not associated with a longer total lifespan, participants who scored high on extraversion, as well as those who scored low on conscientiousness or low on neuroticism, maintained normal cognitive function longer than other participants.
“Our results suggest that high conscientiousness and low neuroticism may protect individuals against mild cognitive impairment,” said Dr. Yoneda.
Importantly, individuals who were either higher in conscientiousness, higher in extraversion, or lower in neuroticism had more years of “cognitive healthspan,” meaning more years without cognitive impairment,” she added.
In addition, “individuals lower in neuroticism and higher in extraversion were more likely to recover after receiving an MCI diagnosis, suggesting that these traits may be protective even after an individual starts to progress to dementia,” she said.
The authors note that the study focused on only three of the Big Five personality traits, while the other 2 – openness to experience and agreeableness – may also be associated with cognitive aging processes and mortality.
Nevertheless, given the current results, alongside extensive research in the personality field, aiming to increase conscientiousness through persistent behavioral change is one potential strategy for promoting healthy cognitive aging, Dr. Yoneda said.
‘Invaluable window’
In a comment, Brent Roberts, PhD, professor of psychology, University of Illinois Urbana-Champaign, said the study provides an “invaluable window into how personality affects the process of decline and either accelerates it, as in the role of neuroticism, or decelerates it, as in the role of conscientiousness.”
“I think the most fascinating finding was the fact that extraversion was related to transitioning from MCI back to NCI. These types of transitions have simply not been part of prior research, and it provides utterly unique insights and opportunities for interventions that may actually help people recover from a decline,” said Dr. Roberts, who was not involved in the research.
Claire Sexton, DPhil, Alzheimer’s Association director of scientific programs and outreach, called the paper “novel” because it investigated the transitions between normal cognition and mild impairment and between mild impairment and dementia.
Dr. Sexton, who was associated with this research team, cautioned that is it observational, “so it can illuminate associations or correlations, but not causes. As a result, we can’t say for sure what the mechanisms are behind these potential connections between personality and cognition, and more research is needed.”
The research was supported by the Alzheimer Society Research Program, Social Sciences and Humanities Research Council, and the National Institute on Aging of the National Institutes of Health. Dr. Yoneda and co-authors, Dr. Roberts, and Dr. Sexton have disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
Conscientiousness, neuroticism, and extraversion are linked to the risk of developing mild cognitive impairment – and to the likelihood of recovering from it – new research shows.
Investigators analyzed data from almost 2,000 individuals enrolled in the Rush Memory and Aging Project (MAP) – a longitudinal study of older adults living in the greater Chicago metropolitan region and northeastern Illinois – with recruitment that began in 1997 and continues through today. Participants received a personality assessment as well as annual assessments of their cognitive abilities.
Those with high scores on measures of conscientiousness were significantly less likely to progress from normal cognition to mild cognitive impairment (MCI) during the study. In fact, scoring an extra 1 standard deviation on the conscientiousness scale was associated with a 22% lower risk of transitioning from no cognitive impairment (NCI) to MCI. On the other hand, scoring an additional 1 SD on a neuroticism scale was associated with a 12% increased risk of transitioning to MCI.
Participants who scored high on extraversion, as well as those who scored high on conscientiousness or low on neuroticism, tended to maintain normal cognitive functioning longer than other participants.
“Personality traits reflect relatively enduring patterns of thinking and behaving, which may cumulatively affect engagement in healthy and unhealthy behaviors and thought patterns across the lifespan,” lead author Tomiko Yoneda, PhD, a postdoctoral researcher in the department of medical social sciences, Northwestern University, Chicago, said in an interview.
“The accumulation of lifelong experiences may then contribute to susceptibility of particular diseases or disorders, such as mild cognitive impairment, or contribute to individual differences in the ability to withstand age-related neurological changes,” she added.
The study was published online in the Journal of Personality and Social Psychology.
Competing risk factors
Personality traits “reflect an individual’s persistent patterns of thinking, feeling, and behaving,” Dr. Yoneda said.
“For example, conscientiousness is characterized by competence, dutifulness, and self-discipline, while neuroticism is characterized by anxiety, depressive symptoms, and emotional instability. Likewise, individuals high in extraversion tend to be enthusiastic, gregarious, talkative, and assertive,” she added.
Previous research “suggests that low conscientiousness and high neuroticism are associated with an increased risk of cognitive impairment,” she continued. However, “there is also an increased risk of death in older adulthood – in other words, these outcomes are ‘competing risk factors.’”
Dr. Yoneda said her team wanted to “examine the impact of personality traits on the simultaneous risk of transitioning to mild cognitive impairment, dementia, and death.”
For the study, the researchers analyzed data from 1,954 participants in MAP (mean age at baseline 80 years, 73.7% female, 86.8% White), who received a personality assessment and annual assessments of their cognitive abilities.
To assess personality traits – in particular, conscientiousness, neuroticism, and extraversion – the researchers used the NEO Five Factor Inventory (NEO-FFI). They also used multistate survival modeling to examine the potential association between these traits and transitions from one cognitive status category to another (NCI, MCI, and dementia) and to death.
Cognitive healthspan
By the end of the study, over half of the sample (54%) had died.
Most transitions showed “relative stability in cognitive status across measurement occasions.”
- NCI to NCI (n = 7,368)
- MCI to MCI (n = 1,244)
- Dementia to dementia (n = 876)
There were 725 “backward transitions” from MCI to NCI, “which may reflect improvement or within-person variability in cognitive functioning, or learning effects,” the authors note.
There were only 114 “backward transitions” from dementia to MCI and only 12 from dementia to NCI, “suggesting that improvement in cognitive status was relatively rare, particularly once an individual progresses to dementia.”
After adjusting for demographics, depressive symptoms, and apolipoprotein E (APOE) ε4 allele status, the researchers found that personality traits were the most important factors in the transition from NCI to MCI.
Higher conscientiousness was associated with a decreased risk of transitioning from NCI to MCI (hazard ratio, 0.78; 95% confidence interval, 0.72-0.85). Conversely, higher neuroticism was associated with an increased risk of transitioning from NCI to MCI (HR, 1.12; 95% CI, 1.04-1.21) and a significantly decreased likelihood of transition back from MCI to NCI (HR, 0.90; 95% CI, 0.81-1.00).
Scoring ~6 points higher on a conscientiousness scale ranging from 0 to 48 (that is, 1 SD on the scale) was significantly associated with a ~22% lower risk of transitioning forward from NCI to MCI, while scoring ~7 points higher on a neuroticism scale (1 SD) was significantly associated with a ~12% higher risk of transitioning from NCI to MCI.
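Under the proportional hazards assumption, a per-SD hazard ratio scales log-linearly with the score difference. The sketch below is purely illustrative – the helper function is hypothetical and not from the study's analysis code; only the hazard ratio (0.78) and the ~6-point SD come from the reported results:

```python
def hr_for_points(hr_per_sd: float, sd_points: float, points: float) -> float:
    """Scale a hazard ratio reported per 1 SD to an arbitrary point
    difference, assuming a log-linear (proportional hazards) model."""
    return hr_per_sd ** (points / sd_points)

# Conscientiousness: HR 0.78 per ~6-point (1 SD) increase, i.e. ~22% lower risk.
print(round(hr_for_points(0.78, 6.0, 6.0), 2))  # 0.78
# A 3-point (half-SD) increase implies a proportionally smaller effect.
print(round(hr_for_points(0.78, 6.0, 3.0), 2))  # 0.88
```

This exponent scaling is the standard way per-SD hazard ratios are interpreted for other score differences; it does not imply the association is causal.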
Higher extraversion was associated with an increased likelihood of transitioning from MCI back to NCI (HR, 1.12; 95% CI, 1.03-1.22). Although extraversion was not associated with a longer total lifespan, participants who scored high on extraversion, as well as those who scored high on conscientiousness or low on neuroticism, maintained normal cognitive function longer than other participants.
“Our results suggest that high conscientiousness and low neuroticism may protect individuals against mild cognitive impairment,” said Dr. Yoneda.
Importantly, “individuals who were either higher in conscientiousness, higher in extraversion, or lower in neuroticism had more years of ‘cognitive healthspan,’ meaning more years without cognitive impairment,” she added.
In addition, “individuals lower in neuroticism and higher in extraversion were more likely to recover after receiving an MCI diagnosis, suggesting that these traits may be protective even after an individual starts to progress to dementia,” she said.
The authors note that the study focused on only three of the Big Five personality traits, while the other two – openness to experience and agreeableness – may also be associated with cognitive aging processes and mortality.
Nevertheless, given the current results, alongside extensive research in the personality field, aiming to increase conscientiousness through persistent behavioral change is one potential strategy for promoting healthy cognitive aging, Dr. Yoneda said.
‘Invaluable window’
In a comment, Brent Roberts, PhD, professor of psychology, University of Illinois Urbana-Champaign, said the study provides an “invaluable window into how personality affects the process of decline and either accelerates it, as in the role of neuroticism, or decelerates it, as in the role of conscientiousness.”
“I think the most fascinating finding was the fact that extraversion was related to transitioning from MCI back to NCI. These types of transitions have simply not been part of prior research, and it provides utterly unique insights and opportunities for interventions that may actually help people recover from a decline,” said Dr. Roberts, who was not involved in the research.
Claire Sexton, DPhil, Alzheimer’s Association director of scientific programs and outreach, called the paper “novel” because it investigated the transitions between normal cognition and mild impairment and between mild impairment and dementia.
Dr. Sexton, who was not associated with this research, cautioned that the study is observational, “so it can illuminate associations or correlations, but not causes. As a result, we can’t say for sure what the mechanisms are behind these potential connections between personality and cognition, and more research is needed.”
The research was supported by the Alzheimer Society Research Program, Social Sciences and Humanities Research Council, and the National Institute on Aging of the National Institutes of Health. Dr. Yoneda and co-authors, Dr. Roberts, and Dr. Sexton have disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM THE JOURNAL OF PERSONALITY AND SOCIAL PSYCHOLOGY