The role of aspirin today
This transcript has been edited for clarity.
Dear colleagues, I am Christoph Diener from the faculty of medicine at the University of Duisburg-Essen in Germany.
Usually in this video series, I report on interesting scientific studies in the field of neurology published in the last month. But I have to admit, June was a lousy month for new science in neurology. Therefore, this month I’d like to take a different approach and tell you about a very interesting, old drug.
We are celebrating the 125th anniversary of aspirin. Aspirin was first synthesized by Felix Hoffmann in Wuppertal, Germany, a city only 40 km from my location. Hoffmann was searching for a new drug for his father, who suffered from severe joint pain, and the drugs available at that time had terrible adverse effects. This prompted him to work on a new drug, which was later called aspirin (acetylsalicylic acid).
Aspirin has been used very successfully to the present day as therapy for joint pain or arthritis. But as you know, it’s also effective in headaches, in particular, tension-type headache. I think it’s one of the most used drugs in the world for the treatment of acute migraine attacks.
It’s also available in intravenous form in some European countries for the treatment of severe migraine attacks, including in the emergency room, and it’s as effective as subcutaneous sumatriptan. It’s also an effective migraine preventive drug at a dose of 300 mg/d.
Discovering aspirin’s antiplatelet activity
There was an interesting observation by a dentist in the 1930s, who noted bleeding when he extracted teeth in people who took aspirin for joint pain. When he started asking his patients about possible bleeding complications and vascular events, he observed that people who took aspirin didn’t have myocardial infarctions.
It took a long time for people to discover that aspirin is not only a pain medication but also an antiplatelet agent. The first randomized study that showed that aspirin is effective in secondary prevention after myocardial infarction was published in 1974 in The New England Journal of Medicine. In 1980, aspirin was approved by the U.S. Food and Drug Administration for the secondary prevention of stroke and in 1984 for secondary prevention after myocardial infarction.
A history of efficacy
Aspirin also has a proven role in the secondary prevention of transient ischemic attack and ischemic stroke. Given early, it reduces the risk for a recurrent vascular event by 50%; long-term, it reduces the risk by 20% compared with placebo.
Interestingly, the doses differ in different areas of the world. In the United States, it’s either 81 mg or 325 mg. In Europe, it’s usually 100 mg. Until a few years ago, not a single trial had compared 100 mg of aspirin with placebo for the secondary prevention of stroke.
If we look at dual antiplatelet therapy, the combination of aspirin and clopidogrel was not superior to aspirin alone or clopidogrel alone for long-term prevention, but the combination of dipyridamole and aspirin and the combination of cilostazol and aspirin were superior to aspirin alone for secondary stroke prevention. Short-term, within the first 30 days, the combination of aspirin and clopidogrel and the combination of ticagrelor and aspirin are superior to monotherapy but also carry an increased risk for bleeding.
People with atrial fibrillation or embolic strokes need to be anticoagulated, but the addition of aspirin to anticoagulation does not increase efficacy; it only increases the risk for bleeding.
In people above the age of 75 years who have to take aspirin, there is an increased risk for upper gastrointestinal bleeding. These patients should, in addition, receive proton pump inhibitors.
The use of aspirin for the primary prevention of vascular events was promoted for almost 50 years all over the world, but in the last 5 years, a number of randomized trials clearly showed that aspirin is not effective, compared with placebo, in the primary prevention of vascular events (stroke, myocardial infarction, and vascular death). It only increases the risk for bleeding.
So it’s a clear separation. Aspirin should not be used for the primary prevention of vascular events, but for the secondary prevention of vascular events and vascular death, it should be used by basically everyone without contraindications.
Ladies and gentlemen, a drug that is 125 years old is still one of the most widely used and most affordable drugs in the world. It’s highly effective and carries only a small risk for major bleeding complications. It’s really time to celebrate aspirin for this achievement.
Dr. Diener is professor, department of neurology, Stroke Center-Headache Center, University Duisburg-Essen (Germany). A complete list of his financial disclosures is available at the link below.
A version of this article first appeared on Medscape.com.
Concerns that low LDL-C alters cognitive function challenged in novel analysis
PCSK9 inhibitors, which are among the most effective therapies for reducing LDL cholesterol (LDL-C), are associated with a neutral effect on cognitive function, according to a genetics-based Mendelian randomization study intended to sort through the complexity of confounders.
The same study linked HMG-CoA reductase inhibitors (statins) with the potential for modest adverse neurocognitive effects, although these are likely to be outweighed by cardiovascular benefits, according to a collaborating team of investigators from the U.S. National Institutes of Health and the University of Oxford (England).
For clinicians and patients who continue to harbor concerns that cognitive function is threatened by very low LDL-C, this novel approach to evaluating risk is “reassuring,” according to the authors.
Early in clinical testing of PCSK9 inhibitors, a potential signal for adverse effects on cognitive function was reported but unconfirmed. This signal raised concern that extremely low levels of LDL-C, such as < 25 mg/dL, achieved with PCSK9 inhibitors might pose a risk to neurocognitive function.
Of several factors that provided a basis for concern, the PCSK9 enzyme is known to participate in brain development, according to the authors of this newly published study.
Mendelian randomization addresses complex issue
The objective of this Mendelian randomization analysis was to evaluate the effect of PCSK9 inhibitors and statins on long-term neurocognitive function. Used previously to address other clinical issues, a drug-effect Mendelian randomization analysis evaluates genetic variants to determine whether there is a causal relationship between an exposure, which in this case was lipid-lowering drugs, and a specific outcome, which was cognitive performance.
By looking directly at genetic variants that simulate the pharmacological inhibition of drug gene targets, the bias from confounders of clinical effects, such as baseline cognitive function, is avoided, according to the authors.
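For readers curious about the mechanics, the logic of a drug-effect Mendelian randomization estimate can be sketched in a few lines. The following is a minimal illustration of the standard inverse-variance-weighted (IVW) estimator applied to per-variant summary statistics; the numbers are invented for demonstration, and this is not the authors’ actual analysis pipeline.

```python
import numpy as np

# Hypothetical per-variant summary statistics, invented for illustration only.
# beta_exp: effect of each variant on LDL-C via the drug-target gene (the exposure)
# beta_out: effect of the same variant on a cognitive outcome
# se_out:   standard error of the outcome effect
beta_exp = np.array([0.12, 0.08, 0.15, 0.10])
beta_out = np.array([0.001, -0.002, 0.000, 0.001])
se_out = np.array([0.004, 0.005, 0.003, 0.004])

# Each variant implies a causal effect via its Wald ratio (outcome effect / exposure effect).
ratios = beta_out / beta_exp

# First-order inverse-variance weights: more precise variants count for more.
weights = (beta_exp / se_out) ** 2

ivw_estimate = np.sum(weights * ratios) / np.sum(weights)
ivw_se = np.sqrt(1.0 / np.sum(weights))

# A pooled estimate indistinguishable from zero corresponds to the "null"
# cognitive effect reported here for genetically proxied PCSK9 inhibition.
print(f"IVW causal estimate: {ivw_estimate:.4f} (SE {ivw_se:.4f})")
```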
The message from this drug-effect Mendelian randomization analysis was simple, according to the senior author of the study, Falk W. Lohoff, MD, chief of the section on clinical genomics and experimental therapeutics, National Institute on Alcohol Abuse and Alcoholism.
“Based on our data, we do not see a significant cognitive risk profile with PCSK9 inhibition associated with low LDL-C,” Dr. Lohoff said in an interview. He cautioned that “future long-term clinical studies are needed to confirm the absence of this effect,” but he and his coauthors noted that these data concur with the clinical studies.
From genome-wide association studies, single-nucleotide polymorphisms in PCSK9 and HMG-CoA reductase were extracted from a sample of more than 700,000 individuals of predominantly European ancestry. In the analysis, the investigators evaluated whether inhibition of PCSK9 or HMG-CoA reductase had an effect on seven clinical outcomes that relate to neurocognitive function, including memory, verbal intelligence, and reaction time, as well as biomarkers of cognitive function, such as cortical surface area.
The genetic effect of PCSK9 inhibition was “null for every cognitive-related outcome evaluated,” the investigators reported. The genetic effect of HMG-CoA reductase inhibition had a statistically significant but modest effect on cognitive performance (P = .03) and cortical surface area (P = .03). While the impact of HMG-CoA reductase inhibition on reaction time was stronger on a statistical basis (P = .0002), the investigators reported that it translated into a decrease of only 0.067 milliseconds per 38.7 mg/dL of LDL-C lowering. They characterized this as a “small impact” unlikely to outweigh clinical benefits.
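To see how small that reaction-time effect is, it can be rescaled to any given degree of LDL-C lowering. A back-of-the-envelope sketch, assuming simple linear scaling (an illustrative assumption, not a claim from the study; the 70 mg/dL figure is likewise invented for illustration):

```python
# Reported genetic effect: reaction time slows by 0.067 ms per 38.7 mg/dL of
# LDL-C lowering via HMG-CoA reductase inhibition (values from the article).
EFFECT_MS = 0.067
PER_LDL_MGDL = 38.7

def reaction_time_change_ms(ldl_reduction_mgdl: float) -> float:
    """Rescale the reported effect linearly; illustration only."""
    return EFFECT_MS * ldl_reduction_mgdl / PER_LDL_MGDL

# Even an aggressive 70 mg/dL reduction maps to roughly 0.12 ms, orders of
# magnitude below anything clinically perceptible.
print(f"{reaction_time_change_ms(70):.3f} ms")
```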
In an editorial that accompanied publication of this study, Brian A. Ference, MD, MPhil, provided context for the suitability of a Mendelian randomization analysis to address this and other questions regarding the impact of lipid-lowering therapies on clinical outcomes, and he ultimately concurred with the major conclusions.
Ultimately, this analysis is consistent with other evidence that PCSK9 inhibition does not pose a risk of impaired cognitive function, he wrote. For statins, he concluded that this study “does not provide compelling evidence” to challenge their current clinical use.
Data do not support low LDL-C as cognitive risk factor
Moreover, this study – as well as other evidence – argues strongly against very low levels of LDL-C, regardless of how they are achieved, as a risk factor for diminished cognitive function, Dr. Ference, director of research in the division of translational therapeutics, University of Cambridge (England), said in an interview.
“There is no evidence from Mendelian randomization studies that lifelong exposure to lower LDL-C increases the risk of cognitive impairment,” he said. “This is true when evaluating lifelong exposure to lower LDL-C due to genetic variants in a wide variety of different genes or in the genes that encode the targets of PCSK9 inhibitors, statins, or other lipid-lowering therapies.”
In other words, this study “adds to the accumulating evidence” that LDL-C lowering by itself does not contribute to an adverse impact on cognitive function despite persistent concern. This should not be surprising. Dr. Ference emphasized that there has never been strong evidence for an association.
“As I point out in the editorial, there is no biologically plausible mechanism by which reducing peripheral LDL-C should impact neurological function in any way, because the therapies do not cross the blood-brain barrier, and because the nervous system produces its own cholesterol to maintain the integrity of membranes in nervous system cells,” he explained.
Dr. Lohoff reports no potential conflicts of interest. Dr. Ference has financial relationships with numerous pharmaceutical companies, including those that make lipid-lowering therapies.
FROM THE JOURNAL OF THE AMERICAN COLLEGE OF CARDIOLOGY
A ‘promising target’ to improve outcomes in late-life depression
A new study sheds light on the neurologic underpinnings of late-life depression (LLD) with apathy and its frequently poor response to treatment.
Investigators headed by Faith Gunning, PhD, of the Institute of Geriatric Psychiatry, Weill Cornell Medicine, New York, analyzed baseline and posttreatment brain MRIs and functional MRIs (fMRIs) of older adults with depression who participated in a 12-week open-label nonrandomized clinical trial of escitalopram. Participants had undergone clinical and cognitive assessments.
Disturbances were found in resting state functional connectivity (rsFC) between the salience network (SN) and other large-scale networks that support goal-directed behavior, especially in patients with depression who also had features of apathy.
“This study suggests that, among older adults with depression, distinct network abnormalities may be associated with apathy and poor response to first-line pharmacotherapy and may serve as promising targets for novel interventions,” the investigators write.
The study was published online in JAMA Network Open.
A leading cause of disability
LLD is a “leading cause of disability and medical morbidity in older adulthood,” with one-third to one-half of patients with LLD also suffering from apathy, the authors write.
Older adults with depression and comorbid apathy have poorer outcomes, including lower remission rates and poorer response to first-line antidepressants, compared with those with LLD but who do not have apathy.
Despite the high prevalence of apathy in people with depression, “little is known about its optimal treatment and, more broadly, about the brain-based mechanisms of apathy,” the authors note.
An “emerging hypothesis” points to a compromised SN and its large-scale connections as the link between apathy and poor treatment response in LLD.
The SN (which includes the insula and the dorsal anterior cingulate cortex) “attributes motivational value to a stimulus” and “dynamically coordinates the activity of other large-scale networks, including the executive control network and default mode network (DMN).”
Preliminary studies of apathy in patients with depression report reduced volume in structures of the SN and suggest disruption in functional connectivity among the SN, DMN, and the executive control network; but the mechanisms linking apathy to poor antidepressant response in LLD “are not well understood.”
“Connectometry” is a “novel approach to diffusion MRI analysis that quantifies the local connectome of white matter pathways.” It has been used along with resting-state imaging, but it had not previously been used to study apathy.
The researchers investigated the functional connectivity of the SN, hypothesizing that alterations in connectivity among key nodes of the SN and other core circuits that modulate goal-directed behavior (DMN and the executive control network) were implicated in individuals with depression and apathy.
They applied connectometry to “identify pathway-level disruptions in structural connectivity,” hypothesizing that compromise of frontoparietal and frontolimbic pathways would be associated with apathy in patients with LLD.
They also wanted to know whether apathy-related network abnormalities were associated with antidepressant response after 12 weeks of pharmacotherapy with the selective serotonin reuptake inhibitor escitalopram.
Emerging model
The study included 40 older adults (65% women; mean [SD] age, 70.0 [6.6] years) with DSM-IV–diagnosed major depressive disorder (without psychotic features) who were drawn from a single-group, open-label escitalopram treatment trial.
The Hamilton Depression Rating Scale (HAM-D) was used to assess depression, and the Apathy Evaluation Scale was used to assess apathy. On the Apathy Evaluation Scale, a score greater than 40.5 represents “clinically significant apathy.” Participants completed these assessments at baseline and after 12 weeks of escitalopram treatment.
They also completed a battery of neuropsychological tests to assess cognition and underwent MRI. fMRI was used to map group differences in rsFC of the SN, and diffusion connectometry was used to “evaluate pathway-level disruptions in structural connectivity.”
Of the participants, 20 had clinically significant apathy. There were no differences in age, sex, educational level, or the severity of depression at baseline between those who did and those who did not have apathy.
Compared with participants with depression but not apathy, those with depression and comorbid apathy had lower rsFC of salience network seeds (specifically, the dorsolateral prefrontal cortex [DLPFC], premotor cortex, midcingulate cortex, and paracentral lobule).
They also had greater rsFC in the lateral temporal cortex and temporal pole (z > 2.7; Bonferroni-corrected threshold of P < .0125).
Additionally, participants with apathy had lower structural connectivity in the splenium, cingulum, and fronto-occipital fasciculus, compared with those without apathy (t > 2.5; false discovery rate–corrected P = .02).
Of the 27 participants who completed escitalopram treatment, 16 (59%) achieved remission (defined as an HAM-D score < 10). Participants with apathy had a poorer response to escitalopram treatment.
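The two cutoffs that organize these results, clinically significant apathy (Apathy Evaluation Scale score > 40.5) and remission (HAM-D score < 10), amount to simple thresholds on the respective scales. A minimal sketch of that classification, using only the cutoffs stated above and hypothetical scores:

```python
from dataclasses import dataclass

APATHY_CUTOFF = 40.5    # Apathy Evaluation Scale: > 40.5 = clinically significant apathy
REMISSION_CUTOFF = 10   # HAM-D: posttreatment score < 10 = remission

@dataclass
class Participant:
    aes_baseline: float  # Apathy Evaluation Scale score at baseline
    hamd_week12: float   # HAM-D score after 12 weeks of escitalopram

    @property
    def has_apathy(self) -> bool:
        return self.aes_baseline > APATHY_CUTOFF

    @property
    def remitted(self) -> bool:
        return self.hamd_week12 < REMISSION_CUTOFF

# Hypothetical participants, not study data.
for p in [Participant(aes_baseline=44.0, hamd_week12=12.0),
          Participant(aes_baseline=33.0, hamd_week12=7.0)]:
    print(f"apathy={p.has_apathy}, remission={p.remitted}")
```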
Lower insula-DLPFC/midcingulate cortex rsFC was associated with less improvement in depressive symptoms (HAM-D percentage change, beta [df] = .588 [26]; P = .001) as well as a greater likelihood that the participant would not achieve remission after treatment (odds ratio, 1.041; 95% confidence interval, 1.003-1.081; P = .04).
In regression models, lower insula-DLPFC/midcingulate cortex rsFC was found to be a mediator of the association between baseline apathy and persistence of depression.
The SN findings were also relevant to cognition. Lower dorsal anterior cingulate-DLPFC/paracentral rsFC was found to be associated with residual cognitive difficulties on measures of attention and executive function (beta [df] = .445 [26] and beta [df] = .384 [26], respectively; for each, P = .04).
“These findings support an emerging model of apathy, which proposes that apathy may arise from dysfunctional interactions among core networks (that is, SN, DMN, and executive control) that support motivated behavior,” the investigators write.
“This may cause a failure of network integration, leading to difficulties with salience processing, action planning, and behavioral initiation that manifests clinically as apathy,” they conclude.
One limitation they note was the lack of longitudinal follow-up after acute treatment and a “relatively limited neuropsychological battery.” Therefore, they could not “establish the persistence of treatment differences nor the specificity of cognitive associations.”
The investigators add that “novel interventions that modulate interactions among affected circuits may help to improve clinical outcomes in this distinct subgroup of older adults with depression, for whom few effective treatments exist.”
Commenting on the study, Helen Lavretsky, MD, professor of psychiatry in residence and director of the Late-Life Mood, Stress, and Wellness Research Program and the Integrative Psychiatry Clinic, Jane and Terry Semel Institute for Neuroscience and Human Behavior, University of California, Los Angeles, said the findings “can be used in future studies targeting apathy and the underlying neural mechanisms of brain connectivity.” Dr. Lavretsky was not involved with the study.
The study was supported by grants from the National Institute of Mental Health. Dr. Gunning reported receiving grants from the National Institute of Mental Health during the conduct of the study and grants from Akili Interactive. The other authors’ disclosures are listed on the original article. Dr. Lavretsky reports no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM JAMA NETWORK OPEN
AAP updates hyperbilirubinemia guideline
Raising phototherapy thresholds and revising risk assessment are among the key changes in the American Academy of Pediatrics’ updated guidelines for managing hyperbilirubinemia in infants 35 weeks’ gestation and older.
“More than 80% of newborn infants will have some degree of jaundice,” Alex R. Kemper, MD, of Nationwide Children’s Hospital, Columbus, Ohio, and coauthors wrote. Careful monitoring is needed to manage high bilirubin concentrations and avoid acute bilirubin encephalopathy (ABE) and kernicterus, a disabling neurologic condition.
The current revision, published in Pediatrics, updates and replaces the 2004 AAP clinical practice guidelines for the management and prevention of hyperbilirubinemia in newborns of at least 35 weeks’ gestation.
The guideline committee reviewed evidence published since the previous guidelines were issued in 2004, and addressed similar issues of prevention, risk assessment, monitoring, and treatment.
A notable change from 2004 was the inclusion of a 2009 recommendation update for “universal predischarge bilirubin screening with measures of total serum bilirubin (TSB) or transcutaneous bilirubin (TcB) linked to specific recommendations for follow-up,” the authors wrote.
In terms of prevention, recommendations include a direct antiglobulin test (DAT) for infants whose mother’s antibody screen was positive or unknown. In addition, exclusive breastfeeding is known to be associated with hyperbilirubinemia; clinicians should continue to support breastfeeding while monitoring for signs of hyperbilirubinemia caused by suboptimal feeding, the authors noted. However, the guidelines recommend against oral supplementation with water or dextrose water to prevent hyperbilirubinemia.
For assessment and monitoring, the guidelines advise the use of total serum bilirubin (TSB) as the definitive test for hyperbilirubinemia to guide phototherapy and escalation of care, including exchange transfusion. “The presence of hyperbilirubinemia neurotoxicity risk factors lowers the threshold for treatment with phototherapy and the level at which care should be escalated,” the authors wrote. They also emphasized the need to consider glucose-6-phosphate dehydrogenase deficiency, a genetic condition that decreases protection against oxidative stress and has been identified as a leading cause of hazardous hyperbilirubinemia worldwide.
The guidelines recommend assessing all infants for jaundice at least every 12 hours after delivery until discharge, with TSB or TcB measured as soon as possible for those with suspected jaundice. The complete guidelines include charts for TSB levels to guide escalation of care. “Blood for TSB can be obtained at the time it is collected for newborn screening tests to avoid an additional heel stick,” the authors noted.
The rate of increase in TSB or TcB, if more than one measure is available, may identify infants at higher risk of hyperbilirubinemia, according to the guidelines, and hospital discharge may need to be delayed for infants when appropriate follow-up is not feasible.
In terms of treatment, new evidence that bilirubin neurotoxicity does not occur until concentrations well above those given in the 2004 guidelines justified raising the treatment thresholds, although by a narrow margin. “With the increased phototherapy thresholds, appropriately following the current guidelines including bilirubin screening during the birth hospitalization and timely postdischarge follow-up is important,” the authors wrote. The new thresholds, outlined in the complete guidelines, are based on gestational age, hyperbilirubinemia neurotoxicity risk factors, and the age of the infant in hours. However, infants may be treated at lower levels, based on individual circumstances, family preferences, and shared decision-making with clinicians. Home-based phototherapy may be used in some infants, but should not be used if there is any question about device quality, timely delivery of the device, or caregivers’ ability to use the device correctly.
“Discontinuing phototherapy is an option when the TSB has decreased by at least 2 mg/dL below the hour-specific threshold at the initiation of phototherapy,” and follow-up should be based on risk of rebound hyperbilirubinemia, according to the guidelines.
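The escalation and stop rules described above lend themselves to a small worked sketch. Everything numeric below is a placeholder invented for illustration, since the real hour-specific thresholds live in the guideline's published charts; only the structure (lookup by gestational age, neurotoxicity risk factors, and age in hours, plus the quoted 2 mg/dL discontinuation rule) follows the text.

```python
# Hypothetical sketch only; threshold values are invented placeholders,
# NOT the AAP's published numbers.
from bisect import bisect_right

# (age_in_hours, phototherapy_threshold_mg_dL) -- placeholder values
THRESHOLDS = {
    # keys: (gestation >= 38 weeks, any neurotoxicity risk factors)
    (True, False):  [(12, 9.0), (24, 12.0), (48, 15.0), (72, 18.0), (96, 20.0)],
    (True, True):   [(12, 7.0), (24, 10.0), (48, 13.0), (72, 15.0), (96, 17.0)],
    (False, False): [(12, 8.0), (24, 11.0), (48, 14.0), (72, 16.0), (96, 18.0)],
    (False, True):  [(12, 6.0), (24, 9.0),  (48, 12.0), (72, 14.0), (96, 15.0)],
}

def phototherapy_threshold(age_h: float, term: bool, risk: bool) -> float:
    """Step-function lookup; the real guideline charts are continuous curves."""
    table = THRESHOLDS[(term, risk)]
    ages = [a for a, _ in table]
    idx = max(bisect_right(ages, age_h) - 1, 0)
    return table[idx][1]

def needs_phototherapy(tsb: float, age_h: float, term: bool, risk: bool) -> bool:
    # Risk factors lower the threshold, so the same TSB crosses it earlier.
    return tsb >= phototherapy_threshold(age_h, term, risk)

def may_stop_phototherapy(tsb_now: float, age_h_at_start: float,
                          term: bool, risk: bool) -> bool:
    # Rule quoted above: stopping is an option once TSB has fallen at least
    # 2 mg/dL below the hour-specific threshold at initiation.
    return tsb_now <= phototherapy_threshold(age_h_at_start, term, risk) - 2.0

print(needs_phototherapy(tsb=13.0, age_h=36, term=True, risk=True))            # True
print(may_stop_phototherapy(tsb_now=7.5, age_h_at_start=36, term=True, risk=True))  # True
```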
“This clinical practice guideline provides indications and approaches for phototherapy and escalation of care and when treatment and monitoring can be safely discontinued,” the authors concluded. However, they noted, clinicians should understand the rationale for the recommendations and combine them with their clinical judgment, including shared decision-making when appropriate.
Updated evidence supports escalating care
The take-home message for pediatricians is that neonatal hyperbilirubinemia is a very common finding and complications are rare, but the condition can have devastating, lifelong consequences, Cathy Haut, DNP, CPNP-AC, CPNP-PC, a pediatric nurse practitioner in Rehoboth Beach, Del., said in an interview.
“Previous guidelines published in 2004 and updated in 2009 included evidence-based recommendations, but additional research was still needed to provide guidance for providers to prevent complications of hyperbilirubinemia,” said Dr. Haut, who was not involved in producing the guidelines.
“New data documenting additional risk factors, the importance of ongoing breastfeeding support, and addressing hyperbilirubinemia as an urgent problem” are additions to prevention methods in the latest published guidelines, she said.
“Acute encephalopathy and kernicterus can result from hyperbilirubinemia with severe and devastating neurologic effects, but are preventable by early identification and treatment,” said Dr. Haut. Therefore, “it is not surprising that the AAP utilized continuing and more recent evidence to support new recommendations. Both maternal and neonatal risk factors have long been considered in the development of neonatal hyperbilirubinemia, but recent recommendations incorporate additional risk factor evaluation and urgency in time to appropriate care. Detailed thresholds for phototherapy and exchange transfusion will benefit the families of full-term infants without other risk factors and escalate care for those neonates with risk factors.”
However, potential barriers to following the guidelines persist, Dr. Haut noted.
“Frequent infant follow-up can be challenging for busy primary care offices with outpatient laboratory results often taking much longer to obtain than in a hospital setting,” she said.
Also, “taking a newborn to the emergency department or an inpatient laboratory can be frightening for families with the risk of illness exposure. Frequent monitoring of serum bilirubin levels is disturbing for parents and inconvenient immediately postpartum,” Dr. Haut explained. “Few practices utilize transcutaneous bilirubin monitoring which may be one method of added screening.”
In addition, “despite the importance of breastfeeding, ongoing support is not readily available for mothers after hospital discharge. A lactation specialist in the office setting can take the burden off providers and add opportunity for family education.”
As for additional research, “continued evaluation of the comparison of transcutaneous bilirubin monitoring and serum levels along with the use of transcutaneous monitoring in facilities outside the hospital setting may be warranted,” Dr. Haut said. “Data collection on incidence and accompanying risk factors of neonates who develop acute hyperbilirubinemia encephalopathy and kernicterus is a long-term study opportunity.”
The guidelines received no external funding. Lead author Dr. Kemper had no financial conflicts to disclose. Dr. Haut had no financial conflicts to disclose and serves on the editorial advisory board of Pediatric News.
FROM PEDIATRICS
Regular exercise appears to slow cognitive decline in MCI
Regular exercise appears to slow cognitive decline in patients with mild cognitive impairment (MCI), new research from the largest study of its kind suggests. Topline results from the EXERT trial showed patients with MCI who participated regularly in either aerobic exercise or stretching/balance/range-of-motion exercises maintained stable global cognitive function over 12 months of follow-up – with no differences between the two types of exercise.
“We’re excited about these findings, because these types of exercises that we’re seeing can protect against cognitive decline are accessible to everyone and therefore scalable to the public,” study investigator Laura Baker, PhD, Wake Forest University School of Medicine, Winston-Salem, N.C., said at a press briefing.
The topline results were presented at the 2022 Alzheimer’s Association International Conference.
No decline
The 18-month EXERT trial was designed as a definitive test of whether exercise can slow cognitive decline in older adults with amnestic MCI, Dr. Baker reported. Investigators enrolled 296 sedentary men and women with MCI (mean age, about 75 years). All were randomly allocated to either an aerobic exercise group (maintaining a heart rate at about 70%-85%) or a stretching and balance group (maintaining a heart rate below 35%).
Both groups exercised four times per week for about 30-40 minutes. In the first 12 months they were supervised by a trainer at the YMCA and then they exercised independently for the final 6 months.
Participants were assessed at baseline and every 6 months. The primary endpoint was change from baseline on the ADAS-Cog-Exec, a validated measure of global cognitive function, at the end of the 12 months of supervised exercise.
During the first 12 months, participants completed over 31,000 sessions of exercise, which is “quite impressive,” Dr. Baker said.
Over the first 12 months, neither the aerobic group nor the stretch/balance group showed a decline on the ADAS-Cog-Exec.
“We saw no group differences, and importantly, no decline after 12 months,” Dr. Baker reported.
Supported exercise is ‘crucial’
To help “make sense” of these findings, Dr. Baker noted that 12-month changes in the ADAS-Cog-Exec for the EXERT intervention groups were also compared with a “usual care” cohort of adults matched for age, sex, education, baseline cognitive status, and APOE4 genotype.
In this “apples-to-apples” comparison, the usual care cohort showed the expected decline or worsening of cognitive function over 12 months on the ADAS-Cog-Exec, but the EXERT exercise groups did not.
Dr. Baker noted that both exercise groups received equal amounts of weekly socialization, which may have contributed to the apparent protective effects on the brain.
A greater volume of exercise in EXERT, compared with other trials, may also be a factor. Each individual participant in EXERT completed more than 100 hours of exercise.
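Those volume figures can be sanity-checked with quick arithmetic on numbers already quoted in the article (296 participants, roughly 31,000 supervised sessions, 30-40 minutes per session); note that the supervised year alone falls short of 100 hours, so that figure presumably also counts the final unsupervised months.

```python
# Quick arithmetic on figures quoted in the article.
participants = 296
supervised_sessions = 31_000                # completed in the first 12 months
per_person = supervised_sessions / participants
print(round(per_person))                    # ~105 sessions per participant

# At 30-40 minutes per session, supervised exercise time per participant:
print(round(per_person * 30 / 60), "to",
      round(per_person * 40 / 60), "hours") # ~52 to 70 hours
# The remaining 6 months of independent exercise plausibly brings the
# total past the 100 hours per participant cited above.
```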
“The take-home message is that an increased amount of either low-intensity or high-intensity exercise for 120-150 minutes per week for 12 months may slow cognitive decline in sedentary older adults with MCI,” Dr. Baker said.
“What’s critical is that this regular exercise must be supported in these older [patients] with MCI. It must be supervised. There has to be some social component,” she added.
In her view, 120 minutes of regular supported exercise for sedentary individuals with MCI “needs to be part of the recommendation for risk reduction.”
Important study
Commenting on the findings, Heather Snyder, PhD, vice president of medical and scientific relations at the Alzheimer’s Association, noted that several studies over the years have suggested that different types of exercise can have benefits on the brain.
“What’s important about this study is that it’s in a population of people that have MCI and are already experiencing memory changes,” Dr. Snyder said.
“The results suggest that engaging in both of these types of exercise may be beneficial for our brain. And given that this is the largest study of its kind in a population of people with MCI, it suggests it’s ‘never too late’ to start exercising,” she added.
Dr. Snyder noted the importance of continuing this work and of following these individuals “over time as well.”
The study was funded by the National Institutes of Health, National Institute on Aging. Dr. Baker and Dr. Snyder have disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM AAIC 2022
Waking up at night could be your brain boosting your memory
We tend to think a good night’s sleep should be uninterrupted, but surprising new research from the University of Copenhagen suggests just the opposite:
The study, done on mice, found that the stress transmitter noradrenaline wakes up the brain many times a night. These “microarousals” were linked to memory consolidation, meaning they help you remember the previous day’s events. In fact, the more “awake” you are during a microarousal, the better the memory boost, suggests the research, which was published in Nature Neuroscience.
“Every time I wake up in the middle of the night now, I think – ah, nice, I probably just had great memory-boosting sleep,” said study author Celia Kjaerby, PhD, an assistant professor at the university’s Center for Translational Neuromedicine.
The findings add insight into what happens in the brain during sleep and may help pave the way for new treatments for people who have sleep disorders.
Waves of noradrenaline
Previous research has suggested that noradrenaline – a hormone that increases during stress but also helps you stay focused – is inactive during sleep. So, the researchers were surprised to see high levels of it in the brains of the sleeping rodents.
“I still remember seeing the first traces showing the brain activity of the norepinephrine stress system during sleep. We could not believe our eyes,” Dr. Kjaerby said. “Everyone had thought the system would be quiet. And now we have found out that it completely controls the microarchitecture of sleep.”
Those noradrenaline levels rise and fall like waves every 30 seconds during non-REM (NREM) sleep. At each “peak” the brain is briefly awake, and at each “valley” it is asleep. Typically, these awakenings are so brief that the sleeping subject does not notice. But the higher the rise, the longer the awakening – and the more likely the sleeper may notice.
During the valleys, or when norepinephrine drops, so-called sleep spindles occur.
“These are short oscillatory bursts of brain activity linked to memory consolidation,” Dr. Kjaerby said. Occasionally there is a “deep valley,” lasting 3-5 minutes, which leads to more sleep spindles. The mice with the greatest number of deep valleys also had the best memories, the researchers noted.
“We have shown that the amount of these super-boosts of sleep spindles, and not REM sleep, defines how well you remember the experiences you had prior to going to sleep,” said Dr. Kjaerby.
Deep valleys were followed by longer awakenings, the researchers observed. So, the longer the valley, the longer the awakening – and the better the memory boost. This means that, though restless sleep is not good, waking up briefly may be a natural part of memory-related sleep phases and may even mean you’ve slept well.
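The valley bookkeeping described in this passage is easy to sketch in code. Below is a toy example on a synthetic trace sampled at 1 Hz; the 30-second oscillation and the 3-5 minute deep-valley duration come from the article, while the amplitude, noise level, and cutoff are arbitrary demo choices, not values from the paper.

```python
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(0)
t = np.arange(1800)                               # 30 minutes sampled at 1 Hz
trace = np.sin(2 * np.pi * t / 30) + 0.3 * rng.standard_normal(t.size)
trace[900:1140] = -2.0 + 0.1 * rng.standard_normal(240)  # a 4-minute deep valley

# Ordinary valleys: troughs of the ~30-second oscillation (peaks of -trace).
valleys, _ = find_peaks(-trace, distance=20, prominence=0.5)
print(valleys.size, "valleys in 30 min")          # on the order of 60

# Deep valleys: contiguous runs below a cutoff lasting 3-5 minutes.
below = np.flatnonzero(trace < -1.5)
runs = np.split(below, np.flatnonzero(np.diff(below) > 1) + 1)
deep = [r for r in runs if 180 <= r.size <= 300]
print(len(deep), "deep valley(s)")                # expect 1
```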
What happens in our brains when we sleep: Piecing it together
The findings fit with previous clinical data showing that we wake up roughly 100-plus times a night, mostly during NREM sleep stage 2 (the spindle-rich sleep stage), Dr. Kjaerby said.
Still, more research on these small awakenings is needed, Dr. Kjaerby said, noting that professor Maiken Nedergaard, MD, another author of this study, has found that the brain cleans up waste products through a rinsing fluid system.
“It remains a puzzle why the fluid system is so active when we sleep,” Dr. Kjaerby said. “We believe these short awakenings could potentially be the key to answering this question.”
A version of this article first appeared on WebMD.com.
FROM NATURE NEUROSCIENCE
Chronically low wages linked to subsequent memory decline
Chronically low wages earned during midlife are linked to subsequent memory decline, new research suggests. In a new analysis of more than 3,000 participants in the Health and Retirement Study, those who sustained low wages in midlife showed significantly faster memory decline than their peers who never earned low wages.
The findings could have implications for future public policy and research initiatives, the investigators noted.
“Our findings, which suggest a pattern of sustained low-wage earning is harmful for cognitive health, [are] broadly applicable to researchers across numerous health disciplines,” said co-investigator Katrina Kezios, PhD, postdoctoral researcher, department of epidemiology, Mailman School of Public Health, Columbia University, New York.
The findings were presented at the 2022 Alzheimer’s Association International Conference.
Growing number of low-wage workers
Low-wage workers make up a growing share of the U.S. labor market. Yet little research has examined the long-term relationship between earning low wages and memory decline.
The current investigators assessed 1992-2016 data from the Health and Retirement Study, a longitudinal survey of nationally representative samples of Americans aged 50 years and older. Study participants are interviewed every 2 years and provide, among other things, information on work-related factors, including hourly wages.
Memory function was measured at each visit from 2004 to 2016 using a memory composite score that included immediate and delayed word recall assessments. For participants who became too impaired to complete cognitive testing, memory was assessed through proxy informants.
On average, participants completed 4.8 memory assessments over the course of the study.
Researchers defined “low wage” as an hourly wage lower than two-thirds of the federal median wage for the corresponding year. They categorized low-wage exposure history as “never,” “intermittent,” or “sustained” on the basis of wages earned from 1992 to 2004.
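A minimal sketch of that exposure coding follows, assuming hypothetical federal median wages (the real per-year medians are not given in the article) and reading “sustained” as low-wage at every survey wave examined.

```python
# Placeholder median wages per survey year; real analyses would use the
# actual federal median hourly wage for each year.
FEDERAL_MEDIAN = {1992: 10.00, 1998: 11.50, 2004: 13.00}
LOW_WAGE_CUTOFF = {yr: 2 / 3 * m for yr, m in FEDERAL_MEDIAN.items()}

def classify_exposure(hourly_wages: dict) -> str:
    """hourly_wages maps survey year -> participant's hourly wage."""
    low = [wage < LOW_WAGE_CUTOFF[yr] for yr, wage in hourly_wages.items()]
    if all(low):
        return "sustained"
    if any(low):
        return "intermittent"
    return "never"

print(classify_exposure({1992: 6.50, 1998: 7.00, 2004: 8.00}))    # sustained
print(classify_exposure({1992: 6.50, 1998: 12.00, 2004: 14.00}))  # intermittent
print(classify_exposure({1992: 12.00, 1998: 14.00, 2004: 16.00})) # never
```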
The current analysis included 3,803 participants, 1,913 of whom were men. All participants were born from 1936 to 1941. In 2004, the average age was 65 years, and the mean memory score was 1.15 standard units.
The investigators adjusted for factors that could confound the relationship between wages and cognition, including the participant’s education, parental education, household wealth, and marital status. In later models, they also adjusted for whether the participant’s occupation was low skilled.
Cognitive harm
The confounder-adjusted annual rate of memory decline among workers who never earned low wages was –0.12 standard units (95% confidence interval, –0.14 to –0.10).
Compared with these workers, memory decline was significantly faster among participants with sustained low wage–earning during midlife (beta for interaction between time and exposure group, –0.012; 95% CI, –0.02 to 0.01), corresponding to an annual rate of –0.13 standard units.
Put another way, the cognitive aging experienced by workers earning low wages over a 10-year period was equivalent to what workers who never earned low wages would experience over 11 years.
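That equivalence follows directly from the two annual rates reported above; a two-line check:

```python
# -0.13 units/year (sustained low wages) vs -0.12 units/year (never):
decline_over_10_years = 10 * 0.13    # 1.3 standard units lost in 10 years
print(decline_over_10_years / 0.12)  # ~10.8 years at the never-low-wage rate,
                                     # i.e. roughly 11 years
```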
Although similar associations were found for men and women, the association was stronger in magnitude for men – a finding Dr. Kezios said was somewhat surprising. She noted that women are commonly more at risk for dementia than men.
However, she advises caution in interpreting this finding, as there were so few men in the sustained low-wage group. “Women disproportionately make up the group of workers earning low wages,” she said.
The negative coefficient found for those who persistently earned low wages was also observed for those who intermittently earned low wages, but the latter association was not statistically significant.
“We can speculate or hypothesize the cumulative effect of earning low wages at each exposure interval produces more cognitive harm than maybe earning low wages at some time points over that exposure period,” said Dr. Kezios.
A sensitivity analysis that examined wage earning at the same ages but in two different birth cohorts showed similar results for the two groups. When researchers removed self-employed workers from the study sample, the same association between sustained low wages and memory decline was found.
“Our findings held up, which gave us a little more reassurance that what we were seeing is at least signaling there might be something there,” said Dr. Kezios.
She described the study as a “first pass” for documenting the harmful cognitive effects of consistently earning low wages.
It would be interesting, she said, to now determine whether there’s a “dose effect” for having a low salary. However, other studies with different designs would be needed to determine at what income level cognitive health starts to be protected and the impact of raising the minimum wage, she added.
Unique study
Heather Snyder, PhD, vice president of medical and scientific relations, Alzheimer’s Association, said the study was unique. “I don’t think we have seen anything like this before,” said Dr. Snyder.
The study, which links sustained low-wage earning in midlife to later memory decline, “is looking beyond some of the other measures we’ve seen when we looked at socioeconomic status,” she noted.
The results “beg the question” of whether people who earn low wages have less access to health care, she added.
“We should think about how to ensure access and equity around health care and around potential ways that may address components of risk individuals have during their life course,” Dr. Snyder said.
She noted that the study provides a “start” at considering potential policies to address the impact of sustained low wages on overall health, particularly cognitive health, throughout life.
The study had no outside funding. Dr. Kezios has reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
, new research suggests. In a new analysis of more than 3,000 participants in the Health and Retirement Study, those who sustained low wages in midlife showed significantly faster memory decline than their peers who never earned low wages.
The findings could have implications for future public policy and research initiatives, the investigators noted.
“Our findings, which suggest a pattern of sustained low-wage earning is harmful for cognitive health, [are] broadly applicable to researchers across numerous health disciplines,” said co-investigator Katrina Kezios, PhD, postdoctoral researcher, department of epidemiology, Mailman School of Public Health, Columbia University, New York.
The findings were presented at the 2022 Alzheimer’s Association International Conference.
Growing number of low-wage workers
Earning sustained low wages during midlife may be linked to accelerated memory decline in later life, new research suggests. In a new analysis of more than 3,000 participants in the Health and Retirement Study, those who sustained low wages in midlife showed significantly faster memory decline than their peers who never earned low wages.
The findings could have implications for future public policy and research initiatives, the investigators noted.
“Our findings, which suggest a pattern of sustained low-wage earning is harmful for cognitive health, [are] broadly applicable to researchers across numerous health disciplines,” said co-investigator Katrina Kezios, PhD, postdoctoral researcher, department of epidemiology, Mailman School of Public Health, Columbia University, New York.
The findings were presented at the 2022 Alzheimer’s Association International Conference.
Growing number of low-wage workers
Low-wage workers make up a growing share of the U.S. labor market. Yet little research has examined the long-term relationship between earning low wages and memory decline.
The current investigators assessed 1992-2016 data from the Health and Retirement Study, a longitudinal survey of nationally representative samples of Americans aged 50 years and older. Study participants are interviewed every 2 years and provide, among other things, information on work-related factors, including hourly wages.
Memory function was measured at each visit from 2004 to 2016 using a memory composite score that included immediate and delayed word recall assessments. For participants who became too impaired to complete cognitive testing, memory assessments were obtained from proxy informants.
On average, participants completed 4.8 memory assessments over the course of the study.
Researchers defined “low wage” as an hourly wage lower than two-thirds of the federal median wage for the corresponding year. They categorized low-wage exposure history as “never,” “intermittent,” or “sustained” on the basis of wages earned from 1992 to 2004.
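As a concrete illustration of this exposure definition, below is a minimal Python sketch of how such a classification could be computed. The median-wage figures and the rule used for “sustained” (low wage at every observed wave) are illustrative assumptions, not the study’s actual data or parameters.

```python
# Illustrative sketch of the study's exposure definition: "low wage" =
# hourly wage below two-thirds of the federal median wage for that year.
# The median-wage values and the "sustained" rule (low wage at every
# observed wave) are assumptions for illustration, not the study's data.

# Hypothetical federal median hourly wages by survey year (not real data)
FEDERAL_MEDIAN_WAGE = {1992: 10.00, 1994: 10.50, 1996: 11.00, 1998: 11.75,
                       2000: 12.50, 2002: 13.00, 2004: 13.75}

def is_low_wage(hourly_wage: float, year: int) -> bool:
    """Low wage = below two-thirds of the federal median for that year."""
    return hourly_wage < (2 / 3) * FEDERAL_MEDIAN_WAGE[year]

def exposure_history(wages_by_year: dict[int, float]) -> str:
    """Categorize a worker's 1992-2004 wage history."""
    flags = [is_low_wage(w, y) for y, w in sorted(wages_by_year.items())]
    if not any(flags):
        return "never"
    if all(flags):
        return "sustained"
    return "intermittent"

# Example: low wage in some waves but not all -> "intermittent"
print(exposure_history({1992: 6.50, 1994: 7.20, 1996: 9.80, 1998: 10.50}))
```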
The current analysis included 3,803 participants, 1,913 of whom were men. All participants were born from 1936 to 1941. In 2004, the average age was 65 years, and the mean memory score was 1.15 standard units.
The investigators adjusted for factors that could confound the relationship between wages and cognition, including the participant’s education, parental education, household wealth, and marital status. In a later model, they also adjusted for whether the participant’s occupation was low skill.
Cognitive harm
The confounder-adjusted annual rate of memory decline among workers who never earned low wages was –0.12 standard units (95% confidence interval, –0.14 to –0.10).
Compared with these workers, memory decline was significantly faster among participants with sustained low-wage earning during midlife (beta for interaction between time and exposure group, –0.012; 95% CI, –0.02 to –0.01), corresponding to an annual rate of –0.13 standard units.
Put another way, the cognitive aging experienced by workers earning low wages over a 10-year period was equivalent to what workers who never earned low wages would experience over 11 years.
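That equivalence follows directly from the two annual rates reported above; a quick arithmetic check:

```python
# Quick arithmetic check of the "10 years vs. 11 years" equivalence,
# using the reported annual decline rates (standard units per year).
rate_never = 0.12                    # workers who never earned low wages
rate_sustained = 0.12 + 0.012        # add the reported interaction term (~0.13)

decline_10yr_sustained = 10 * rate_sustained     # total decline over 10 years
years_equivalent = decline_10yr_sustained / rate_never
print(round(years_equivalent, 1))    # ~11.0 years at the never-low-wage rate
```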
Although similar associations were found for men and women, the association was stronger in magnitude for men – a finding Dr. Kezios said was somewhat surprising. She noted that women generally have a higher risk for dementia than men.
However, she advises caution in interpreting this finding, as there were so few men in the sustained low-wage group. “Women disproportionately make up the group of workers earning low wages,” she said.
The negative coefficient found for those who persistently earned low wages was also observed for those who intermittently earned low wages, but the latter association was not statistically significant.
“We can speculate or hypothesize the cumulative effect of earning low wages at each exposure interval produces more cognitive harm than maybe earning low wages at some time points over that exposure period,” said Dr. Kezios.
A sensitivity analysis that examined wage earning at the same ages but in two different birth cohorts showed similar results for the two groups. When researchers removed self-employed workers from the study sample, the same association between sustained low wages and memory decline was found.
“Our findings held up, which gave us a little more reassurance that what we were seeing is at least signaling there might be something there,” said Dr. Kezios.
She described the study as a “first pass” for documenting the harmful cognitive effects of consistently earning low wages.
It would be interesting, she said, to now determine whether there’s a “dose effect” of earning a low salary. However, other studies with different designs would be needed to determine the income level at which cognitive health starts to be protected and to assess the impact of raising the minimum wage, she added.
Unique study
Heather Snyder, PhD, vice president of medical and scientific relations, Alzheimer’s Association, said the study was unique. “I don’t think we have seen anything like this before,” said Dr. Snyder.
The study, which links sustained low-wage earning in midlife to later memory decline, “is looking beyond some of the other measures we’ve seen when we looked at socioeconomic status,” she noted.
The results “beg the question” of whether people who earn low wages have less access to health care, she added.
“We should think about how to ensure access and equity around health care and around potential ways that may address components of risk individuals have during their life course,” Dr. Snyder said.
She noted that the study provides a “start” at considering potential policies to address the impact of sustained low wages on overall health, particularly cognitive health, throughout life.
The study had no outside funding. Dr. Kezios has reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM AAIC 2022
More evidence that ultraprocessed foods are detrimental to the brain
Results from the Brazilian Longitudinal Study of Adult Health (ELSA-Brasil), which included participants aged 35 and older, showed that higher intake of ultraprocessed foods (UPFs) was significantly associated with a faster rate of decline in both executive and global cognitive function.
“Based on these findings, doctors might counsel patients to prefer cooking at home [and] choosing fresher ingredients instead of buying ready-made meals and snacks,” said coinvestigator Natalia Gonçalves, PhD, University of São Paulo, Brazil.
Presented at the Alzheimer’s Association International Conference, the findings align with those from a recent study in Neurology. That study linked a diet high in UPFs to an increased risk for dementia.
Increasing worldwide consumption
UPFs are industrially manufactured products that are packed with added ingredients, including sugar, fat, and salt, and are low in protein and fiber. Examples of UPFs include soft drinks, chips, chocolate, candy, ice cream, sweetened breakfast cereals, packaged soups, chicken nuggets, hot dogs, fries, and many more.
Over the past 30 years, there has been a steady increase in consumption of UPFs worldwide. They are thought to induce systemic inflammation and oxidative stress and have been linked to a variety of ailments, such as overweight/obesity, cardiovascular disease, and cancer.
UPFs may also be a risk factor for cognitive decline, although data are scarce as to their effects on the brain.
To investigate, Dr. Gonçalves and colleagues evaluated longitudinal data on 10,775 adults (mean age, 50.6 years; 56% women; 55% White) who participated in the ELSA-Brasil study. They were evaluated in three waves (2008-2010, 2012-2014, and 2017-2019).
Information on diet was obtained via food frequency questionnaires and included information regarding consumption of unprocessed foods, minimally processed foods, and UPFs.
Participants were grouped according to UPF consumption quartiles (lowest to highest). Cognitive performance was evaluated by use of a standardized battery of tests.
Significant decline
Using linear mixed effects models that were adjusted for sociodemographic, lifestyle, and clinical variables, the investigators assessed the association of dietary UPFs as a percentage of total daily calories with cognitive performance over time.
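For readers who want a sense of what such a model looks like in practice, here is a minimal sketch using Python’s statsmodels. The variable names (global_cognition, upf_quartile, subject_id), the input file, and the covariate set are placeholders, not the investigators’ actual specification.

```python
# Minimal sketch of a linear mixed-effects model of the kind described:
# cognitive score regressed on time, UPF quartile, and their interaction,
# with a random intercept per participant. All names are placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("elsa_brasil_cognition.csv")  # hypothetical input file

model = smf.mixedlm(
    "global_cognition ~ years * C(upf_quartile) + age + sex + education",
    data=df,
    groups=df["subject_id"],   # random intercept for each participant
)
result = model.fit()
# The years:quartile interaction coefficients estimate how much faster
# (or slower) each quartile declines relative to quartile 1.
print(result.summary())
```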
During a median follow-up of 8 years, UPF intake in quartiles 2 to 4 (vs. quartile 1) was associated with a significantly faster decline in global cognition (P = .003) and executive function (P = .015).
“Participants who reported consumption of more than 20% of daily calories from ultraprocessed foods had a 28% faster rate of global cognitive decline and a 25% faster decrease of the executive function compared to those who reported eating less than 20% of daily calories from ultraprocessed foods,” Dr. Gonçalves reported.
“Considering a person who eats a total of 2,000 kcal per day, 20% of daily calories from ultraprocessed foods are about two 1.5-ounce bars of KitKat, or five slices of bread, or about a third of an 8.5-ounce package of chips,” she explained.
Dr. Gonçalves noted that the reasons UPFs may harm the brain remain a “very relevant but not yet well-studied topic.”
Hypotheses include secondary effects from cerebrovascular lesions or chronic inflammation processes. More studies are needed to investigate the possible mechanisms that might explain the harm of UPFs to the brain, she said.
‘Troubling but not surprising’
Commenting on the study, Percy Griffin, PhD, director of scientific engagement for the Alzheimer’s Association, said there is “growing evidence that what we eat can impact our brains as we age.”
He added that many previous studies have suggested it is best for the brain to eat a heart-healthy, balanced diet that is low in processed foods and high in whole, nutritious foods, such as vegetables and fruits.
“These new data from the Alzheimer’s Association International Conference suggest eating a large amount of ultraprocessed food can significantly accelerate cognitive decline,” said Dr. Griffin, who was not involved with the research.
He noted that an increase in the availability and consumption of fast foods, processed foods, and UPFs is due to a number of socioeconomic factors, including low access to healthy foods, less time to prepare foods from scratch, and an inability to afford whole foods.
“Ultraprocessed foods make up more than half of American diets. It’s troubling but not surprising to see new data suggesting these foods can significantly accelerate cognitive decline,” Dr. Griffin said.
“The good news is there are steps we can take to reduce risk of cognitive decline as we age. These include eating a balanced diet, exercising regularly, getting good sleep, staying cognitively engaged, protecting from head injury, not smoking, and managing heart health,” he added.
Past research has suggested that the greatest benefit is from engaging in combinations of these lifestyle changes and that they are beneficial at any age, he noted.
“Even if you begin with one or two healthful actions, you’re moving in the right direction. It’s never too early or too late to incorporate these habits into your life,” Dr. Griffin said.
The study had no specific funding. Dr. Gonçalves and Dr. Griffin have reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM AAIC 2022
Racism tied to cognition in middle-aged, elderly
It is generally understood that racism, whether structural or personal, harms the well-being of those who experience it: it damages health and contributes to ethnic inequality.
That was the fundamental message behind two studies presented at a press conference at the Alzheimer’s Association International Conference.
“We know that there are communities like black African Americans and Hispanic Latinos who are at greater risk for developing Alzheimer’s or another dementia,” said Carl Hill, PhD, who served as a moderator during the press conference. He pointed out that the genetic and lifestyle factors linked to dementia tell only part of the story. “It’s important that the science also examines the unique experiences of those at greater risk for dementia in our society,” said Dr. Hill, who is the Alzheimer’s Association’s Chief Diversity, Equity and Inclusion Officer.
Racism, memory, and cognition in middle-aged patients
Jennifer J. Manly, PhD, professor of neuropsychology at Columbia University, New York, presented a study of experience of racism and memory scores among a highly diverse, middle-aged cohort.
“There’s little understanding of how the multiple levels of racism – including intrapersonal, institutional, and structural racism – influence cognitive aging and dementia risk,” Dr. Manly said during the press conference.
Among 1,095 participants, 19.5% were non-Latinx White (61% female, mean age 57), 26.0% were non-Latinx Black (63% female, mean age 56), 32.3% were English-speaking Latinx (66% female, mean age 50), and 21.2% were Spanish-speaking Latinx (68% female, mean age 58).
The researchers used the Everyday Discrimination (ED) scale to measure experience of individual racism, the Major Discrimination (MD) scale to measure experience of institutional racism, and residential segregation of the census block group of an individual’s parents to measure structural racism. Outcome measures included the Digit Span to assess attention and working memory, and the Selective Reminding Test to assess episodic memory.
The study found a clear association between racism and cognition. “The association of interpersonal racism to memory corresponds to 3 years of chronological age, and was driven by non-Hispanic black participants. Next, there was a reliable relationship between institutional racism and memory scores among non-Hispanic black participants, such that each reported civil rights violation corresponded to the effect of about 4.5 years of age on memory,” said Dr. Manly.
“The bottom line is that our results suggest that exposure to racism is a substantial driver of later life memory function, even in middle age, and especially for Black people,” Dr. Manly added.
The results should alert physicians to the complexities of racism and its impact. “Health providers need to be aware that many accumulated risks are historical and structural, and not controlled by the individual. Maybe more importantly, the medical system itself may perpetuate discriminatory experiences that contribute to worse health,” said Dr. Manly.
Latinx concerns
Also at the press conference, Adriana Perez, PhD, emphasized the challenges that Spanish-speaking Latinxs have with health care. Just 5%-7% of nurses are Latinx. “The same could be said for physicians, for clinical psychologists ... as you look at the really critical positions to address brain health equity, we are not represented there,” said Dr. Perez, an assistant professor and senior fellow at the University of Pennsylvania School of Nursing in Philadelphia.
She also pointed out that Latinx representation in clinical trials is very low, even though surveys performed by the Alzheimer’s Association show that this population values medical science and is willing to participate. In fact, 85% said they would participate if invited. The trouble is that many clinical trial announcements state that participants must speak English. Even the many Latinos who are bilingual may be put off by that wording: “That is a message that you’re not invited. That’s how it’s perceived,” said Dr. Perez.
Racism and cognition in the elderly
At the press conference, Kristen George, PhD, presented results from a study of individuals over age 90. “Racial disparities in dementia have been well characterized, particularly among those people who are aged 65 and older, but we don’t know very much about the oldest old individuals who are aged 90 and older. This group is one of the fastest growing segments of the population, and it’s becoming increasingly diverse,” said Dr. George, assistant professor of epidemiology at the University of California, Davis.
The group enrolled 445 Asian, Black, Latinx, White, and multiracial individuals, all members of Kaiser Permanente Northern California, with a mean age of 92.7 years. The researchers used the Major Experiences of Discrimination Scale to assess discrimination.
The researchers divided participants into three classes based on gender, race, and responses to the 10-item scale. Class 1 consisted largely of White men who reported workplace discrimination, with an average of two major discrimination experiences. Class 2 was made up of White women and non-White participants who reported little or no discrimination, with an average of zero experiences. Class 3 consisted entirely of non-White participants, who reported a mean of four discrimination experiences.
Using class 2 as a reference, executive function was better among class 1 individuals (beta = 0.28; 95% CI, 0.03-0.52) but there was no significant difference between class 3 and class 2. Class 1 had better baseline semantic memory than class 2 (beta = 0.33; 95% CI, 0.07-0.58), and those in class 3 performed significantly worse than class 2 (beta = –0.24; 95% CI, –0.48 to –0.00). There were no between-group differences in baseline verbal or episodic memory.
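To illustrate how between-class differences with class 2 as the reference can be estimated, here is a minimal sketch using treatment (dummy) coding in a regression; the variable names, input file, and adjustment set are illustrative assumptions, not the study’s actual model.

```python
# Sketch of estimating class differences against a reference group via
# treatment (dummy) coding. All names are placeholders for illustration.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("discrimination_classes.csv")  # hypothetical input file

# C(..., Treatment(reference=2)) makes class 2 the comparison group, so
# each class coefficient is that class's difference from class 2.
model = smf.ols(
    "semantic_memory ~ C(discrimination_class, Treatment(reference=2)) + age",
    data=df,
).fit()
print(model.params)      # betas relative to class 2
print(model.conf_int())  # 95% confidence intervals, as reported above
```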
Dr. Perez, Dr. Manly, Dr. George, and Dr. Hill have no relevant financial disclosures.
FROM AAIC 2022
COVID smell loss tops disease severity as a predictor of long-term cognitive impairment
Persistent loss of smell after COVID-19 may be a stronger predictor of long-term cognitive impairment than the severity of the initial illness, preliminary results of new research suggest.
The findings provide important insight into the long-term cognitive impact of COVID-19, said study investigator Gabriela Gonzalez-Alemán, PhD, professor at Pontifical Catholic University of Argentina, Buenos Aires.
The more information that can be gathered on factors increasing risks for this cognitive impact, “the better we can track it and begin to develop methods to prevent it,” she said.
The findings were presented at the Alzheimer’s Association International Conference.
Memory, attention problems
SARS-CoV-2 has infected more than 570 million people worldwide, and infection may result in long-term sequelae, including neuropsychiatric symptoms, said Dr. Gonzalez-Alemán.
In older adults, COVID-19 sequelae may resemble early Alzheimer’s disease, and the two conditions may share risk factors and blood biomarkers.
The new study highlighted 1-year results from a large, prospective cohort study from Argentina. Researchers used measures recommended by the Alzheimer’s Association Consortium on Chronic Neuropsychiatric Sequelae of SARS-CoV-2 infection (CNS SC2) to evaluate the long-term consequences of COVID-19 in older adults.
Harmonizing definitions and methodologies for studying COVID-19’s impact on the brain allows consortium members to compare study results, said Dr. Gonzalez-Alemán.
The investigators used the health registry in the province of Jujuy, situated in the extreme northwestern part of Argentina. The registry includes all SARS-CoV-2 testing data for the entire region.
The investigators randomly invited adults aged 60 years and older from the registry to participate in the study. The current analysis included 766 adults aged 55-95 years (mean age 66.9 years; 57% female) with an average of 10.4 years of education. The education system in Argentina includes 12 years of school before university.
Investigators stratified subjects by polymerase chain reaction testing status. Of the total, 88.4% were infected with COVID and 11.6% were controls (subjects without COVID).
The neurocognitive assessment covered four cognitive domains (memory, attention, language, and executive function) and included an olfactory test that determined the degree of olfactory dysfunction. Cognitive impairment was defined as a z score below –2.
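A z score below –2 simply means scoring more than two standard deviations below the normative mean; here is a minimal sketch of that rule, with made-up normative values:

```python
# Minimal sketch of the impairment rule described above: a raw domain
# score is converted to a z score against normative data, and z < -2
# (more than two standard deviations below the mean) counts as
# impairment. The normative mean and SD are made-up values.

NORM_MEAN, NORM_SD = 50.0, 10.0   # hypothetical normative mean and SD

def z_score(raw_score: float) -> float:
    return (raw_score - NORM_MEAN) / NORM_SD

def is_impaired(raw_score: float) -> bool:
    return z_score(raw_score) < -2.0

print(is_impaired(28.0))  # z = -2.2 -> True (impaired)
print(is_impaired(45.0))  # z = -0.5 -> False
```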
Researchers divided participants into groups according to cognitive performance. These included normal cognition, memory-only impairment (single domain; 11.7%), impairment in attention and executive function without memory impairment (two domains; 8.3%), and multiple domain impairment (11.6%).
“Our participants showed a predominance of memory impairment as would be seen in Alzheimer’s disease,” noted Dr. Gonzalez-Alemán. “And a large group showed a combination of memory and attention problems.”
About 40% of the study sample – but no controls – had olfactory dysfunction.
“All the subjects that had a severe cognitive impairment also had anosmia [loss of smell],” said Dr. Gonzalez-Alemán. “We established an association between olfactory dysfunction and cognitive performance and impairment.”
The analysis showed that severity of anosmia, but not clinical status, significantly predicted cognitive impairment. “So, anosmia could be a good predictor of cognitive impairment after COVID-19 infection,” said Dr. Gonzalez-Alemán.
For individuals older than 60 years, cognitive impairment can be persistent, as can be olfactory dysfunction, she added.
Results of a 1-year phone survey showed that 71.8% of subjects had received three vaccine doses and 24.9% had received two doses. Reinfection occurred in 12.5% of those who had received three doses and in 23.3% of those who had received two.
Longest follow-up to date
Commenting on the research, Heather Snyder, PhD, vice president, medical and scientific relations at the Alzheimer’s Association, noted the study is “the longest follow-up we’ve seen” looking at the connection between persistent loss of smell and cognitive changes after a COVID-19 infection.
The study included a “fairly large” sample size and was “unique” in that it was set up in a part of the country with centralized testing, said Dr. Snyder.
The Argentinian group is among the most advanced of those connected to the CNS SC2, said Dr. Snyder.
Members of this Alzheimer’s Association consortium, said Dr. Snyder, regularly share updates of ongoing studies, which are at different stages and looking at various neuropsychiatric impacts of COVID-19. It is important to bring these groups together to determine what those impacts are “because no one group will be able to do this on their own,” she said. “We saw pretty early on that some individuals had changes in the brain, or changes in cognition, and loss of sense of smell or taste, which indicates there’s a connection to the brain.”
However, she added, “there’s still a lot we don’t know” about this connection.
The study was funded by the Alzheimer’s Association and FULTRA.
A version of this article first appeared on Medscape.com.
FROM AAIC 2022