A ‘promising target’ to improve outcomes in late-life depression
A new study sheds light on the neurologic underpinnings of late-life depression (LLD) with apathy and its frequently poor response to treatment.
Investigators headed by Faith Gunning, PhD, of the Institute of Geriatric Psychiatry, Weill Cornell Medicine, New York, analyzed baseline and posttreatment brain MRIs and functional MRIs (fMRIs) of older adults with depression who participated in a 12-week open-label nonrandomized clinical trial of escitalopram. Participants had undergone clinical and cognitive assessments.
Disturbances were found in resting state functional connectivity (rsFC) between the salience network (SN) and other large-scale networks that support goal-directed behavior, especially in patients with depression who also had features of apathy.
“This study suggests that, among older adults with depression, distinct network abnormalities may be associated with apathy and poor response to first-line pharmacotherapy and may serve as promising targets for novel interventions,” the investigators write.
The study was published online in JAMA Network Open.
A leading cause of disability
LLD is a “leading cause of disability and medical morbidity in older adulthood,” with one-third to one-half of patients with LLD also suffering from apathy, the authors write.
Older adults with depression and comorbid apathy have poorer outcomes, including lower remission rates and poorer response to first-line antidepressants, compared with those with LLD who do not have apathy.
Despite the high prevalence of apathy in people with depression, “little is known about its optimal treatment and, more broadly, about the brain-based mechanisms of apathy,” the authors note.
An “emerging hypothesis” points to a compromised SN and its large-scale connections as a link between apathy and poor treatment response in LLD.
The SN (which includes the insula and the dorsal anterior cingulate cortex) “attributes motivational value to a stimulus” and “dynamically coordinates the activity of other large-scale networks, including the executive control network and default mode network (DMN).”
Preliminary studies of apathy in patients with depression report reduced volume in structures of the SN and suggest disruption in functional connectivity among the SN, DMN, and the executive control network; but the mechanisms linking apathy to poor antidepressant response in LLD “are not well understood.”
“Connectometry” is a “novel approach to diffusion MRI analysis that quantifies the local connectome of white matter pathways.” It has been used alongside resting-state imaging, but it had not previously been applied to the study of apathy.
The researchers investigated the functional connectivity of the SN, hypothesizing that alterations in connectivity among key nodes of the SN and other core circuits that modulate goal-directed behavior (DMN and the executive control network) were implicated in individuals with depression and apathy.
They applied connectometry to “identify pathway-level disruptions in structural connectivity,” hypothesizing that compromise of frontoparietal and frontolimbic pathways would be associated with apathy in patients with LLD.
They also wanted to know whether apathy-related network abnormalities were associated with antidepressant response after 12 weeks of pharmacotherapy with the selective serotonin reuptake inhibitor escitalopram.
Emerging model
The study included 40 older adults (65% women; mean [SD] age, 70.0 [6.6] years) with DSM-IV–diagnosed major depressive disorder (without psychotic features) who were drawn from a single-group, open-label escitalopram treatment trial.
The Hamilton Depression Rating Scale (HAM-D) was used to assess depression, and the Apathy Evaluation Scale was used to assess apathy. On the Apathy Evaluation Scale, a score greater than 40.5 represents “clinically significant apathy.” Participants completed these assessments at baseline and after 12 weeks of escitalopram treatment.
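In practical terms, the apathy classification reduces to a single cutoff comparison. A minimal sketch in Python (the 40.5 cutoff is from the study; the function name and example scores are hypothetical):

```python
# The 40.5 Apathy Evaluation Scale cutoff is from the study;
# the function name and example scores are illustrative only.

APATHY_CUTOFF = 40.5

def clinically_significant_apathy(aes_score: float) -> bool:
    """Return True when an Apathy Evaluation Scale score exceeds 40.5."""
    return aes_score > APATHY_CUTOFF

# Hypothetical participant assessed at baseline and after 12 weeks:
print(clinically_significant_apathy(44.0))  # True
print(clinically_significant_apathy(38.0))  # False
```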
They also completed a battery of neuropsychological tests to assess cognition and underwent MRI. fMRI was used to map group differences in rsFC of the SN, and diffusion connectometry was used to “evaluate pathway-level disruptions in structural connectivity.”
Of the participants, 20 had clinically significant apathy. There were no differences in age, sex, educational level, or the severity of depression at baseline between those who did and those who did not have apathy.
Compared with participants with depression but not apathy, those with depression and comorbid apathy had lower rsFC between salience network seeds and the dorsolateral prefrontal cortex (DLPFC), premotor cortex, midcingulate cortex, and paracentral lobule.
They also had greater rsFC in the lateral temporal cortex and temporal pole (z > 2.7; Bonferroni-corrected threshold of P < .0125).
Additionally, participants with apathy had lower structural connectivity in the splenium, cingulum, and fronto-occipital fasciculus, compared with those without apathy (t > 2.5; false discovery rate–corrected P = .02).
Of the 27 participants who completed escitalopram treatment, 16 (59%) achieved remission (defined as an HAM-D score <10). Participants with apathy had a poorer response to escitalopram treatment.
Lower insula-DLPFC/midcingulate cortex rsFC was associated with less improvement in depressive symptoms (HAM-D percentage change, beta [df] = .588 [26]; P = .001) as well as a greater likelihood that the participant would not achieve remission after treatment (odds ratio, 1.041; 95% confidence interval, 1.003-1.081; P = .04).
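To put the odds ratio in concrete terms, the sketch below applies it to a baseline probability of non-remission. The 1.041 value and the roughly 41% non-remission rate (11 of 27 completers) come from the article; the predictor's units are not reported, so the 10-unit rsFC difference is a purely hypothetical illustration of the arithmetic:

```python
# Odds-ratio arithmetic only. OR = 1.041 and the non-remission rate
# (11 of 27 completers) are from the article; the rsFC predictor's
# units are not reported, so the 10-unit difference is hypothetical.

def apply_odds_ratio(p_baseline: float, odds_ratio: float, units: float) -> float:
    """Convert probability to odds, scale by OR**units, convert back."""
    odds = p_baseline / (1.0 - p_baseline)
    odds *= odds_ratio ** units
    return odds / (1.0 + odds)

p0 = 11 / 27  # observed non-remission rate among completers (~0.41)
print(round(apply_odds_ratio(p0, 1.041, units=10), 2))  # about 0.51
```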
In regression models, lower insula-DLPFC/midcingulate cortex rsFC was found to be a mediator of the association between baseline apathy and persistence of depression.
The SN findings were also relevant to cognition. Lower dorsal anterior cingulate-DLPFC/paracentral rsFC was found to be associated with residual cognitive difficulties on measures of attention and executive function (beta [df] = .445 [26] and beta [df] = .384 [26], respectively; for each, P = .04).
“These findings support an emerging model of apathy, which proposes that apathy may arise from dysfunctional interactions among core networks (that is, SN, DMN, and executive control) that support motivated behavior,” the investigators write.
“This may cause a failure of network integration, leading to difficulties with salience processing, action planning, and behavioral initiation that manifests clinically as apathy,” they conclude.
Limitations the investigators note include the lack of longitudinal follow-up after acute treatment and a “relatively limited neuropsychological battery.” As a result, they could not “establish the persistence of treatment differences nor the specificity of cognitive associations.”
The investigators add that “novel interventions that modulate interactions among affected circuits may help to improve clinical outcomes in this distinct subgroup of older adults with depression, for whom few effective treatments exist.”
Commenting on the study, Helen Lavretsky, MD, professor of psychiatry in residence and director of the Late-Life Mood, Stress, and Wellness Research Program and the Integrative Psychiatry Clinic, Jane and Terry Semel Institute for Neuroscience and Human Behavior, University of California, Los Angeles, said the findings “can be used in future studies targeting apathy and the underlying neural mechanisms of brain connectivity.” Dr. Lavretsky was not involved with the study.
The study was supported by grants from the National Institute of Mental Health. Dr. Gunning reported receiving grants from the National Institute of Mental Health during the conduct of the study and grants from Akili Interactive. The other authors’ disclosures are listed on the original article. Dr. Lavretsky reports no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM JAMA NETWORK OPEN
AAP updates hyperbilirubinemia guideline
Raising phototherapy thresholds and revising risk assessment are among the key changes in the American Academy of Pediatrics’ updated guidelines for managing hyperbilirubinemia in infants 35 weeks’ gestation and older.
“More than 80% of newborn infants will have some degree of jaundice,” Alex R. Kemper, MD, of Nationwide Children’s Hospital, Columbus, Ohio, and coauthors wrote. Careful monitoring is needed to manage high bilirubin concentrations and avoid acute bilirubin encephalopathy (ABE) and kernicterus, a disabling neurologic condition.
The current revision, published in Pediatrics, updates and replaces the 2004 AAP clinical practice guidelines for the management and prevention of hyperbilirubinemia in newborns of at least 35 weeks’ gestation.
The guideline committee reviewed evidence published since the previous guidelines were issued in 2004, and addressed similar issues of prevention, risk assessment, monitoring, and treatment.
A notable change from 2004 was the inclusion of a 2009 recommendation update for “universal predischarge bilirubin screening with measures of total serum bilirubin (TSB) or transcutaneous bilirubin (TcB) linked to specific recommendations for follow-up,” the authors wrote.
In terms of prevention, recommendations include a direct antiglobulin test (DAT) for infants whose mother’s antibody screen was positive or unknown. In addition, exclusive breastfeeding is known to be associated with hyperbilirubinemia, but clinicians should support breastfeeding while monitoring for hyperbilirubinemia caused by suboptimal feeding, the authors noted. However, the guidelines recommend against oral supplementation with water or dextrose water to prevent hyperbilirubinemia.
For assessment and monitoring, the guidelines advise the use of TSB as the definitive test for hyperbilirubinemia to guide phototherapy and escalation of care, including exchange transfusion. “The presence of hyperbilirubinemia neurotoxicity risk factors lowers the threshold for treatment with phototherapy and the level at which care should be escalated,” the authors wrote. They also emphasized the need to consider glucose-6-phosphate dehydrogenase deficiency, a genetic condition that decreases protection against oxidative stress and has been identified as a leading cause of hazardous hyperbilirubinemia worldwide.
The guidelines recommend assessing all infants for jaundice at least every 12 hours after delivery until discharge, with TSB or TcB measured as soon as possible for those with suspected jaundice. The complete guidelines include charts for TSB levels to guide escalation of care. “Blood for TSB can be obtained at the time it is collected for newborn screening tests to avoid an additional heel stick,” the authors noted.
The rate of increase in TSB or TcB, if more than one measure is available, may identify infants at higher risk of hyperbilirubinemia, according to the guidelines, and hospital discharge may need to be delayed for infants for whom appropriate follow-up is not feasible.
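Where two timed measurements are available, the rate of increase the guidelines refer to is a straightforward computation. A minimal sketch (function and variable names are hypothetical; the guidelines' decisions also depend on the hour-specific charts, not on rate alone):

```python
from datetime import datetime

def tsb_rate_of_rise(t1: datetime, tsb1: float,
                     t2: datetime, tsb2: float) -> float:
    """Rate of increase in TSB or TcB, in mg/dL per hour."""
    hours = (t2 - t1).total_seconds() / 3600.0
    if hours <= 0:
        raise ValueError("second measurement must come after the first")
    return (tsb2 - tsb1) / hours

# Hypothetical infant: 8.0 mg/dL at 24 hours of age, 10.5 mg/dL at 36 hours.
rate = tsb_rate_of_rise(datetime(2022, 8, 1, 0, 0), 8.0,
                        datetime(2022, 8, 1, 12, 0), 10.5)
print(f"{rate:.2f} mg/dL per hour")  # 0.21
```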
In terms of treatment, new evidence that bilirubin neurotoxicity does not occur until concentrations well above those given in the 2004 guidelines justified raising the treatment thresholds, although by a narrow range. “With the increased phototherapy thresholds, appropriately following the current guidelines including bilirubin screening during the birth hospitalization and timely postdischarge follow-up is important,” the authors wrote. The new thresholds, outlined in the complete guidelines, are based on gestational age, hyperbilirubinemia neurotoxicity risk factors, and the age of the infant in hours. However, infants may be treated at lower levels, based on individual circumstances, family preferences, and shared decision-making with clinicians. Home-based phototherapy may be used in some infants, but should not be used if there is a question about the device quality, delivery time, and ability of caregivers to use the device correctly.
“Discontinuing phototherapy is an option when the TSB has decreased by at least 2 mg/dL below the hour-specific threshold at the initiation of phototherapy,” and follow-up should be based on risk of rebound hyperbilirubinemia, according to the guidelines.
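That quoted criterion translates directly into a comparison against the hour-specific threshold in effect when phototherapy began. A minimal sketch, assuming the threshold has already been read from the guideline's charts (which are not reproduced in this article; names are illustrative):

```python
def meets_discontinuation_criterion(current_tsb: float,
                                    threshold_at_initiation: float) -> bool:
    """True when TSB has fallen at least 2 mg/dL below the hour-specific
    phototherapy threshold that applied when treatment was started."""
    return current_tsb <= threshold_at_initiation - 2.0

# Hypothetical example: phototherapy began at a 15 mg/dL threshold;
# a follow-up TSB of 12.6 mg/dL satisfies the quoted criterion.
print(meets_discontinuation_criterion(12.6, 15.0))  # True
```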
“This clinical practice guideline provides indications and approaches for phototherapy and escalation of care and when treatment and monitoring can be safely discontinued,” the authors concluded. However, they noted, clinicians should understand the rationale for the recommendations and combine them with their clinical judgment, including shared decision-making when appropriate.
Updated evidence supports escalating care
The take-home message for pediatricians is that neonatal hyperbilirubinemia is a very common finding and complications are rare, but the condition can have devastating, lifelong consequences, Cathy Haut, DNP, CPNP-AC, CPNP-PC, a pediatric nurse practitioner in Rehoboth Beach, Del., said in an interview.
“Previous guidelines published in 2004 and updated in 2009 included evidence-based recommendations, but additional research was still needed to provide guidance for providers to prevent complications of hyperbilirubinemia,” said Dr. Haut, who was not involved in producing the guidelines.
“New data documenting additional risk factors, the importance of ongoing breastfeeding support, and addressing hyperbilirubinemia as an urgent problem” are additions to prevention methods in the latest published guidelines, she said.
“Acute encephalopathy and kernicterus can result from hyperbilirubinemia with severe and devastating neurologic effects, but are preventable by early identification and treatment,” said Dr. Haut. Therefore, “it is not surprising that the AAP utilized continuing and more recent evidence to support new recommendations. Both maternal and neonatal risk factors have long been considered in the development of neonatal hyperbilirubinemia, but recent recommendations incorporate additional risk factor evaluation and urgency in time to appropriate care. Detailed thresholds for phototherapy and exchange transfusion will benefit the families of full-term infants without other risk factors and escalate care for those neonates with risk factors.”
However, potential barriers to following the guidelines persist, Dr. Haut noted.
“Frequent infant follow-up can be challenging for busy primary care offices with outpatient laboratory results often taking much longer to obtain than in a hospital setting,” she said.
Also, “taking a newborn to the emergency department or an inpatient laboratory can be frightening for families with the risk of illness exposure. Frequent monitoring of serum bilirubin levels is disturbing for parents and inconvenient immediately postpartum,” Dr. Haut explained. “Few practices utilize transcutaneous bilirubin monitoring which may be one method of added screening.”
In addition, “despite the importance of breastfeeding, ongoing support is not readily available for mothers after hospital discharge. A lactation specialist in the office setting can take the burden off providers and add opportunity for family education.”
As for additional research, “continued evaluation of the comparison of transcutaneous bilirubin monitoring and serum levels along with the use of transcutaneous monitoring in facilities outside the hospital setting may be warranted,” Dr. Haut said. “Data collection on incidence and accompanying risk factors of neonates who develop acute hyperbilirubinemia encephalopathy and kernicterus is a long-term study opportunity.”
The guidelines received no external funding. Lead author Dr. Kemper had no financial conflicts to disclose. Dr. Haut had no financial conflicts to disclose and serves on the editorial advisory board of Pediatric News.
FROM PEDIATRICS
Regular exercise appears to slow cognitive decline in MCI
Regular exercise appears to slow cognitive decline in patients with mild cognitive impairment (MCI), new research from the largest study of its kind suggests. Topline results from the EXERT trial showed patients with MCI who participated regularly in either aerobic exercise or stretching/balance/range-of-motion exercises maintained stable global cognitive function over 12 months of follow-up – with no differences between the two types of exercise.
“We’re excited about these findings, because these types of exercises that we’re seeing can protect against cognitive decline are accessible to everyone and therefore scalable to the public,” study investigator Laura Baker, PhD, Wake Forest University School of Medicine, Winston-Salem, N.C., said at a press briefing.
The topline results were presented at the 2022 Alzheimer’s Association International Conference.
No decline
The 18-month EXERT trial was designed to be the definitive study to answer the question about whether exercise can slow cognitive decline in older adults with amnestic MCI, Dr. Baker reported. Investigators enrolled 296 sedentary men and women with MCI (mean age, about 75 years). All were randomly allocated to either an aerobic exercise group (maintaining a heart rate at about 70%-85%) or a stretching and balance group (maintaining heart rate less than 35%).
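The article does not specify what baseline those heart rate percentages refer to. If they denote heart rate reserve, an assumption, the common Karvonen formula gives a rough sense of the target zones:

```python
# Hedged sketch: assumes the percentages refer to heart rate reserve
# (not stated in the article) and uses the rough 220-minus-age estimate
# of maximum heart rate. Illustrative only, not the trial's protocol.

def target_heart_rate(age: int, resting_hr: int, fraction: float) -> float:
    """Karvonen formula: resting HR plus a fraction of heart rate reserve."""
    max_hr = 220 - age
    return resting_hr + fraction * (max_hr - resting_hr)

# A 75-year-old (the trial's mean age) with a hypothetical resting HR of 70:
low = target_heart_rate(75, 70, 0.70)
high = target_heart_rate(75, 70, 0.85)
print(f"aerobic zone: about {low:.0f}-{high:.0f} bpm")  # about 122-134 bpm
```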
Both groups exercised four times per week for about 30-40 minutes. For the first 12 months, they were supervised by a trainer at the YMCA; they then exercised independently for the final 6 months.
Participants were assessed at baseline and every 6 months. The primary endpoint was change from baseline on the ADAS-Cog-Exec, a validated measure of global cognitive function, at the end of the 12 months of supervised exercise.
During the first 12 months, participants completed over 31,000 sessions of exercise, which is “quite impressive,” Dr. Baker said.
Over the first 12 months, neither the aerobic group nor the stretch/balance group showed a decline on the ADAS-Cog-Exec.
“We saw no group differences, and importantly, no decline after 12 months,” Dr. Baker reported.
Supported exercise is ‘crucial’
To help “make sense” of these findings, Dr. Baker noted that 12-month changes in the ADAS-Cog-Exec for the EXERT intervention groups were also compared with a “usual care” cohort of adults matched for age, sex, education, baseline cognitive status, and APOE4 genotype.
In this “apples-to-apples” comparison, the usual care cohort showed the expected decline or worsening of cognitive function over 12 months on the ADAS-Cog-Exec, but the EXERT exercise groups did not.
Dr. Baker noted that both exercise groups received equal amounts of weekly socialization, which may have contributed to the apparent protective effects on the brain.
A greater volume of exercise in EXERT, compared with other trials, may also be a factor. Each individual participant in EXERT completed more than 100 hours of exercise.
“The take-home message is that an increased amount of either low-intensity or high-intensity exercise for 120-150 minutes per week for 12 months may slow cognitive decline in sedentary older adults with MCI,” Dr. Baker said.
“What’s critical is that this regular exercise must be supported in these older [patients] with MCI. It must be supervised. There has to be some social component,” she added.
In her view, 120 minutes of regular supported exercise for sedentary individuals with MCI “needs to be part of the recommendation for risk reduction.”
Important study
Commenting on the findings, Heather Snyder, PhD, vice president of medical and scientific relations at the Alzheimer’s Association, noted that several studies over the years have suggested that different types of exercise can have benefits on the brain.
“What’s important about this study is that it’s in a population of people that have MCI and are already experiencing memory changes,” Dr. Snyder said.
“The results suggest that engaging in both of these types of exercise may be beneficial for our brain. And given that this is the largest study of its kind in a population of people with MCI, it suggests it’s ‘never too late’ to start exercising,” she added.
Dr. Snyder noted the importance of continuing this work and of following these individuals “over time as well.”
The study was funded by the National Institutes of Health, National Institute on Aging. Dr. Baker and Dr. Snyder have disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
(MCI), new research from the largest study of its kind suggests. Topline results from the EXERT trial showed patients with MCI who participated regularly in either aerobic exercise or stretching/balance/range-of-motion exercises maintained stable global cognitive function over 12 months of follow-up – with no differences between the two types of exercise.
“We’re excited about these findings, because these types of exercises that we’re seeing can protect against cognitive decline are accessible to everyone and therefore scalable to the public,” study investigator Laura Baker, PhD, Wake Forest University School of Medicine, Winston-Salem, N.C., said at a press briefing.
The topline results were presented at the 2022 Alzheimer’s Association International Conference.
No decline
The 18-month EXERT trial was designed to be the definitive study to answer the question about whether exercise can slow cognitive decline in older adults with amnestic MCI, Dr. Baker reported. Investigators enrolled 296 sedentary men and women with MCI (mean age, about 75 years). All were randomly allocated to either an aerobic exercise group (maintaining a heart rate at about 70%-85%) or a stretching and balance group (maintaining heart rate less than 35%).
Both groups exercised four times per week for about 30-40 minutes. In the first 12 months they were supervised by a trainer at the YMCA and then they exercised independently for the final 6 months.
Participants were assessed at baseline and every 6 months. The primary endpoint was change from baseline on the ADAS-Cog-Exec, a validated measure of global cognitive function, at the end of the 12 months of supervised exercise.
During the first 12 months, participants completed over 31,000 sessions of exercise, which is “quite impressive,” Dr. Baker said.
Over the first 12 months, neither the aerobic group nor the stretch/balance group showed a decline on the ADAS-Cog-Exec.
“We saw no group differences, and importantly, no decline after 12 months,” Dr. Baker reported.
Supported exercise is ‘crucial’
To help “make sense” of these findings, Dr. Baker noted that 12-month changes in the ADAS-Cog-Exec for the EXERT intervention groups were also compared with a “usual care” cohort of adults matched for age, sex, education, baseline cognitive status, and APOE4 genotype.
In this “apples-to-apples” comparison, the usual care cohort showed the expected decline or worsening of cognitive function over 12 months on the ADAS-Cog-Exec, but the EXERT exercise groups did not.
Dr. Baker noted that both exercise groups received equal amounts of weekly socialization, which may have contributed to the apparent protective effects on the brain.
A greater volume of exercise in EXERT, compared with other trials, may also be a factor. Each individual participant in EXERT completed more than 100 hours of exercise.
“The take-home message is that an increased amount of either low-intensity or high-intensity exercise for 120-150 minutes per week for 12 months may slow cognitive decline in sedentary older adults with MCI,” Dr. Baker said.
“What’s critical is that this regular exercise must be supported in these older [patients] with MCI. It must be supervised. There has to be some social component,” she added.
In her view, 120 minutes of regular supported exercise for sedentary individuals with MCI “needs to be part of the recommendation for risk reduction.”
Important study
Commenting on the findings, Heather Snyder, PhD, vice president of medical and scientific relations at the Alzheimer’s Association, noted that several studies over the years have suggested that different types of exercise can have benefits on the brain.
“What’s important about this study is that it’s in a population of people that have MCI and are already experiencing memory changes,” Dr. Snyder said.
“The results suggest that engaging in both of these types of exercise may be beneficial for our brain. And given that this is the largest study of its kind in a population of people with MCI, it suggests it’s ‘never too late’ to start exercising,” she added.
Dr. Snyder noted the importance of continuing this work and to continue following these individuals “over time as well.”
The study was funded by the National Institutes of Health, National Institute on Aging. Dr. Baker and Dr. Snyder have disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
(MCI), new research from the largest study of its kind suggests. Topline results from the EXERT trial showed patients with MCI who participated regularly in either aerobic exercise or stretching/balance/range-of-motion exercises maintained stable global cognitive function over 12 months of follow-up – with no differences between the two types of exercise.
“We’re excited about these findings, because these types of exercises that we’re seeing can protect against cognitive decline are accessible to everyone and therefore scalable to the public,” study investigator Laura Baker, PhD, Wake Forest University School of Medicine, Winston-Salem, N.C., said at a press briefing.
The topline results were presented at the 2022 Alzheimer’s Association International Conference.
No decline
The 18-month EXERT trial was designed to be the definitive study of whether exercise can slow cognitive decline in older adults with amnestic MCI, Dr. Baker reported. Investigators enrolled 296 sedentary men and women with MCI (mean age, about 75 years). All were randomly allocated either to an aerobic exercise group (maintaining a heart rate of about 70%-85%) or to a stretching and balance group (maintaining a heart rate below 35%).
Both groups exercised four times per week for about 30-40 minutes per session. For the first 12 months, participants were supervised by a trainer at the YMCA; they then exercised independently for the final 6 months.
Participants were assessed at baseline and every 6 months. The primary endpoint was change from baseline on the ADAS-Cog-Exec, a validated measure of global cognitive function, at the end of the 12 months of supervised exercise.
During the first 12 months, participants completed over 31,000 sessions of exercise, which is “quite impressive,” Dr. Baker said.
Over the first 12 months, neither the aerobic group nor the stretch/balance group showed a decline on the ADAS-Cog-Exec.
“We saw no group differences, and importantly, no decline after 12 months,” Dr. Baker reported.
Supported exercise is ‘crucial’
To help “make sense” of these findings, Dr. Baker noted that 12-month changes in the ADAS-Cog-Exec for the EXERT intervention groups were also compared with a “usual care” cohort of adults matched for age, sex, education, baseline cognitive status, and APOE4 genotype.
In this “apples-to-apples” comparison, the usual care cohort showed the expected decline or worsening of cognitive function over 12 months on the ADAS-Cog-Exec, but the EXERT exercise groups did not.
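As a rough illustration of what such a matched comparison involves, the sketch below pairs trial participants with usual-care adults on the characteristics listed above and compares mean 12-month change. This is not the EXERT analysis code; the file names, column names, and exact-matching approach are assumptions for illustration only.

```python
# Illustrative sketch only -- not EXERT's actual analysis.
# File names and column names are hypothetical.
import pandas as pd

trial = pd.read_csv("exert_participants.csv")      # hypothetical input
usual_care = pd.read_csv("usual_care_cohort.csv")  # hypothetical input

# Exact-match on the characteristics named above
keys = ["age_group", "sex", "education_level", "baseline_cog_status", "apoe4"]
matched = trial.merge(usual_care, on=keys, suffixes=("_exert", "_uc"))

# Compare mean 12-month ADAS-Cog-Exec change between the matched groups
print(matched["adas_change_12m_exert"].mean(),
      matched["adas_change_12m_uc"].mean())
```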
Dr. Baker noted that both exercise groups received equal amounts of weekly socialization, which may have contributed to the apparent protective effects on the brain.
A greater volume of exercise in EXERT, compared with other trials, may also be a factor. Each individual participant in EXERT completed more than 100 hours of exercise.
“The take-home message is that an increased amount of either low-intensity or high-intensity exercise for 120-150 minutes per week for 12 months may slow cognitive decline in sedentary older adults with MCI,” Dr. Baker said.
“What’s critical is that this regular exercise must be supported in these older [patients] with MCI. It must be supervised. There has to be some social component,” she added.
In her view, 120 minutes of regular supported exercise for sedentary individuals with MCI “needs to be part of the recommendation for risk reduction.”
Important study
Commenting on the findings, Heather Snyder, PhD, vice president of medical and scientific relations at the Alzheimer’s Association, noted that several studies over the years have suggested that different types of exercise can have benefits on the brain.
“What’s important about this study is that it’s in a population of people that have MCI and are already experiencing memory changes,” Dr. Snyder said.
“The results suggest that engaging in both of these types of exercise may be beneficial for our brain. And given that this is the largest study of its kind in a population of people with MCI, it suggests it’s ‘never too late’ to start exercising,” she added.
Dr. Snyder noted the importance of continuing this work and of following these individuals “over time as well.”
The study was funded by the National Institutes of Health, National Institute on Aging. Dr. Baker and Dr. Snyder have disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
From AAIC 2022
Waking up at night could be your brain boosting your memory
We tend to think a good night’s sleep should be uninterrupted, but surprising new research from the University of Copenhagen suggests just the opposite:
The study, done on mice, found that the stress transmitter noradrenaline (also called norepinephrine) wakes up the brain many times a night. These “microarousals” were linked to memory consolidation, meaning they help you remember the previous day’s events. In fact, the more “awake” you are during a microarousal, the better the memory boost, suggests the research, which was published in Nature Neuroscience.
“Every time I wake up in the middle of the night now, I think – ah, nice, I probably just had great memory-boosting sleep,” said study author Celia Kjaerby, PhD, an assistant professor at the university’s Center for Translational Neuromedicine.
The findings add insight into what happens in the brain during sleep and may help pave the way for new treatments for those who have sleep disorders.
Waves of noradrenaline
Previous research has suggested that noradrenaline – a hormone that increases during stress but also helps you stay focused – is inactive during sleep. So, the researchers were surprised to see high levels of it in the brains of the sleeping rodents.
“I still remember seeing the first traces showing the brain activity of the norepinephrine stress system during sleep. We could not believe our eyes,” Dr. Kjaerby said. “Everyone had thought the system would be quiet. And now we have found out that it completely controls the microarchitecture of sleep.”
Those noradrenaline levels rise and fall like waves every 30 seconds during non-REM (NREM) sleep. At each “peak” the brain is briefly awake, and at each “valley” it is asleep. Typically, these awakenings are so brief that the sleeping subject does not notice. But the higher the rise, the longer the awakening – and the more likely the sleeper may notice.
During the valleys, or when norepinephrine drops, so-called sleep spindles occur.
“These are short oscillatory bursts of brain activity linked to memory consolidation,” Dr. Kjaerby said. Occasionally there is a “deep valley,” lasting 3-5 minutes, leading to more sleep spindles. The mice with the greatest number of deep valleys also had the best memories, the researchers noted.
“We have shown that the amount of these super-boosts of sleep spindles, and not REM sleep, defines how well you remember the experiences you had prior to going to sleep,” said Dr. Kjaerby.
Deep valleys were followed by longer awakenings, the researchers observed. So, the longer the valley, the longer the awakening – and the better the memory boost. This means that, though restless sleep is not good, waking up briefly may be a natural part of memory-related sleep phases and may even mean you’ve slept well.
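To make the valley-counting idea concrete, here is a toy signal-processing sketch that simulates a noradrenaline-like trace oscillating roughly every 30 seconds and flags unusually deep troughs. The simulated signal, sampling rate, and depth threshold are all invented for illustration; the study's actual recordings and criteria differ.

```python
# Toy sketch: find "valleys" in a simulated noradrenaline-like trace and
# flag the deep ones. All parameters here are illustrative inventions.
import numpy as np
from scipy.signal import find_peaks

fs = 1.0                          # 1 sample per second (assumed)
t = np.arange(0, 3600, 1 / fs)    # one simulated hour of NREM sleep
rng = np.random.default_rng(0)
trace = (np.sin(2 * np.pi * t / 30)            # ~30-second oscillation
         + 0.5 * np.sin(2 * np.pi * t / 600)   # slower drift
         + 0.2 * rng.standard_normal(t.size))  # noise

# Valleys are peaks of the inverted trace, spaced at least ~25 s apart
valleys, _ = find_peaks(-trace, distance=int(25 * fs), prominence=0.5)

# Call a valley "deep" when its trough sits well below the median level
# (the -1.0 offset is an arbitrary illustrative cutoff)
deep = valleys[trace[valleys] < np.median(trace) - 1.0]
print(f"{valleys.size} valleys found, {deep.size} classified as deep")
```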
What happens in our brains when we sleep: Piecing it together
The findings fit with previous clinical data that shows we wake up roughly 100-plus times a night, mostly during NREM sleep stage 2 (the spindle-rich sleep stage), Dr. Kjaerby said.
Still, more research on these small awakenings is needed, Dr. Kjaerby said, noting that professor Maiken Nedergaard, MD, another author of this study, has found that the brain cleans up waste products through a rinsing fluid system.
“It remains a puzzle why the fluid system is so active when we sleep,” Dr. Kjaerby said. “We believe these short awakenings could potentially be the key to answering this question.”
A version of this article first appeared on WebMD.com.
FROM NATURE NEUROSCIENCE
Chronically low wages linked to subsequent memory decline
Earning persistently low wages in midlife is linked to subsequent memory decline, new research suggests. In a new analysis of more than 3,000 participants in the Health and Retirement Study, those who sustained low wages in midlife showed significantly faster memory decline than their peers who never earned low wages.
The findings could have implications for future public policy and research initiatives, the investigators noted.
“Our findings, which suggest a pattern of sustained low-wage earning is harmful for cognitive health, [are] broadly applicable to researchers across numerous health disciplines,” said co-investigator Katrina Kezios, PhD, postdoctoral researcher, department of epidemiology, Mailman School of Public Health, Columbia University, New York.
The findings were presented at the 2022 Alzheimer’s Association International Conference.
Growing number of low-wage workers
Low-wage workers make up a growing share of the U.S. labor market. Yet little research has examined the long-term relationship between earning low wages and memory decline.
The current investigators assessed 1992-2016 data from the Health and Retirement Study, a longitudinal survey of nationally representative samples of Americans aged 50 years and older. Study participants are interviewed every 2 years and provide, among other things, information on work-related factors, including hourly wages.
Memory function was measured at each visit from 2004 to 2016 using a memory composite score that included immediate and delayed word recall assessments. For participants who became too impaired to complete cognitive testing, memory assessments by proxy informants were used.
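A composite of this general shape might be computed as below; this is a hedged sketch with made-up normative values, and the Health and Retirement Study's actual scoring rules (including how proxy assessments are handled) may differ.

```python
# Hypothetical memory composite: mean of z-scored immediate and delayed
# word recall. Normative means/SDs below are made up for illustration.
def memory_composite(immediate, delayed, ref):
    z_imm = (immediate - ref["imm_mean"]) / ref["imm_sd"]
    z_del = (delayed - ref["del_mean"]) / ref["del_sd"]
    return (z_imm + z_del) / 2

ref = {"imm_mean": 5.5, "imm_sd": 1.6, "del_mean": 4.5, "del_sd": 1.9}
print(round(memory_composite(6, 5, ref), 2))  # composite in standard units
```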
On average, participants completed 4.8 memory assessments over the course of the study.
Researchers defined “low wage” as an hourly wage lower than two-thirds of the federal median wage for the corresponding year. They categorized low-wage exposure history as “never,” “intermittent,” or “sustained” on the basis of wages earned from 1992 to 2004.
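The classification rule as described lends itself to a short sketch. The two-thirds-of-median threshold and the three labels come from the article; the median-wage values and example histories below are placeholders, not real data.

```python
# Sketch of the exposure coding described above. Median-wage values and
# example wage histories are placeholder inventions.
FEDERAL_MEDIAN_WAGE = {1992: 12.0, 1998: 14.0, 2004: 16.0}  # placeholder $/hr

def is_low_wage(hourly_wage, year):
    return hourly_wage < (2 / 3) * FEDERAL_MEDIAN_WAGE[year]

def exposure_history(wages_by_year):
    flags = [is_low_wage(w, y) for y, w in wages_by_year.items()]
    if all(flags):
        return "sustained"
    if any(flags):
        return "intermittent"
    return "never"

print(exposure_history({1992: 7.0, 1998: 8.0, 2004: 9.5}))   # sustained
print(exposure_history({1992: 9.0, 1998: 8.5, 2004: 12.0}))  # intermittent
```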
The current analysis included 3,803 participants, 1,913 of whom were men. All participants were born from 1936 to 1941. In 2004, the average age was 65 years, and the mean memory score was 1.15 standard units.
The investigators adjusted for factors that could confound the relationship between wages and cognition, including the participant’s education, parental education, household wealth, and marital status. In later models, they also adjusted for whether the participant’s occupation was low-skilled.
Cognitive harm
The confounder-adjusted annual rate of memory decline among workers who never earned low wages was –0.12 standard units (95% confidence interval, –0.14 to –0.10).
Compared with these workers, memory decline was significantly faster among participants with sustained low wage–earning during midlife (beta for interaction between time and exposure group, –0.012; 95% CI, –0.02 to 0.01), corresponding to an annual rate of –0.13 standard units.
Put another way, the cognitive aging experienced by workers earning low wages over a 10-year period was equivalent to what workers who never earned low wages would experience over 11 years.
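That equivalence follows directly from the two annual rates reported above, as this back-of-envelope check shows.

```python
# Back-of-envelope check of the "10 years vs. 11 years" statement, using
# the adjusted annual decline rates reported above (standard units/year).
never_low = 0.12                   # never earned low wages
sustained = 0.13                   # sustained low wages (0.12 + 0.012)
years_equivalent = 10 * sustained / never_low
print(round(years_equivalent, 1))  # ~10.8, i.e., roughly 11 years
```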
Although similar associations were found for men and women, the association was stronger in magnitude for men – a finding Dr. Kezios said was somewhat surprising. She noted that women are commonly at greater risk for dementia than men.
However, she advises caution in interpreting this finding, as there were so few men in the sustained low-wage group. “Women disproportionately make up the group of workers earning low wages,” she said.
The negative coefficient found for those who persistently earned low wages was also observed for those who intermittently earned low wages, but in the intermittent group it was not statistically significant.
“We can speculate or hypothesize the cumulative effect of earning low wages at each exposure interval produces more cognitive harm than maybe earning low wages at some time points over that exposure period,” said Dr. Kezios.
A sensitivity analysis that examined wage earning at the same ages but in two different birth cohorts showed similar results for the two groups. When researchers removed self-employed workers from the study sample, the same association between sustained low wages and memory decline was found.
“Our findings held up, which gave us a little more reassurance that what we were seeing is at least signaling there might be something there,” said Dr. Kezios.
She described the study as a “first pass” for documenting the harmful cognitive effects of consistently earning low wages.
It would be interesting, she said, to now determine whether there’s a “dose effect” for having a low salary. However, other studies with different designs would be needed to determine at what income level cognitive health starts to be protected and the impact of raising the minimum wage, she added.
Unique study
Heather Snyder, PhD, vice president of medical and scientific relations, Alzheimer’s Association, said the study was unique. “I don’t think we have seen anything like this before,” said Dr. Snyder.
The study, which links sustained low-wage earning in midlife to later memory decline, “is looking beyond some of the other measures we’ve seen when we looked at socioeconomic status,” she noted.
The results “beg the question” of whether people who earn low wages have less access to health care, she added.
“We should think about how to ensure access and equity around health care and around potential ways that may address components of risk individuals have during their life course,” Dr. Snyder said.
She noted that the study provides a “start” at considering potential policies to address the impact of sustained low wages on overall health, particularly cognitive health, throughout life.
The study had no outside funding. Dr. Kezios has reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
From AAIC 2022
More evidence that ultraprocessed foods are detrimental for the brain
Results from the Brazilian Longitudinal Study of Adult Health (ELSA-Brasil), which included participants aged 35 and older, showed that higher intake of ultraprocessed foods (UPFs) was significantly associated with a faster rate of decline in both executive and global cognitive function.
“Based on these findings, doctors might counsel patients to prefer cooking at home [and] choosing fresher ingredients instead of buying ready-made meals and snacks,” said coinvestigator Natalia Gonçalves, PhD, University of São Paulo, Brazil.
Presented at the Alzheimer’s Association International Conference, the findings align with those from a recent study in Neurology. That study linked a diet high in UPFs to an increased risk for dementia.
Increasing worldwide consumption
UPFs are heavily processed, are packed with added ingredients such as sugar, fat, and salt, and are low in protein and fiber. Examples of UPFs include soft drinks, chips, chocolate, candy, ice cream, sweetened breakfast cereals, packaged soups, chicken nuggets, hot dogs, fries, and many more.
Over the past 30 years, there has been a steady increase in consumption of UPFs worldwide. They are thought to induce systemic inflammation and oxidative stress and have been linked to a variety of ailments, such as overweight/obesity, cardiovascular disease, and cancer.
UPFs may also be a risk factor for cognitive decline, although data are scarce as to their effects on the brain.
To investigate, Dr. Gonçalves and colleagues evaluated longitudinal data on 10,775 adults (mean age, 50.6 years; 56% women; 55% White) who participated in the ELSA-Brasil study. They were evaluated in three waves (2008-2010, 2012-2014, and 2017-2019).
Information on diet was obtained via food frequency questionnaires, which covered consumption of unprocessed foods, minimally processed foods, and UPFs.
Participants were grouped according to UPF consumption quartiles (lowest to highest). Cognitive performance was evaluated by use of a standardized battery of tests.
Significant decline
Using linear mixed effects models that were adjusted for sociodemographic, lifestyle, and clinical variables, the investigators assessed the association of dietary UPFs as a percentage of total daily calories with cognitive performance over time.
During a median follow-up of 8 years, UPF intake in quartiles 2 to 4 (vs. quartile 1) was associated with a significant decline in global cognition (P = .003) and executive function (P = .015).
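For readers unfamiliar with this kind of model, the sketch below shows the general form in Python's statsmodels. The variable names, covariates, and data file are invented, and the study's actual specification may differ.

```python
# Illustrative linear mixed-effects model of the general form described
# above -- not the study's actual code. Names and file are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("elsa_brasil_long.csv")  # hypothetical long-format data

model = smf.mixedlm(
    "cognition ~ years * C(upf_quartile) + age + sex + education + income",
    data=df,
    groups=df["participant_id"],  # random intercept per participant
)
result = model.fit()
print(result.summary())  # years:upf_quartile terms capture decline differences
```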
“Participants who reported consumption of more than 20% of daily calories from ultraprocessed foods had a 28% faster rate of global cognitive decline and a 25% faster decrease of the executive function compared to those who reported eating less than 20% of daily calories from ultraprocessed foods,” Dr. Gonçalves reported.
“Considering a person who eats a total of 2,000 kcal per day, 20% of daily calories from ultraprocessed foods are about two 1.5-ounce bars of KitKat, or five slices of bread, or about a third of an 8.5-ounce package of chips,” she explained.
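The arithmetic behind that example is simple, as the sketch below shows; the per-item calorie counts are rough assumptions rather than figures from the study.

```python
# Rough check of the 20%-of-calories example. Calorie counts per item are
# approximate assumptions, not values from the study.
daily_kcal = 2000
upf_threshold = 0.20 * daily_kcal   # 400 kcal/day from ultraprocessed foods
kitkat_bar = 210                    # approx. kcal per 1.5-oz bar (assumed)
bread_slice = 80                    # approx. kcal per slice (assumed)
print(upf_threshold / kitkat_bar)   # ~1.9 -> about two bars
print(upf_threshold / bread_slice)  # 5.0 -> five slices
```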
Dr. Gonçalves noted that the reasons UPFs may harm the brain remain a “very relevant but not yet well-studied topic.”
Hypotheses include secondary effects from cerebrovascular lesions or chronic inflammation processes. More studies are needed to investigate the possible mechanisms that might explain the harm of UPFs to the brain, she said.
‘Troubling but not surprising’
Commenting on the study, Percy Griffin, PhD, director of scientific engagement for the Alzheimer’s Association, said there is “growing evidence that what we eat can impact our brains as we age.”
He added that many previous studies have suggested that a heart-healthy, balanced diet that is low in processed foods and high in whole, nutritious foods, such as vegetables and fruits, is best for the brain.
“These new data from the Alzheimer’s Association International Conference suggest eating a large amount of ultraprocessed food can significantly accelerate cognitive decline,” said Dr. Griffin, who was not involved with the research.
He noted that an increase in the availability and consumption of fast foods, processed foods, and UPFs is due to a number of socioeconomic factors, including low access to healthy foods, less time to prepare foods from scratch, and an inability to afford whole foods.
“Ultraprocessed foods make up more than half of American diets. It’s troubling but not surprising to see new data suggesting these foods can significantly accelerate cognitive decline,” Dr. Griffin said.
“The good news is there are steps we can take to reduce risk of cognitive decline as we age. These include eating a balanced diet, exercising regularly, getting good sleep, staying cognitively engaged, protecting from head injury, not smoking, and managing heart health,” he added.
Past research has suggested that the greatest benefit is from engaging in combinations of these lifestyle changes and that they are beneficial at any age, he noted.
“Even if you begin with one or two healthful actions, you’re moving in the right direction. It’s never too early or too late to incorporate these habits into your life,” Dr. Griffin said.
The study had no specific funding. Dr. Gonçalves and Dr. Griffin have reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
From AAIC 2022
Racism tied to cognition in middle-aged, elderly
It is generally understood that racism, whether structural or personal, harms the well-being and health of those who experience it and contributes to ethnic inequality.
That was the fundamental message behind two studies presented at a press conference at the Alzheimer’s Association International Conference.
“We know that there are communities like black African Americans and Hispanic Latinos who are at greater risk for developing Alzheimer’s or another dementia,” said Carl Hill, PhD, who served as a moderator during the press conference. He pointed out that the genetic and lifestyle factors linked to dementia tell only part of the story. “It’s important that the science also examines the unique experiences of those at greater risk for dementia in our society,” said Dr. Hill, the Alzheimer’s Association’s chief diversity, equity, and inclusion officer.
Racism, memory, and cognition in middle-aged patients
Jennifer J. Manly, PhD, professor of neuropsychology at Columbia University, New York, presented a study of experience of racism and memory scores among a highly diverse, middle-aged cohort.
“There’s little understanding of how the multiple levels of racism – including intrapersonal, institutional, and structural racism – influence cognitive aging and dementia risk,” Dr. Manly said during the press conference.
Among 1,095 participants, 19.5% were non-Latinx White (61% female, mean age 57), 26.0% were non-Latinx Black (63% female, mean age 56), 32.3% were English-speaking Latinx (66% female, mean age 50), and 21.2% were Spanish-speaking Latinx (68% female, mean age 58).
The researchers used the Everyday Discrimination (ED) scale to measure experience of individual racism, the Major Discrimination (MD) scale to measure experience of institutional racism, and residential segregation of the census block group in which an individual’s parents lived to measure structural racism. Outcome measures included the Digit Span to assess attention and working memory and the Selective Reminding Test to assess episodic memory.
The study found a clear association between racism and cognition. “The association of interpersonal racism to memory corresponds to 3 years of chronological age, and was driven by non-Hispanic black participants. Next, there was a reliable relationship between institutional racism and memory scores among non-Hispanic black participants, such that each reported civil rights violation corresponded to the effect of about 4.5 years of age on memory,” said Dr. Manly.
“The bottom line is that our results suggest that exposure to racism is a substantial driver of later life memory function, even in middle age, and especially for Black people,” Dr. Manly added.
The results should alert physicians to the complexities of racism and its impact. “Health providers need to be aware that many accumulated risks are historical and structural, and not controlled by the individual. Maybe more importantly, the medical system itself may perpetuate discriminatory experiences that contribute to worse health,” said Dr. Manly.
Latinx concerns
Also at the press conference, Adriana Perez, PhD, emphasized the challenges that Spanish-speaking Latinxs have with health care. Just 5%-7% of nurses are Latinx. “The same could be said for physicians, for clinical psychologists ... as you look at the really critical positions to address brain health equity, we are not represented there,” said Dr. Perez, an assistant professor and senior fellow at the University of Pennsylvania School of Nursing in Philadelphia.
She also pointed out that Latinx representation in clinical trials is very low, even though surveys performed by the Alzheimer’s Association show that this population values medical science and is willing to participate. In fact, 85% said they would participate if invited. The trouble is that many clinical trial announcements state that participants must speak English. Even the many Latinos who are bilingual may be put off by that wording: “That is a message that you’re not invited. That’s how it’s perceived,” said Dr. Perez.
Racism and cognition in the elderly
At the press conference, Kristen George, PhD, presented results from a study of individuals over age 90. “Racial disparities in dementia have been well characterized, particularly among those people who are aged 65 and older, but we don’t know very much about the oldest old individuals who are aged 90 and older. This group is one of the fastest growing segments of the population, and it’s becoming increasingly diverse,” said Dr. George, assistant professor of epidemiology at the University of California, Davis.
The group enrolled 445 Asian, Black, Latinx, White, and multiracial individuals who were members of Kaiser Permanente Northern California, with a mean age of 92.7 years. They used the Major Experiences of Discrimination Scale to assess discrimination.
The researchers divided them into three groups based on gender, race, and responses to the 10-item scale. Class 1 included largely White men who had reported workplace discrimination, with an average of two major discrimination experiences. Class 2 was made up of White women and non-Whites who reported little or no discrimination, with an average of 0 experiences. Class 3 included all non-White participants, and they reported a mean of four discrimination experiences.
Using class 2 as a reference, executive function was better among class 1 individuals (beta = 0.28; 95% CI, 0.03-0.52) but there was no significant difference between class 3 and class 2. Class 1 had better baseline semantic memory than class 2 (beta = 0.33; 95% CI, 0.07-0.58), and those in class 3 performed significantly worse than class 2 (beta = –0.24; 95% CI, –0.48 to –0.00). There were no between-group differences in baseline verbal or episodic memory.
Dr. Perez, Dr. Manly, Dr. George, and Dr. Hill have no relevant financial disclosures.
It is generally understood that racism, whether structural or personal, harms the well-being of the individual who experiences it. It has harmful health effects, and it contributes to ethnic inequality.
That was the fundamental message behind two studies presented at a press conference at the Alzheimer’s Association International Conference.
“We know that there are communities like black African Americans and Hispanic Latinos who are at greater risk for developing Alzheimer’s or another dementia,” said Carl Hill, PhD, who served as a moderator during the press conference. He pointed out that the genetic and lifestyle factors linked to dementia tell only part of the story. “It’s important that the science also examines the unique experiences of those at greater risk for dementia in our society,” said Dr. Hill, who is Alzheimer’s Association Chief Diversity Equity and Inclusion Officer.
Racism, memory, and cognition in middle-aged patients
Jennifer J. Manly, PhD, professor of neuropsychology at Columbia University, New York, presented a study of experience of racism and memory scores among a highly diverse, middle-aged cohort.
“There’s little understanding of how the multiple levels of racism – including intrapersonal, institutional, and structural racism – influence cognitive aging and dementia risk,” Dr. Manly said during the press conference.
Among 1,095 participants, 19.5% were non-Latinx White (61% female, mean age 57), 26.0% were non-Latinx Black (63% female, mean age 56), 32.3% were English-speaking Latinx (66% female, mean age 50), and 21.2% were Spanish-speaking Latinx (68% female, mean age 58).
The researchers used the Everyday Discrimination (ED) scale to measure experience of individual racism, the Major Discrimination (MD) scale to measure experience of institutional racism, and residential segregation of the census block group for an individual’s parents to measure residential segregation. Outcome measures included the Digit Span to assess attention and working memory, and the Selective Reminding Test to assess episodic memory.
The study found a clear association between racism and cognition. “The association of interpersonal racism to memory corresponds to 3 years of chronological age, and was driven by non-Hispanic black participants. Next, there was a reliable relationship between institutional racism and memory scores among non-Hispanic black participants, such that each reported civil rights violation corresponded to the effect of about 4.5 years of age on memory,” said Dr. Manly.
“The bottom line is that our results suggest that exposure to racism is a substantial driver of later life memory function, even in middle age, and especially for Black people,” Dr. Manly added.
The results should alert physicians to the complexities of racism and its impact. “Health providers need to be aware that many accumulated risks are historical and structural, and not controlled by the individual. Maybe more importantly, the medical system itself may perpetuate discriminatory experiences that contribute to worse health,” said Dr. Manly.
Latinx concerns
Also at the press conference, Adriana Perez, PhD, emphasized the challenges that Spanish-speaking Latinxs have with health care. Just 5%-7% of nurses are Latinx. “The same could be said for physicians, for clinical psychologists ... as you look at the really critical positions to address brain health equity, we are not represented there,” said Dr. Perez, an assistant professor and senior fellow at the University of Pennsylvania School of Nursing in Philadelphia.
She also pointed out that Latinx representation in clinical trials is very low, even though surveys performed by the Alzheimer’s Association show that this population values medical science and is willing to participate. In fact, 85% said they would participate if invited. The trouble is that many clinical trial announcements state that participants must speak English. Even the many Latinos who are bilingual may be put off by that wording: “That is a message that you’re not invited. That’s how it’s perceived,” said Dr. Perez.
Racism and cognition in the elderly
At the press conference, Kristen George, PhD, presented results from a study of individuals over age 90. “Racial disparities in dementia have been well characterized, particularly among those people who are aged 65 and older, but we don’t know very much about the oldest old individuals who are aged 90 and older. This group is one of the fastest growing segments of the population, and it’s becoming increasingly diverse,” said Dr. George, assistant professor of epidemiology at the University of California, Davis.
The group enrolled 445 Asian, Black, Latinx, White, and multiracial individuals who were members of Kaiser Permanente Northern California, with a mean age of 92.7 years. They used the Major Experiences of Discrimination Scale to assess discrimination.
The researchers used gender, race, and responses to the 10-item scale to sort participants into three classes. Class 1 consisted largely of White men who reported workplace discrimination, with an average of two major discrimination experiences. Class 2 comprised White women and non-White participants who reported little or no discrimination, with an average of zero experiences. Class 3 was made up entirely of non-White participants, who reported a mean of four discrimination experiences.
Using class 2 as a reference, executive function was better among class 1 individuals (beta = 0.28; 95% CI, 0.03-0.52) but there was no significant difference between class 3 and class 2. Class 1 had better baseline semantic memory than class 2 (beta = 0.33; 95% CI, 0.07-0.58), and those in class 3 performed significantly worse than class 2 (beta = –0.24; 95% CI, –0.48 to –0.00). There were no between-group differences in baseline verbal or episodic memory.
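These class-versus-reference contrasts are what a regression model with one class held out as the reference produces. As a purely illustrative sketch – simulated data and hypothetical variable names, not the investigators’ analysis – the comparison can be set up like this:

```python
# Minimal sketch (simulated data, hypothetical names): comparing cognitive
# scores across discrimination-experience classes with class 2 as reference.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 445  # cohort size reported in the article

df = pd.DataFrame({
    # Discrimination-experience class; "2" is the reference class
    "dclass": rng.choice(["1", "2", "3"], size=n),
    # Standardized semantic memory score (simulated placeholder)
    "semantic_memory": rng.normal(0, 1, size=n),
})

# Treatment coding against class "2" yields one beta per non-reference
# class, mirroring the class 1 vs class 2 and class 3 vs class 2 contrasts.
model = smf.ols(
    "semantic_memory ~ C(dclass, Treatment(reference='2'))", data=df
).fit()
print(model.params)      # betas relative to class 2
print(model.conf_int())  # 95% CIs, analogous to those quoted above
```

With real data, the coefficients on the class 1 and class 3 terms would correspond to the betas and confidence intervals reported above.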
Dr. Perez, Dr. Manly, Dr. George, and Dr. Hill have no relevant financial disclosures.
FROM AAIC 2022
COVID smell loss tops disease severity as a predictor of long-term cognitive impairment
Loss of smell after COVID-19 predicts long-term cognitive impairment better than the severity of the initial illness, preliminary results of new research suggest.
The findings provide important insight into the long-term cognitive impact of COVID-19, said study investigator Gabriela Gonzalez-Alemán, PhD, professor at Pontifical Catholic University of Argentina, Buenos Aires.
The more information that can be gathered on factors increasing risks for this cognitive impact, “the better we can track it and begin to develop methods to prevent it,” she said.
The findings were presented at the Alzheimer’s Association International Conference.
Memory, attention problems
COVID-19 has infected more than 570 million people worldwide, and infection may result in long-term sequelae, including neuropsychiatric symptoms, said Dr. Gonzalez-Alemán.
In older adults, COVID-19 sequelae may resemble early Alzheimer’s disease, and the two conditions may share risk factors and blood biomarkers.
The new study highlighted 1-year results from a large, prospective cohort study from Argentina. To evaluate the long-term consequences of COVID-19 in older adults, researchers used measures recommended by the Alzheimer’s Association Consortium on Chronic Neuropsychiatric Sequelae of SARS-CoV-2 infection (CNS SC2).
Harmonizing definitions and methodologies for studying COVID-19’s impact on the brain allows consortium members to compare study results, said Dr. Gonzalez-Alemán.
The investigators used the health registry in the province of Jujuy, situated in the extreme northwestern part of Argentina. The registry includes all SARS-CoV-2 testing data for the entire region.
The investigators randomly invited adults aged 60 years and older from the registry to participate in the study. The current analysis included 766 adults aged 55-95 years (mean age 66.9 years; 57% female) with an average of 10.4 years of education. The education system in Argentina includes 12 years of school before university.
Investigators stratified subjects by polymerase chain reaction testing status. Of the total, 88.4% were infected with COVID and 11.6% were controls (subjects without COVID).
The neurocognitive assessment covered four cognitive domains – memory, attention, language, and executive function – and included an olfactory test that determined the degree of olfactory dysfunction. Cognitive impairment was defined as a z score below –2.
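To make that cutoff concrete, the sketch below flags impairment wherever a domain score falls more than 2 standard deviations below a normative mean. Everything in it – domain scores, normative means, and SDs – is a hypothetical illustration, not the study’s norms or code.

```python
# Minimal sketch (hypothetical scores and norms): flag cognitive impairment
# as a z score below -2 in any of the four assessed domains.
import pandas as pd

# Hypothetical raw scores for three participants
scores = pd.DataFrame({
    "memory":    [42, 28, 35],
    "attention": [50, 31, 44],
    "language":  [47, 39, 46],
    "executive": [38, 25, 41],
})

# Hypothetical normative (mean, SD) pairs per domain
norms = {
    "memory":    (40.0, 6.0),
    "attention": (45.0, 8.0),
    "language":  (44.0, 5.0),
    "executive": (36.0, 7.0),
}

# Convert raw scores to z scores against the norms
z = pd.DataFrame({d: (scores[d] - m) / sd for d, (m, sd) in norms.items()})

impaired = z < -2            # True where a domain falls below the cutoff
print(z.round(2))
print(impaired.sum(axis=1))  # number of impaired domains per participant
```

Counting impaired domains per participant is one way groupings like the single-domain and multiple-domain categories described below can be formed.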
Researchers divided participants into groups according to cognitive performance. These included normal cognition, memory-only impairment (single domain; 11.7%), impairment in attention and executive function without memory impairment (two domains; 8.3%), and multiple domain impairment (11.6%).
“Our participants showed a predominance of memory impairment as would be seen in Alzheimer’s disease,” noted Dr. Gonzalez-Alemán. “And a large group showed a combination of memory and attention problems.”
About 40% of the study sample – but no controls – had olfactory dysfunction.
“All the subjects that had a severe cognitive impairment also had anosmia [loss of smell],” said Dr. Gonzalez-Alemán. “We established an association between olfactory dysfunction and cognitive performance and impairment.”
The analysis showed that severity of anosmia, but not clinical status, significantly predicted cognitive impairment. “So, anosmia could be a good predictor of cognitive impairment after COVID-19 infection,” said Dr. Gonzalez-Alemán.
For individuals older than 60 years, cognitive impairment can be persistent, as can be olfactory dysfunction, she added.
Results of a 1-year phone survey showed that about 71.8% of subjects had received three vaccine doses and 24.9% had received two. About 12.5% of those with three doses were reinfected, compared with 23.3% of those with two doses.
Longest follow-up to date
Commenting on the research, Heather Snyder, PhD, vice president, medical and scientific relations at the Alzheimer’s Association, noted the study is “the longest follow-up we’ve seen” looking at the connection between persistent loss of smell and cognitive changes after a COVID-19 infection.
The study included a “fairly large” sample size and was “unique” in that it was set up in a part of the country with centralized testing, said Dr. Snyder.
The Argentinian group is among the most advanced of those connected to the CNS SC2, said Dr. Snyder.
Members of this Alzheimer’s Association consortium, said Dr. Snyder, regularly share updates of ongoing studies, which are at different stages and looking at various neuropsychiatric impacts of COVID-19. It is important to bring these groups together to determine what those impacts are “because no one group will be able to do this on their own,” she said. “We saw pretty early on that some individuals had changes in the brain, or changes in cognition, and loss of sense of smell or taste, which indicates there’s a connection to the brain.”
However, she added, “there’s still a lot we don’t know” about this connection.
The study was funded by the Alzheimer’s Association and FULTRA.
A version of this article first appeared on Medscape.com.
FROM AAIC 2022
Smell loss may be a biomarker of Alzheimer’s disease risk
Smell loss – particularly a rapid decline in the sense of smell – may be a biomarker of Alzheimer’s disease risk, according to new research findings.
Olfactory dysfunction is common in late life and well documented among people with Alzheimer’s disease. However, it was unknown whether faster olfactory decline predicts either onset of Alzheimer’s disease or structural brain changes associated with Alzheimer’s disease.
In a study published online in Alzheimer’s & Dementia, Jayant M. Pinto, MD, and his colleagues at the University of Chicago Medical Center reported that among older adults with normal cognition at baseline, people who experienced rapid loss of sense of smell were more likely to be subsequently diagnosed with mild cognitive impairment (MCI) or dementia, compared with those who did not.
Participants were recruited from Rush University’s Memory and Aging Project, a longitudinal cohort of older adults who undergo yearly cognitive and sensory exams, including a scratch test in which they identify 12 common smells. The Rush study “was ahead of the curve in looking at smell,” Dr. Pinto said in an interview. “It gave us a very valuable resource with which to attack these questions.”
Dr. Pinto has long investigated links between smell and accelerated aging; in 2014 his group published the finding that olfactory dysfunction could predict death within 5 years in older adults, and in 2018 they reported that olfactory dysfunction could predict dementia.
Smell and cognition over time
For the current study, Dr. Pinto said, “we were able to look at the question not just using a single point in time, but a more granular trajectory of smell loss. Measuring change year by year showed that the faster people’s sense of smell declined, the more likely they were to be diagnosed with MCI or Alzheimer’s disease.”
Dr. Pinto and his colleagues evaluated results from 515 adults (mean age 76.6 years; 78% female; 94% White) with no cognitive impairment and at least 3 years of normal results on smell tests at baseline. The subjects were followed for a mean of 8 years. One hundred subjects (19%) were diagnosed with MCI or dementia by the end of the study period. A subset of the cohort (n = 121) underwent structural magnetic resonance imaging (MRI) between their final smell tests and the study’s end. Of these, most still had normal cognition; 17 individuals had MCI.
Patients’ individual trajectories of smell loss were mapped as slopes. After adjusting for expected differences in age and sex, the investigators found steeper decline associated with greater risk of incident MCI or dementia (odds ratio, 1.89; 95% confidence interval, 1.26-2.90; P < .01). The risk was comparable to that of carrying an apo E ε4 allele, the key risk variant for late-onset Alzheimer’s disease, but was independent of apo E status. The association was strongest among subjects younger than 76 years.
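As a rough sketch of this slope-based approach – simulated data and hypothetical variable names throughout, not the study’s code – each participant’s yearly smell scores can be reduced to a least-squares slope, which then enters a logistic model of incident MCI or dementia adjusted for age and sex:

```python
# Minimal sketch (simulated data): per-subject olfactory-decline slopes
# feeding an age- and sex-adjusted logistic model of incident MCI/dementia.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_subj, n_years = 515, 8
years = np.arange(n_years)

# Hypothetical yearly scores on a 12-item scratch test, one row per subject
smell = (10
         - rng.gamma(1.0, 0.2, n_subj)[:, None] * years
         + rng.normal(0, 0.5, (n_subj, n_years)))

# Least-squares slope of each trajectory (more negative = faster decline)
slopes = np.array([np.polyfit(years, s, 1)[0] for s in smell])

age = rng.normal(76.6, 5.0, n_subj)
sex = rng.integers(0, 2, n_subj)
mci_dementia = rng.binomial(1, 0.19, n_subj)  # hypothetical outcome labels

# Sign-flip the slope so a steeper decline is a larger predictor value
X = sm.add_constant(np.column_stack([-slopes, age, sex]))
fit = sm.Logit(mci_dementia, X).fit(disp=0)

print(np.exp(fit.params))      # odds ratios, analogous in form to the reported OR
print(np.exp(fit.conf_int()))  # 95% confidence intervals
```

With the study’s actual data, the decline term would yield the kind of adjusted odds ratio (1.89 per unit of steeper slope) reported above.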
Olfactory decline and brain volume
Dr. Pinto and his colleagues, including lead author Rachel R. Pacyna, a 4th-year medical student at the University of Chicago, also sought to identify brain volume changes corresponding with olfactory decline and Alzheimer’s disease. The researchers hypothesized that brain regions not typically affected in Alzheimer’s disease would remain unchanged regardless of olfactory status, but that regions associated with smell and Alzheimer’s disease would show smaller volumes in those with olfactory decline.
Faster olfactory decline did predict lower gray matter volume in olfactory regions, even after controlling for apo E status and other known risk factors. Conversely, cognitively unimpaired patients who underwent MRI had more gray matter volume in primary olfactory and temporal brain regions compared with those with cognitive symptoms.
Taken together, the findings suggest that “change in sense of smell is better than looking at sense of smell at one time point,” Dr. Pinto commented. “There are other reasons people have impaired sense of smell: car accidents, COVID, other viruses and infections. But if you identify on a time course those who are starting to lose it faster, these are the people on whom we need to focus.”
Not yet diagnostic
More work needs to be done to establish thresholds for smell loss that could be useful in clinical or investigative settings as a marker of dementia risk, Dr. Pinto acknowledged. “Everyone gets their hearing tested; everyone gets their vision tested. It’s not as easy to get your sense of smell tested. But this study is telling people that if we were to start measuring it routinely, we could actually use it.”
Smell testing “could become a component of a diagnostic battery that includes things like genotyping and cerebrospinal fluid markers, but adds a little more information. It could be useful in clinical prevention trials to identify people at the highest risk, as smell loss presents quite a few years before MCI or Alzheimer’s disease.”
The investigators acknowledged that their findings need to be replicated in more diverse cohorts that better represent the Alzheimer’s population in the United States. Another limitation of their study, they said, was that the method used to calculate the rate of olfactory decline “was based on slope of measured time points assuming linearity, which may oversimplify the complexity of olfactory changes in normal aging and during the preclinical Alzheimer’s disease period.”
The study was funded by the National Institutes of Health. Dr. Pinto disclosed receiving consulting fees from Sanofi/Regeneron, Optinose, and Genentech not related to this work.
FROM ALZHEIMER’S & DEMENTIA
Geriatric-Centered Interdisciplinary Care Pathway Reduces Delirium in Hospitalized Older Adults With Traumatic Injury
Study 1 Overview (Park et al)
Objective: To examine whether implementation of a geriatric trauma clinical pathway is associated with reduced rates of delirium in older adults with traumatic injury.
Design: Retrospective case-control study of electronic health records.
Setting and participants: Eligible patients were persons aged 65 years or older who were admitted to the trauma service and did not undergo an operation. A Geriatric Trauma Care Pathway was developed by a multidisciplinary Stanford Quality Pathways team and formally launched on November 1, 2018. The clinical pathway was designed to incorporate geriatric best practices, which included order sets (eg, age-appropriate nonpharmacological interventions and pharmacological dosages), guidelines (eg, Institute for Healthcare Improvement Age-Friendly Health Systems 4M framework), automated consultations (comprehensive geriatric assessment), and escalation pathways executed by a multidisciplinary team (eg, pain, bowel, and sleep regulation). The clinical pathway began with admission to the emergency department (ED) (ie, automatic trigger of geriatric trauma care admission order set), daily multidisciplinary team meetings during acute hospitalization, and a transitional care team consultation for postdischarge follow-up or home visit.
Main outcome measures: The primary outcome was delirium as determined by a positive Confusion Assessment Method (CAM) score or a diagnosis of delirium by the clinical team. The secondary outcome was hospital length of stay (LOS). Process measures for pathway compliance (eg, achieving adequate pain control, early mobilization, advance care planning) were assessed. Outcome measures were compared between patients who underwent the Geriatric Trauma Care Pathway intervention (postimplementation group) vs patients who were treated prior to pathway implementation (baseline pre-implementation group).
Main results: Of the 859 eligible patients, 712 were included in the analysis (442 [62.1%] in the baseline pre-implementation group and 270 [37.9%] in the postimplementation group); mean (SD) age was 81.4 (9.1) years, and 394 (55.3%) were women. The injury mechanism was similar between groups, with falls being the most common cause of injury (247 [55.9%] in the baseline group vs 162 [60.0%] in the postimplementation group; P = .43). Injuries as measured by Injury Severity Score (ISS) were minor or moderate in both groups (261 [59.0%] in baseline group vs 168 [62.2%] in postimplementation group; P = .87). The adjusted odds ratio (OR) for delirium in the postimplementation group was lower compared to the baseline pre-implementation group (OR, 0.54; 95% CI, 0.37-0.80; P < .001). Measures of advance care planning in the postimplementation group improved, including more frequent goals-of-care documentation (53.7% in postimplementation group vs 16.7% in baseline group; P < .001) and a shortened time to first goals-of-care discussion upon presenting to the ED (36 hours in postimplementation group vs 50 hours in baseline group; P = .03).
Conclusion: Implementation of a multidisciplinary geriatric trauma clinical pathway for older adults with traumatic injury at a single level I trauma center was associated with reduced rates of delirium.
Study 2 Overview (Bryant et al)
Objective: To determine whether an interdisciplinary care pathway for frail trauma patients can improve in-hospital mortality, complications, and 30-day readmissions.
Design: Retrospective cohort study of frail patients.
Setting and participants: Eligible patients were persons aged 65 years or older who were admitted to the trauma service and survived more than 24 hours; admitted to and discharged from the trauma unit; and determined to be pre-frail or frail by a geriatrician’s assessment. A Frailty Identification and Care Pathway designed to reduce delirium and complications in frail older trauma patients was developed by a multidisciplinary team and implemented in 2016. The standardized evidence-based interdisciplinary care pathway included utilization of order sets and interventions for delirium prevention, early ambulation, bowel and pain regimens, nutrition and physical therapy consults, medication management, care-goal setting, and geriatric assessments.
Main outcome measures: The main outcomes were delirium as determined by a positive CAM score, major complications as defined by the Trauma Quality Improvement Project, in-hospital mortality, and 30-day hospital readmission. Outcome measures were compared between patients who underwent Frailty Identification and Care Pathway intervention (postintervention group) vs patients who were treated prior to pathway implementation (pre-intervention group).
Main results: A total of 269 frail patients were included in the analysis (125 in pre-intervention group vs 144 in postintervention group). Patient demographic and admission characteristics were similar between the 2 groups: mean age was 83.5 (7.1) years, 60.6% were women, and median ISS was 10 (interquartile range [IQR], 9-14). The injury mechanism was similar between groups, with falls accounting for 92.8% and 86.1% of injuries in the pre-intervention and postintervention groups, respectively (P = .07). In univariate analysis, the Frailty Identification and Care Pathway intervention was associated with a significant reduction in delirium (12.5% vs 21.6%, P = .04) and 30-day hospital readmission (2.7% vs 9.6%, P = .01) compared to patients in the pre-intervention group. However, rates of major complications (28.5% vs 28.0%, P = .93) and in-hospital mortality (4.2% vs 7.2%, P = .28) were similar between the pre-intervention and postintervention groups. In multivariate logistic regression models adjusted for patient characteristics (age, sex, race, ISS), patients in the postintervention group had lower delirium (OR, 0.44; 95% CI, 0.22-0.88; P = .02) and 30-day hospital readmission (OR, 0.25; 95% CI, 0.07-0.84; P = .02) rates compared to those in the pre-intervention group.
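The adjusted odds ratios above come from multivariable logistic regression. The sketch below shows the general form of such a model – the data frame, column names, and values are hypothetical placeholders, not study data; only the covariate set (age, sex, race, ISS) follows the study description:

```python
# Minimal sketch (hypothetical data): logistic regression of delirium on
# care-pathway period, adjusted for age, sex, race, and ISS.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 269  # analysis cohort size reported above

df = pd.DataFrame({
    "delirium": rng.binomial(1, 0.17, n),   # 1 = delirium during stay
    "post_pathway": rng.integers(0, 2, n),  # 1 = treated after pathway rollout
    "age": rng.normal(83.5, 7.1, n),
    "female": rng.integers(0, 2, n),
    "race": rng.choice(["A", "B", "C"], n), # placeholder categories
    "iss": rng.integers(9, 15, n),          # Injury Severity Score
})

fit = smf.logit(
    "delirium ~ post_pathway + age + female + C(race) + iss", data=df
).fit(disp=0)

# Exponentiated coefficients are odds ratios; with real data the
# post_pathway term would play the role of the reported OR of 0.44.
print(np.exp(fit.params))
print(np.exp(fit.conf_int()))  # 95% confidence intervals
```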
Conclusion: Implementation of an interdisciplinary care protocol for frail geriatric trauma patients significantly decreased their risks for in-hospital delirium and 30-day hospital readmission.
Commentary
Traumatic injuries in older adults are associated with higher morbidity and mortality compared to younger patients, with falls and motor vehicle accidents accounting for a majority of these injuries. Astoundingly, up to one-third of this vulnerable population presenting to hospitals with an ISS greater than 15 may die during hospitalization.1 As a result, a large number of studies and clinical trials have focused on interventions that are designed to reduce fall risks, and hence reduce adverse consequences of traumatic injuries that may arise after falls.2 However, this emphasis on falls prevention has overshadowed a need to develop effective geriatric-centered clinical interventions that aim to improve outcomes in older adults who present to hospitals with traumatic injuries. Furthermore, frailty—a geriatric syndrome indicative of an increased state of vulnerability and predictive of adverse outcomes such as delirium—is highly prevalent in older patients with traumatic injury.3 Thus, there is an urgent need to develop novel, hospital-based, traumatic injury–targeting strategies that incorporate a thoughtful redesign of the care framework that includes evidence-based interventions for geriatric syndromes such as delirium and frailty.
The study reported by Park et al (Study 1) represents the latest effort to evaluate inpatient management strategies designed to improve outcomes in hospitalized older adults who have sustained traumatic injury. Through the implementation of a novel multidisciplinary Geriatric Trauma Care Pathway that incorporates geriatric best practices, this intervention was found to be associated with a 46% lower risk of in-hospital delirium. Because of the inclusion of all age-eligible patients across all strata of traumatic injuries, rather than preselecting for those at the highest risk for poor clinical outcomes, the benefits of this intervention extend to those with minor or moderate injury severity. Furthermore, the improvement in delirium (ie, the primary outcome) is particularly meaningful given that delirium is one of the most common hospital-associated complications that increase hospital LOS, discharge to an institution, and mortality in older adults. Finally, the study’s observed reduced time to a first goals-of-care discussion and increased frequency of goals-of-care documentation after intervention should not be overlooked. The improvements in these 2 process measures are highly significant given that advanced care planning, an intervention that helps to align patients’ values, goals, and treatments, is completed at substantially lower rates in older adults in the acute hospital setting.4
Similarly, in an earlier published study, Bryant and colleagues (Study 2) also show that a geriatric-focused interdisciplinary trauma care pathway is associated with delirium risk reduction in hospitalized older trauma patients. Much like Study 1, the Frailty Identification and Care Pathway utilized in Study 2 is an evidence-based interdisciplinary care pathway that includes the use of geriatric assessments, order sets, and geriatric best practices. Moreover, its exclusive inclusion of pre-frail and frail older patients (ie, those at higher risk for poor outcomes) with moderate injury severity (median ISS of 10 [IQR, 9-14]) suggests that this type of care pathway benefits hospitalized older trauma patients, who are particularly vulnerable to adverse complications such as delirium. Moreover, the successful utilization of the FRAIL questionnaire, a validated frailty screening tool, by surgical residents in the ED to initiate this care pathway demonstrates the feasibility of its use in expediting frailty screening in older patients in trauma care.
Application for Clinical Practice and System Implementation
Findings from the 2 studies discussed in this review indicate that implementation of interdisciplinary clinical care pathways predicated on evidence-based geriatric principles and best practices is a promising approach to reduce delirium in hospitalized older trauma patients. These studies have helped to lay the groundwork in outlining the roadmaps (eg, processes and infrastructures) needed to create such clinical pathways. These key elements include: (1) integration of a multidisciplinary committee (eg, representation from trauma, emergency, and geriatric medicine, nursing, physical and occupational therapy, pharmacy, social work) in pathway design and implementation; (2) adaption of evidence-based geriatric best practices (eg, the Institute for Healthcare Improvement Age-Friendly Health System 4M framework [medication, mentation, mobility, what matters]) to prioritize interventions and to design a pathway that incorporates these features; (3) incorporation of comprehensive geriatric assessment by interdisciplinary providers; (4) utilization of validated clinical instruments to assess physical and cognitive functions, frailty, delirium, and social determinants of health; (5) modification of electronic health record systems to encompass order sets that incorporate evidence-based, nonpharmacological and pharmacological interventions to manage symptoms (eg, delirium, pain, bowel movement, sleep, immobility, polypharmacy) essential to quality geriatric care; and (6) integration of patient and caregiver preferences via goals-of-care discussions and corresponding documentation and communication of these goals.
Additionally, these 2 studies imparted some strategies that may facilitate the implementation of interdisciplinary clinical care pathways in trauma care. Examples of such facilitators include: (1) collaboration with champions within each specialty to reinforce education and buy-in; (2) creation of automatically triggered order sets upon patient presentation to the ED that unites distinct features of clinical pathways; (3) adaption and reorganization of existing hospital infrastructures and resources to meet the needs of clinical pathways implementation (eg, utilizing information technology resources to develop electronic health record order sets; using quality department to develop clinical pathway guidelines and electronic outcome dashboards); and (4) development of individualized patient and caregiver education materials based on care needs (eg, principles of delirium prevention and preservation of mobility during hospitalization) to prepare and engage these stakeholders in patient care and recovery.
Practice Points
- A geriatric interdisciplinary care model can be effectively applied to the management of acute trauma in older patients.
- Interdisciplinary clinical pathways should incorporate geriatric best practices and guidelines and age-appropriate order sets to prioritize and integrate care.
—Fred Ko, MD, MS
1. Hashmi A, Ibrahim-Zada I, Rhee P, et al. Predictors of mortality in geriatric trauma patients: a systematic review and meta-analysis. J Trauma Acute Care Surg. 2014;76(3):894-901. doi:10.1097/TA.0b013e3182ab0763
2. Hopewell S, Adedire O, Copsey BJ, et al. Multifactorial and multiple component interventions for preventing falls in older people living in the community. Cochrane Database Syst Rev. 2018;7(7):CD012221. doi:10.1002/14651858.CD012221.pub2
3. Joseph B, Pandit V, Zangbar B, et al. Superiority of frailty over age in predicting outcomes among geriatric trauma patients: a prospective analysis. JAMA Surg. 2014;149(8):766-772. doi:10.1001/jamasurg.2014.296
4. Hopkins SA, Bentley A, Phillips V, Barclay S. Advance care plans and hospitalized frail older adults: a systematic review. BMJ Support Palliat Care. 2020;10(2):164-174. doi:10.1136/bmjspcare-2019-002093
Study 1 Overview (Park et al)
Objective: To examine whether implementation of a geriatric trauma clinical pathway is associated with reduced rates of delirium in older adults with traumatic injury.
Design: Retrospective case-control study of electronic health records.
Setting and participants: Eligible patients were persons aged 65 years or older who were admitted to the trauma service and did not undergo an operation. A Geriatric Trauma Care Pathway was developed by a multidisciplinary Stanford Quality Pathways team and formally launched on November 1, 2018. The clinical pathway was designed to incorporate geriatric best practices, which included order sets (eg, age-appropriate nonpharmacological interventions and pharmacological dosages), guidelines (eg, Institute for Healthcare Improvement Age-Friendly Health systems 4M framework), automated consultations (comprehensive geriatric assessment), and escalation pathways executed by a multidisciplinary team (eg, pain, bowel, and sleep regulation). The clinical pathway began with admission to the emergency department (ED) (ie, automatic trigger of geriatric trauma care admission order set), daily multidisciplinary team meetings during acute hospitalization, and a transitional care team consultation for postdischarge follow-up or home visit.
Main outcome measures: The primary outcome was delirium as determined by a positive Confusion Assessment Method (CAM) score or a diagnosis of delirium by the clinical team. The secondary outcome was hospital length of stay (LOS). Process measures for pathway compliance (eg, achieving adequate pain control, early mobilization, advance care planning) were assessed. Outcome measures were compared between patients who underwent the Geriatric Trauma Care Pathway intervention (postimplementation group) vs patients who were treated prior to pathway implementation (baseline pre-implementation group).
Main results: Of the 859 eligible patients, 712 were included in the analysis (442 [62.1%] in the baseline pre-implementation group and 270 [37.9%] in the postimplementation group); mean (SD) age was 81.4 (9.1) years, and 394 (55.3%) were women. The injury mechanism was similar between groups, with falls being the most common cause of injury (247 [55.9%] in the baseline group vs 162 [60.0%] in the postimplementation group; P = .43). Injuries as measured by Injury Severity Score (ISS) were minor or moderate in both groups (261 [59.0%] in baseline group vs 168 [62.2%] in postimplementation group; P = .87). The adjusted odds ratio (OR) for delirium in the postimplementation group was lower compared to the baseline pre-implementation group (OR, 0.54; 95% CI, 0.37-0.80; P < .001). Measures of advance care planning in the postimplementation group improved, including more frequent goals-of-care documentation (53.7% in postimplementation group vs 16.7% in baseline group; P < .001) and a shortened time to first goals-of-care discussion upon presenting to the ED (36 hours in postimplementation group vs 50 hours in baseline group; P = .03).
Conclusion: Implementation of a multidisciplinary geriatric trauma clinical pathway for older adults with traumatic injury at a single level I trauma center was associated with reduced rates of delirium.
Study 2 Overview (Bryant et al)
Objective: To determine whether an interdisciplinary care pathway for frail trauma patients can reduce in-hospital mortality, complications, and 30-day readmissions.
Design: Retrospective cohort study of frail patients.
Setting and participants: Eligible patients were persons aged 65 years or older who were admitted to the trauma service and survived more than 24 hours; admitted to and discharged from the trauma unit; and determined to be pre-frail or frail by a geriatrician’s assessment. A Frailty Identification and Care Pathway designed to reduce delirium and complications in frail older trauma patients was developed by a multidisciplinary team and implemented in 2016. The standardized evidence-based interdisciplinary care pathway included utilization of order sets and interventions for delirium prevention, early ambulation, bowel and pain regimens, nutrition and physical therapy consults, medication management, care-goal setting, and geriatric assessments.
Main outcome measures: The main outcomes were delirium as determined by a positive CAM score, major complications as defined by the Trauma Quality Improvement Project, in-hospital mortality, and 30-day hospital readmission. Outcome measures were compared between patients who underwent Frailty Identification and Care Pathway intervention (postintervention group) vs patients who were treated prior to pathway implementation (pre-intervention group).
Main results: A total of 269 frail patients were included in the analysis (125 in the pre-intervention group vs 144 in the postintervention group). Patient demographic and admission characteristics were similar between the 2 groups: mean (SD) age was 83.5 (7.1) years, 60.6% were women, and median ISS was 10 (interquartile range [IQR], 9-14). The injury mechanism was similar between groups, with falls accounting for 92.8% and 86.1% of injuries in the pre-intervention and postintervention groups, respectively (P = .07). In univariate analysis, the Frailty Identification and Care Pathway intervention was associated with a significant reduction in delirium (12.5% vs 21.6%; P = .04) and 30-day hospital readmission (2.7% vs 9.6%; P = .01) compared with the pre-intervention group. However, rates of major complications (28.5% vs 28.0%; P = .93) and in-hospital mortality (4.2% vs 7.2%; P = .28) were similar between groups. In multivariate logistic regression models adjusted for patient characteristics (age, sex, race, ISS), patients in the postintervention group had lower rates of delirium (OR, 0.44; 95% CI, 0.22-0.88; P = .02) and 30-day hospital readmission (OR, 0.25; 95% CI, 0.07-0.84; P = .02) than those in the pre-intervention group.
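As an illustration of how such adjusted ORs are typically derived, the sketch below fits a logistic regression of delirium on group assignment plus the covariates the authors report adjusting for. This is a generic sketch, not the authors' code: the data are simulated stand-ins, and the variable names and use of statsmodels are assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in dataset; columns mirror the covariates the
# authors report adjusting for (age, sex, race, ISS).
rng = np.random.default_rng(0)
n = 269
df = pd.DataFrame({
    "delirium": rng.integers(0, 2, n),
    "post_pathway": rng.integers(0, 2, n),  # 1 = postintervention group
    "age": rng.normal(83.5, 7.1, n),
    "female": rng.integers(0, 2, n),
    "white": rng.integers(0, 2, n),         # simplified race indicator
    "iss": rng.integers(4, 25, n),
})

model = smf.logit("delirium ~ post_pathway + age + female + white + iss",
                  data=df).fit(disp=False)

# Exponentiating the group coefficient yields the adjusted OR;
# exponentiating its confidence bounds yields the 95% CI.
or_post = np.exp(model.params["post_pathway"])
ci_low, ci_high = np.exp(model.conf_int().loc["post_pathway"])
print(f"adjusted OR {or_post:.2f} (95% CI, {ci_low:.2f}-{ci_high:.2f})")
```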
Conclusion: Implementation of an interdisciplinary care protocol for frail geriatric trauma patients significantly decreased their risks for in-hospital delirium and 30-day hospital readmission.
Commentary
Traumatic injuries in older adults are associated with higher morbidity and mortality than in younger patients, with falls and motor vehicle accidents accounting for the majority of these injuries. Astoundingly, up to one-third of this vulnerable population presenting to hospitals with an ISS greater than 15 may die during hospitalization.1 As a result, a large number of studies and clinical trials have focused on interventions designed to reduce fall risk, and hence the adverse consequences of traumatic injuries that may arise after falls.2 However, this emphasis on falls prevention has overshadowed the need to develop effective geriatric-centered clinical interventions that improve outcomes in older adults who present to hospitals with traumatic injuries. Furthermore, frailty—a geriatric syndrome indicative of an increased state of vulnerability and predictive of adverse outcomes such as delirium—is highly prevalent in older patients with traumatic injury.3 Thus, there is an urgent need for novel, hospital-based, traumatic injury–targeting strategies that thoughtfully redesign the care framework to include evidence-based interventions for geriatric syndromes such as delirium and frailty.
The study reported by Park et al (Study 1) represents the latest effort to evaluate inpatient management strategies designed to improve outcomes in hospitalized older adults who have sustained traumatic injury. Implementation of a novel multidisciplinary Geriatric Trauma Care Pathway incorporating geriatric best practices was associated with 46% lower odds of in-hospital delirium. Because the study included all age-eligible patients across all strata of traumatic injury, rather than preselecting those at highest risk for poor clinical outcomes, the benefits of this intervention extend to patients with minor or moderate injury severity. Furthermore, the improvement in delirium (ie, the primary outcome) is particularly meaningful given that delirium is one of the most common hospital-associated complications and increases hospital LOS, discharge to an institution, and mortality in older adults. Finally, the reduced time to a first goals-of-care discussion and the increased frequency of goals-of-care documentation observed after implementation should not be overlooked. The improvements in these 2 process measures are highly significant given that advance care planning, an intervention that helps align patients' values, goals, and treatments, is completed at substantially lower rates among older adults in the acute hospital setting.4
Similarly, in an earlier study, Bryant and colleagues (Study 2) showed that a geriatric-focused interdisciplinary trauma care pathway is associated with a reduction in delirium risk in hospitalized older trauma patients. Much like the pathway in Study 1, the Frailty Identification and Care Pathway used in Study 2 is an evidence-based interdisciplinary care pathway that includes geriatric assessments, order sets, and geriatric best practices. Its exclusive inclusion of pre-frail and frail older patients (ie, those at higher risk for poor outcomes) with moderate injury severity (median ISS of 10 [IQR, 9-14]) suggests that this type of care pathway benefits hospitalized older trauma patients who are particularly vulnerable to adverse complications such as delirium. Moreover, the successful use of the FRAIL questionnaire, a validated frailty screening tool, by surgical residents in the ED to initiate this care pathway demonstrates the feasibility of expedited frailty screening in older patients in trauma care.
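The FRAIL questionnaire itself is simple enough to score at the bedside: five yes/no items (Fatigue, Resistance, Ambulation, Illnesses, Loss of weight), with 0 points classified as robust, 1-2 as pre-frail, and 3-5 as frail. A minimal scoring sketch follows; the function and argument names are illustrative, not taken from either study's implementation.

```python
def frail_score(fatigue: bool, resistance: bool, ambulation: bool,
                illnesses: bool, weight_loss: bool) -> tuple[int, str]:
    """Score the 5-item FRAIL scale and classify the result:
    0 = robust, 1-2 = pre-frail, 3-5 = frail."""
    score = sum([fatigue, resistance, ambulation, illnesses, weight_loss])
    if score == 0:
        category = "robust"
    elif score <= 2:
        category = "pre-frail"
    else:
        category = "frail"
    return score, category

# Example: fatigue, difficulty with stairs, and multiple illnesses.
print(frail_score(True, True, False, True, False))  # (3, 'frail')
```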
Application for Clinical Practice and System Implementation
Findings from the 2 studies discussed in this review indicate that implementation of interdisciplinary clinical care pathways predicated on evidence-based geriatric principles and best practices is a promising approach to reducing delirium in hospitalized older trauma patients. These studies help lay the groundwork by outlining the roadmap (eg, processes and infrastructure) needed to create such clinical pathways. Key elements include: (1) integration of a multidisciplinary committee (eg, representation from trauma, emergency, and geriatric medicine, nursing, physical and occupational therapy, pharmacy, and social work) in pathway design and implementation; (2) adaptation of evidence-based geriatric best practices (eg, the Institute for Healthcare Improvement Age-Friendly Health Systems 4Ms framework [medication, mentation, mobility, what matters]) to prioritize interventions and to design a pathway that incorporates these features; (3) incorporation of comprehensive geriatric assessment by interdisciplinary providers; (4) use of validated clinical instruments to assess physical and cognitive function, frailty, delirium, and social determinants of health; (5) modification of electronic health record systems to include order sets that incorporate evidence-based nonpharmacological and pharmacological interventions for problems (eg, delirium, pain, bowel movement, sleep, immobility, polypharmacy) essential to quality geriatric care; and (6) integration of patient and caregiver preferences via goals-of-care discussions, with corresponding documentation and communication of these goals.
Additionally, these 2 studies impart strategies that may facilitate implementation of interdisciplinary clinical care pathways in trauma care. Examples of such facilitators include: (1) collaboration with champions within each specialty to reinforce education and buy-in; (2) creation of order sets that are triggered automatically upon patient presentation to the ED and that unite the distinct features of the clinical pathway (a simple sketch of such trigger logic follows this paragraph); (3) adaptation and reorganization of existing hospital infrastructure and resources to meet the needs of clinical pathway implementation (eg, using information technology resources to develop electronic health record order sets; using the quality department to develop clinical pathway guidelines and electronic outcome dashboards); and (4) development of individualized patient and caregiver education materials based on care needs (eg, principles of delirium prevention and preservation of mobility during hospitalization) to prepare and engage these stakeholders in patient care and recovery.
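To make facilitator (2) concrete, the trigger behind an automatically fired order set can be expressed as a simple eligibility rule evaluated at ED admission. The sketch below is hypothetical: the criteria mirror Study 1's inclusion criteria (age 65 or older, nonoperative admission to the trauma service), but the function and field names are not from either study's actual EHR build.

```python
from dataclasses import dataclass

@dataclass
class EDAdmission:
    age: int
    service: str       # admitting service, eg "trauma"
    operative: bool    # planned operative management

def should_trigger_geriatric_pathway(adm: EDAdmission) -> bool:
    """Fire the geriatric trauma admission order set for patients aged
    65 or older admitted nonoperatively to the trauma service."""
    return adm.age >= 65 and adm.service == "trauma" and not adm.operative

# Example: an 81-year-old nonoperative trauma admission qualifies.
print(should_trigger_geriatric_pathway(EDAdmission(81, "trauma", False)))  # True
```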
Practice Points
- A geriatric interdisciplinary care model can be effectively applied to the management of acute trauma in older patients.
- Interdisciplinary clinical pathways should incorporate geriatric best practices and guidelines and age-appropriate order sets to prioritize and integrate care.
—Fred Ko, MD, MS
1. Hashmi A, Ibrahim-Zada I, Rhee P, et al. Predictors of mortality in geriatric trauma patients: a systematic review and meta-analysis. J Trauma Acute Care Surg. 2014;76(3):894-901. doi:10.1097/TA.0b013e3182ab0763
2. Hopewell S, Adedire O, Copsey BJ, et al. Multifactorial and multiple component interventions for preventing falls in older people living in the community. Cochrane Database Syst Rev. 2018;7(7):CD012221. doi:10.1002/14651858.CD012221.pub2
3. Joseph B, Pandit V, Zangbar B, et al. Superiority of frailty over age in predicting outcomes among geriatric trauma patients: a prospective analysis. JAMA Surg. 2014;149(8):766-772. doi:10.1001/jamasurg.2014.296
4. Hopkins SA, Bentley A, Phillips V, Barclay S. Advance care plans and hospitalized frail older adults: a systematic review. BMJ Support Palliat Care. 2020;10(2):164-174. doi:10.1136/bmjspcare-2019-002093