Does Headache Surgery Really Work? Neurologists Remain Unconvinced

Jeffrey E. Janis, MD, is on a mission. The professor of plastic surgery, surgery, neurosurgery, and neurology at The Ohio State University Wexner Medical Center, Columbus, Ohio, wants to convince neurologists of the safety and efficacy of nerve decompression surgery for treatment-resistant headache. However, many neurologists remain unconvinced.

“Headache surgery is a viable alternative for complex patients who have failed medical management and who could benefit from this treatment approach. There’s 24 years of evidence behind this surgical technique across hundreds of different studies with different study designs,” Dr. Janis said. 

Yet this treatment approach — surgery on peripheral nerves rather than the brain or spinal cord — hasn’t garnered much support from neurologists. A scan of the agenda of topics at the recently held 2024 annual meeting of the American Headache Society showed few if any studies or presentations on this topic. And neurologists this news organization spoke to said they believe the surgery is experimental and unproven.

Experts do agree drugs don’t work for all patients with migraines. Up to 30% of patients don’t respond to the “laundry list of medications” available to treat the condition, said Dr. Janis.

Many patients have also tried, and failed, alternative treatment approaches such as massage, acupuncture, craniosacral therapy, transdermal patches, electrical stimulation, cryoablation, neurostimulation, and radiofrequency ablation.

If nothing else works, is surgery for headaches the answer?
 

Long-Held Theory

The idea that pinched, irritated, or compressed peripheral nerves can trigger migraine attacks has been around for nearly 25 years. Studies suggest that in addition to migraine, nerve compression can lead to other headache conditions, including occipital neuralgia, supraorbital neuralgia, and post-traumatic headaches.

This has led to the development of surgical techniques to deactivate various compression trigger sites — what Dr. Janis calls “pinch points” — which could involve muscles, bone, fascia, blood vessels, or scar tissue from prior trauma or surgery.

The procedure is predominantly performed by plastic surgeons, but to a lesser degree by neurosurgeons and ear, nose, and throat specialists.

Target nerves in surgical interventions include those in the frontal region of the head above the eye, temporal region, neck region, and nasal region. Affected areas are usually identified either through patient self-reports or by using a nerve block agent such as lidocaine or Botox at specific points, Dr. Janis noted. If pain subsides after an injection, that location is marked as a target.

One of the barriers to referring complicated patients for surgery is that neurologists evaluating migraine treatments “speak a different language” than surgeons performing the procedure, said Dr. Janis.

Neurologists tend to focus on reduction in monthly migraine days (MMD), while surgeons typically use the Migraine Headache Index that incorporates the frequency, intensity, and duration of migraine attacks.

“Rather than try to convince somebody to speak a different language, we thought, why don’t we just learn their language so we can build bridges and take down barriers,” said Dr. Janis, coauthor of a systematic review and meta-analysis published online recently in Plastic and Reconstructive Surgery.

Investigators examined 19 studies in the review, including five randomized controlled trials (RCTs), published from January 2020 to September 2023, with a total of 1603 participants who were mostly female and ranged in age from 9 to 72 years. Study follow-ups extended from 6 to 38 months. All but three studies were carried out in the United States, and six different compression sites were addressed during surgery.

Investigators found that across studies and by a number of measures, migraine frequency and severity improved after surgery.

Monthly migraine days decreased by 36%-92% and the number of overall migraine attacks per month dropped 25%-87.5%. Patients also reported decreases in attack duration of 41%-75% and intensity of 28%-82% across studies.

“Even using the neurologist-standard language of monthly migraine days, this surgery works,” said Dr. Janis. “Now this is documented both in the surgical literature and the nonsurgical literature.”

The most common complications were ecchymosis, hair loss or thinning, itching, dryness, and rhinorrhea, all of which Dr. Janis described as “fairly minor.” Major complications such as intraoperative bleeding and wound dehiscence were rare, occurring in 1% or less of participants.

‘One And Done?’

These surgeries are usually done on an outpatient basis and generally offer long-term results, Dr. Janis said.

“The idea is one and done,” he said. “The literature around this type of surgery says that whatever type of effect you get at 1 year is likely to be permanent.”

The American Society of Plastic Surgeons agrees. A 2018 position paper developed by experts and commissioned by the society reports that the intervention is safe and effective for appropriate patients, based on a comprehensive literature search and review of a large body of peer-reviewed scientific evidence.

“There is substantial, extensively replicated clinical data that demonstrates a significant reduction in [migraine headache] symptoms and frequency (even complete elimination of headache pain) following trigger site surgery,” the authors noted.

Pamela Blake, MD, a neurologist, board-certified headache specialist, and medical director at the Headache Center of River Oaks, Houston, is a proponent of what she said can be “lifesaving” headache surgery.

“If a doctor told you that you can either treat this problem with medications that you’ll need to take for the rest of your life or you can have a surgical procedure as an outpatient that has extremely low risk and has, in my experience, a 75% chance of reducing or eliminating your pain, you probably would be interested in surgery,” she said.
 

Continued Skepticism

However, other neurologists and clinicians appear doubtful about this intervention, including Hans-Christoph Diener, MD, PhD, professor of neurology and director, Essen Headache Centre, University of Duisburg-Essen in Germany.

During a debate on the topic a decade ago at the International Headache Congress, Dr. Diener argued that, as migraine is a complex multigene-related disorder of the brain, it doesn’t make sense that surgery would affect the epigenetics of 22 different genes.

Recently, he said that his views have not changed.

The topic remains controversial, and some neurologists are uncomfortable even openly discussing the procedure. Two clinicians who previously commented on this article later asked not to be included.

One neurologist, who asked to remain anonymous, said that Dr. Janis’s review article is “merely a review collecting 19 studies over the previous 10-plus years.”

Other limitations cited by this neurologist are the lack of consistency in procedures among the various studies and the inclusion of only four RCTs, the most recent of which was published 8 years ago, suggesting “the study was probably done closer to 9 or 10 years ago,” the neurologist said.

Dr. Blake suggested some neurologists’ reluctance could be due to limited background on the procedure, which she said isn’t widely discussed at headache meetings and is covered mostly in plastic surgery journals, not neurology literature. Access to surgery is further limited by a lack of specialists who perform the procedure and inconsistent insurance coverage.

A closer collaboration between neurologists and surgeons who perform the procedure could benefit patients, Dr. Blake noted.

“The headache doctor’s role is to identify who’s a candidate for surgery, who meets the criteria for nerve compression, and then follow that patient postoperatively, managing their medications, although usually we get them off their medications,” she added.

From Dr. Janis’s perspective, things are starting to change.

“I’m definitely seeing a greater comfort level among neurologists who are understanding where this sits in the algorithm for treatment, especially for complicated patients,” he said.

Dr. Janis receives royalties from Thieme and Springer Publishing. Dr. Blake reported no relevant conflicts. Dr. Diener received research support from the German Research Council; serves on the editorial boards of Cephalalgia, Lancet Neurology, and Drugs; and has received honoraria for participation in clinical trials, contribution to advisory boards, or oral presentations from AbbVie, Lilly, Lundbeck, Novartis, Pfizer, Teva, Weber & Weber, and WebMD.

A version of this article appeared on Medscape.com.

Maternal Obesity Linked to Sudden Infant Death

More than 5% of cases of sudden infant death may be linked to maternal obesity, new research showed.

“When a parent has a child that dies of sudden unexplained infant death [SUID], it’s extremely devastating,” said Jan-Marino Ramirez, PhD, the Zain Nadella Endowed Chair in Pediatric Neurosciences at the University of Washington, Seattle, and director of the Center for Integrative Brain Research at Seattle Children’s Hospital. “And the most devastating problem is that there’s no clear answer. Understanding the mechanisms will help parents understand.”

The study was published online in JAMA Pediatrics.

In the United States, approximately 3500 cases of SUID are reported yearly. After educational campaigns in the 1990s demonstrating safe infant sleep positions, rates of these fatalities dropped but have since plateaued.
 

Maternal Obesity During Pregnancy

Rates of maternal obesity are increasing globally, and more than half of women of reproductive age are overweight or obese.

“Maternal obesity before pregnancy affects placental development, gene expression, and has long-term implications,” said Patrick Catalano, MD, a professor in residence at the Departments of Reproductive Endocrinology and Obstetrics and Gynecology at Massachusetts General Hospital and Harvard Medical School in Boston.

Maternal obesity is a well-documented risk factor for adverse outcomes of pregnancy including stillbirth, preterm birth, and admission to the neonatal intensive care unit. Swedish researchers in 2014 reported that maternal obesity was linked to an increased risk for infant mortality that rose with body mass index (BMI), but that study did not look specifically at SUID.

For their new study, Dr. Ramirez and colleagues looked at data from all live births in the United States from 2015 to 2019 recorded by the Centers for Disease Control and Prevention and the National Center for Health Statistics. Of the 18,857,694 live births occurring at 28 weeks of gestation or later, 16,545 infants died of a sudden, unexplained cause.

Rates of SUID in babies born to mothers with obesity increased in a statistically significant, dose-dependent manner relative to babies born to normal-weight mothers. The unadjusted absolute risk for SUID was 0.74 cases per 1000 births for normal-weight mothers, rising to 0.99 per 1000 at a BMI of 30-35, 1.17 at a BMI of 35-40, and 1.47 at a BMI of 40 or higher.

After adjustment for maternal age, race, ethnicity, and level of education, the adjusted odds ratio for a case of SUID was 1.39 among women with the highest levels of obesity (95% CI, 1.31-1.47), according to the researchers.

While the study revealed an association between maternal obesity and SUID, the basis for this connection remains unknown, the investigators noted. One possibility is that obesity increases the risk for obstructive sleep apnea, which can result in intermittent hypoxia. That, in turn, causes oxidative stress, which may affect the fetus in ways that eventually lead to SUID in the infant.

An accompanying editorial by Jacqueline Maya, MD; Marie-France Hivert, MD, MMSc; and Lydia Shook, MD, of Massachusetts General Hospital and Harvard Medical School, suggested that SUID is unlikely to be directly influenced by high maternal BMI itself but rather by metabolic disturbances related to obesity, such as inflammation, insulin resistance, and abnormal lipid metabolism. Epigenetics may also play a role.

“We believe the evidence for this study of an association between prepregnancy obesity and SUID is a call to action for the scientific and medical community to better understand the complex interplay of biological, social, and behavioral factors that may lead to SUID, a devastating complication that no family should experience,” the authors of the editorial wrote.

Dr. Ramirez stressed the importance of not instilling guilt in parents, because many factors in SUID, such as genetics, cannot be controlled.

“We are far from saying a baby died because you were obese; that’s an important message to parents,” he said. What he sees as important, rather, is using this new research to elucidate further mechanisms that may allow for more targeted interventions: “If we discover that it’s due to, for example, sleep apnea, that’s something we can prevent.”

The researchers reported no conflicts of interest.

A version of this article first appeared on Medscape.com.

New Drugs Could Reduce AMD Treatment Burden

STOCKHOLM — Current treatments for age-related macular degeneration (AMD) have proved effective and safe. However, these lifelong therapies involve frequent ocular injections. “It can be nerve-wracking for patients about to embark on this journey,” Lisa Olmos de Koo, MD, an ophthalmologist at the University of Washington Eye Institute at Harborview, Seattle, told this news organization.

At the American Society of Retina Specialists (ASRS) 2024 annual meeting, researchers from around the world presented results from clinical studies aimed at reducing the burden of AMD treatment by:

  • Identifying patients at a higher risk for degeneration and vision loss who will be more likely to respond to treatment
  • Developing gene therapies that promise to drastically reduce or eliminate the need for injections
  • Testing novel drugs with mechanisms of action that use different pathways than currently available medications, offering patients more options and longer-lasting treatments

“It’s exciting to see the broad range of novel approaches in AMD treatments,” Dimitra Skondra, MD, PhD, a retina specialist at the University of Chicago, told this news organization.
 

Whom to Treat

Anti–vascular endothelial growth factor (anti-VEGF) therapies shook the AMD treatment scene when they were introduced in the early 2000s. “It was incredible,” Dr. Olmos de Koo said. Patients with wet AMD could finally see their vision improve with each injection. “It was a great motivator to begin therapy.”

However, patients with the advanced form of dry AMD involving geographic atrophy (GA) have had less luck. Pegcetacoplan and avacincaptad pegol, the only US Food and Drug Administration (FDA)–approved treatments for GA, slow the progression of the disease but do not restore vision. In fact, vision continues to decline. “Patients want to understand if their condition is worsening and whether treatment is necessary,” Dr. Olmos de Koo said.

Researchers are developing tools to help clinicians identify lesions that are more likely to grow and reach the fovea, causing vision loss.

For example, Cleveland Clinic’s Katherine Talcott, MD, presented an analysis of the GATHER1 and GATHER2 clinical trials that showed that spectral domain optical coherence tomography can be used to examine the integrity of the ellipsoid zone for predicting GA growth and treatment response. The retina’s ellipsoid zone contains densely packed mitochondria within the inner segments of the photoreceptor cells and plays a critical role in visual function.

Dr. Talcott and her team found that more severe baseline damage of the ellipsoid zone was associated with a faster growth rate of GA.

Another analysis of the same trials, presented by Dilraj Grewal, MD, associate professor of ophthalmology, vitreoretinal surgery, and uveitis at Duke Eye Center, Durham, North Carolina, showed that intravitreal administration of avacincaptad pegol efficiently reduced GA growth whether the treated eye developed macular neovascularization or not. Avacincaptad pegol is a complement factor inhibitor that aims to reduce complement-mediated inflammation and tissue damage in the retina.

Dr. Olmos de Koo explained that clinical trials have shown that more patients develop neovascularization when treated for dry GA than if left untreated. This has raised the question among clinicians of whether the increased risk is a valid reason to avoid treatment. “This useful analysis tells us that there is still a rationale to continue treating GA, even while you’re concurrently treating the wet component with anti-VEGF therapies,” she said.

Another biomarker of GA growth is the position of the lesion at baseline. Daniel Muth, MD, an ophthalmology consultant at the Karolinska Institutet in Stockholm, Sweden, reported results from a long-term, retrospective analysis of fundus autofluorescence in patients with GA. His semiautomated artificial intelligence–based analysis showed that patients affected bilaterally, but whose fovea was not yet affected, exhibited a faster GA growth rate than patients with existing foveal involvement, with an approximate 15% risk for foveal involvement.

“Those patients whose atrophy has not yet affected the very center are the most likely to benefit from preventive therapy,” Dr. Olmos de Koo said. “Left untreated, a large proportion of them will develop atrophy that does affect their central vision — that’s their reading or facial recognition ability.”

“Potential predictors of rapid growth rates guide us clinically and allow patients to make more informed decisions about whether to pursue treatments that require frequent interventions,” Dr. Olmos de Koo said.

Forecasting which way the cost-benefit balance of treatment will tip for each patient is a complex decision-making process, she explained. “A patient is not a statistic, but these predictive studies are one important piece of the pie.”

The Promise of Gene Therapy

The one-and-done promise of gene therapy could rattle the field once again. Trials presented at the ASRS24 showed a drastic reduction (85%-95%) in the number of anti-VEGF and complement treatments needed after a gene therapy injection, improving patients’ vision while relieving them of the stress of monthly injections.

But researchers are still debating the optimal corticosteroid regimen that is required for reducing the inflammatory response associated with the administration of gene therapies, especially those that use viral vectors. The main controversy is whether systemic immunosuppression is necessary or if local therapies, such as topical and intravitreal administration, can suffice.

Results presented at the meeting suggest that local therapies alone can be effective, potentially reducing the need for systemic immunosuppression.

The LUNA trial evaluated the efficacy and tolerability of ixoberogene soroparvovec, a therapy that delivers an anti-VEGF gene into the eye. Investigators tested various prophylactic regimens, including local corticosteroids with and without oral prednisone. They found that local corticosteroid therapy alone effectively reduced inflammation.

Biopharma company 4DMT conducted the PRISM study, which examined a dual transgene therapy for neovascular AMD. Patients in this trial received a 20-week topical steroid taper. Only one patient (of 39) required a 6-week extension of steroid therapy. No patients experienced clinically significant intraocular inflammation, indicating that local corticosteroid therapy was effective in managing immune responses.

Currently, gene therapy clinical trials are designed for patients who have failed standard therapy or require frequent injections. “Once we figure out possible long-term side effects and how to deal with inflammation, [gene therapy] could reach many more patients,” Dr. Olmos de Koo said.
 

New Approaches Enter Pipeline

While gene therapy brings excitement to the field, it might not be for everyone, experts agreed at the ASRS24. New agents are being evaluated to offer a broader range of treatment options with longer-lasting efficacy. Results from early-phase trials presented at the meeting show favorable safety and efficacy signals.

“Finally, after a long time, we have a lot of exciting drugs for geographic atrophy in the pipeline that seem to be safe, with many studies also showing a functional outcome in addition to anatomical outcome,” Dr. Skondra told this news organization.

Current FDA-approved treatments for GA focus on inhibiting the humoral arm of the immune system through C3 and C5 inhibitors. However, a new approach targets both the humoral and cellular arms of the immune response by inhibiting macrophages that release pro-inflammatory cytokines. The goal is to convert these macrophages to a “resolution state,” potentially reducing the release of inflammatory cytokines and offering a more comprehensive treatment for wet and dry AMD, said Rishi Singh, MD, a retina surgeon at the Cole Eye Institute, Cleveland Clinic.

AVD-104, a sialic acid–coated nanoparticle developed by Aviceda Therapeutics, is a promising candidate in this approach. This 100-nm-diameter particle, which is only as heavy as 20 hydrogen atoms, is designed for better tissue penetration and has a pharmacokinetic profile lasting 3-4 months after a single intravitreal dose.

AVD-104 aims to repolarize macrophages into a resolution phenotype and to decrease complement overamplification through direct binding to complement factor H, which downregulates C3 production in immune cells. This dual-action approach could offer a more effective and long-lasting treatment option.

Dr. Singh, who presented the phase 2/3 SIGLEC clinical trial assessing AVD-104, said a single dose resulted in significantly slower rates of disease progression as early as 1 month post-treatment and a notable decrease in junctional zone hyper-autofluorescence.

In addition, about 40% of patients gained vision, an unexpected but pleasant surprise, Dr. Singh said. “This is a small study. I don’t want anyone to walk away with the conclusion that we’ve figured out how to improve visual acuity in GA. But it’s promising.”

Other researchers are tackling GA by focusing on therapies that aim to intervene before the complement system is activated.

ONL1204 is a novel agent designed to inhibit activation of the Fas receptor, a member of the tumor necrosis factor receptor superfamily that is upregulated in disease states and implicated in multiple cell death and inflammatory pathways.

Multiple preclinical models of AMD have shown that ONL1204 preserves retinal cells and dampens inflammation by blocking the Fas receptor. Phase 1 trial results presented at the meeting showed that ONL1204 was safe, with strong efficacy signals as early as 6 months after treatment initiation.

“We need to be cautiously optimistic,” Dr. Skondra said. “Larger studies will tell us if these signals are real. But it’s a very exciting time. I’m happy to see different mechanisms of action besides the complement because we can attack the disease from multiple fronts.”

Dr. Grewal declared interests with Eyepoint, Iveric Bio, Regeneron, Alumis, Apellis, DORC, and Genentech. Dr. Muth declared interests with Bayer, Canon, and Roche. Dr. Olmos de Koo declared interests with Alcon and Pixium Vision. Dr. Singh declared interests with Gyroscope, 4DMT, Aviceda, Eyepoint, Alcon, Bausch and Lomb, Novartis, and Regeneron. Dr. Skondra declared interests with Biogen, Iveric Bio, Allergan, and Trinity Health Science. Dr. Talcott declared interests with Bausch and Lomb, Eyepoint, Regeneron, REGENXBIO, Zeiss, Apellis, Genentech, Alimera, Outlook, and Iveric Bio.
 

A version of this article first appeared on Medscape.com.

Publications
Topics
Sections

STOCKHOLM — Current treatments for age-related macular degeneration (AMD) have proved effective and safe. However, these lifelong therapies involve frequent ocular injections. “It can be nerve-wracking for patients about to embark on this journey,” Lisa Olmos de Koo, MD, an ophthalmologist at the University of Washington Eye Institute at Harborview, Seattle, told this news organization.

At the American Society of Retina Specialists (ASRS) 2024 annual meeting, researchers from around the world presented results from clinical studies aiming at reducing the burden of AMD treatment by:

  • Identifying patients at a higher risk for degeneration and vision loss who will be more likely to respond to treatment
  • Developing gene therapies that promise to drastically reduce or eliminate the need for injections
  • Testing novel drugs with mechanisms of action that use different pathways than currently available medications, offering patients more options and longer-lasting treatments

“It’s exciting to see the broad range of novel approaches in AMD treatments,” Dimitra Skondra, MD, PhD, a retina specialist at the University of Chicago, told this news organization.
 

Whom to Treat

Anti–vascular endothelial growth factor (anti-VEGF) therapies shook the AMD treatment scene when they were introduced in the early 2000s. “It was incredible,” Dr. Olmos de Koo said. Patients with wet AMD could finally see their vision improve with each injection. “It was a great motivator to begin therapy.”

However, patients with the advanced form of dry AMD involving geographic atrophy (GA) have had less luck. Pegcetacoplan and avacincaptad pegol, the only US Food and Drug Administration (FDA)–approved treatments for GA, slow the progression of the disease but do not restore vision. In fact, vision continues to decline. “Patients want to understand if their condition is worsening and whether treatment is necessary,” Dr. Olmos de Koo said.

Researchers are developing tools to help clinicians identify lesions that are more likely to grow and reach the fovea, causing vision loss.

For example, Cleveland Clinic’s Katherine Talcott, MD, presented an analysis of the GATHER1 and GATHER2 clinical trials that showed that spectral domain optical coherence tomography can be used to examine the integrity of the ellipsoid zone for predicting GA growth and treatment response. The retina’s ellipsoid zone contains densely packed mitochondria within the inner segments of the photoreceptor cells and plays a critical role in visual function.

Dr. Talcott and her team found that more severe baseline damage of the ellipsoid zone was associated with a faster growth rate of GA.

Another analysis of the same trials, presented by Dilraj Grewal, MD, associate professor of ophthalmology, vitreoretinal surgery, and uveitis at Duke Eye Center, Durham, North Carolina, showed that intravitreal administration of avacincaptad pegol efficiently reduced GA growth whether the treated eye developed macular neovascularization or not. Avacincaptad pegol is a complement factor inhibitor that aims to reduce complement-mediated inflammation and tissue damage in the retina.

Dr. Olmos de Koo explained that clinical trials have shown that more patients develop neovascularization when treated for dry GA than they would if left untreated. This has raised the question among clinicians whether the increased risk is a valid reason to avoid treatment. “This useful analysis tells us that there is still a rationale to continue treating GA, even while you’re concurrently treating the wet component with anti-VEGF therapies,” she said.

Another biomarker of GA growth is the position of the lesion at baseline. Daniel Muth, MD, an ophthalmology consultant at the Karolinska Institutet in Stockholm, Sweden, reported the results from a long-term, retrospective analysis of fundus autofluorescence in patients with GA. His semiautomated artificial intelligence–based analysis showed that patients affected bilaterally, but whose fovea was not yet affected, exhibited a faster GA growth rate than fovea-involving patients, with an approximate 15% risk for fovea involvement.

“Those patients whose atrophy has not yet affected the very center are the most likely to benefit from preventive therapy,” Dr. Olmos de Koo said. “Left untreated, a large proportion of them will develop atrophy that does affect their central vision — that’s their reading or facial recognition ability.”

“Potential predictors of rapid growth rates guide us clinically and allow patients to make more informed decisions about whether to pursue treatments that require frequent interventions,” Dr. Olmos de Koo said.

Forecasting the side to which the cost-benefit balance of treatment will tip for each patient is a complex decision-making process, she explained. “A patient is not a statistic, but these predictive studies are one important piece of the pie.”
 

 

 

The Promise of Gene Therapy

The one-and-done promise of gene therapy could rattle the field once again. Trials presented at the ASRS24 showed a drastic reduction (from 85% to 95%) in the number of anti-VEGF and complement treatments needed following gene therapy injection, improving patient vision while relieving them from the stress of monthly injections.

But researchers are still debating the optimal corticosteroid regimen that is required for reducing the inflammatory response associated with the administration of gene therapies, especially those that use viral vectors. The main controversy is whether systemic immunosuppression is necessary or if local therapies, such as topical and intravitreal administration, can suffice.

Results presented at the meeting suggest that local therapies alone can be effective, potentially reducing the need for systemic immunosuppression.

The LUNA trial evaluated the efficacy and tolerability of ixoberogene soroparvovec, a therapy that delivers an anti-VEGF gene into the eye. Investigators tested various prophylactic regimens, including local corticosteroids with and without oral prednisone. They found that local corticosteroid therapy alone effectively reduced inflammation.

Biopharma company 4DMT conducted the PRISM study, which examined a dual transgene therapy for neovascular AMD. Patients in this trial received a 20-week topical steroid taper. Only one patient (of 39) required a 6-week extension of steroid therapy. No patients experienced clinically significant intraocular inflammation, indicating that local corticosteroid therapy was effective in managing immune responses.

Currently, gene therapy clinical trials are designed for patients who have failed standard therapy or require frequent injections. “Once we figure out possible long-term side effects and how to deal with inflammation, [gene therapy] could reach many more patients,” Dr. Olmos de Koo said.
 

New Approaches Enter Pipeline

While gene therapy brings excitement to the field, it might not be for everyone, experts agreed at the ASRS24. New agents are being evaluated to offer a broader range of treatment options with longer-lasting efficacy. Results from early-phase trials presented at the meeting show favorable safety and efficacy signals.

“Finally, after a long time, we have a lot of exciting drugs for geographic atrophy in the pipeline that seem to be safe, with many studies also showing a functional outcome in addition to anatomical outcome,” Dr. Skondra told this news organization.

Current FDA-approved treatments for GA focus on inhibiting the humoral arm of the immune system through C3 and C5 inhibitors. However, a new approach targets both the humoral and cellular arms of the immune response by inhibiting macrophages that release pro-inflammatory cytokines. The goal is to convert these macrophages to a “resolution state,” potentially reducing the release of inflammatory cytokines and offering a more comprehensive treatment for wet and dry AMD, said Rishi Singh, MD, a retina surgeon at the Cole Eye Institute, Cleveland Clinic.

AVD-104, a sialic acid–coated nanoparticle developed by Aviceda Therapeutics, is a promising candidate in this approach. The particle, roughly 100 nm in diameter, is designed for better tissue penetration and has a pharmacokinetic profile lasting 3-4 months after a single intravitreal dose.

AVD-104 aims to repolarize macrophages into a resolution phenotype and to decrease complement overamplification through direct binding to complement factor H, which downregulates C3 production in immune cells. This dual-action approach could offer a more effective and long-lasting treatment option.

Dr. Singh, who presented the phase 2/3 SIGLEC clinical trial assessing AVD-104, said a single dose resulted in significantly slower rates of disease progression as early as 1 month post-treatment and a notable decrease in junctional zone hyper-autofluorescence.

In addition, about 40% of patients gained vision, an unexpected but welcome finding, Dr. Singh said. “This is a small study. I don’t want anyone to walk away with the conclusion that we’ve figured out how to improve visual acuity in GA. But it’s promising.”

Other researchers are tackling GA by focusing on therapies that aim to intervene before the complement system is activated.

ONL1204 is a novel agent designed to inhibit activation of the FAS receptor, a member of the tumor necrosis factor receptor superfamily that is activated and upregulated in a disease state and is implicated in multiple cell death and inflammatory pathways.

Multiple preclinical models of AMD have shown that ONL1204 preserves retinal cells and inhibits inflammation by blocking the FAS receptor. Phase 1 trial results presented at the meeting indicated that ONL1204 was safe and showed strong efficacy signals as early as 6 months after treatment initiation.

“We need to be cautiously optimistic,” Dr. Skondra said. “Larger studies will tell us if these signals are real. But it’s a very exciting time. I’m happy to see different mechanisms of action besides the complement because we can attack the disease from multiple fronts.”

Dr. Grewal declared interests with Eyepoint, Iveric Bio, Regeneron, Alumis, Apellis, DORC, and Genentech. Dr. Muth declared interests with Bayer, Canon, and Roche. Dr. Olmos de Koo declared interests with Alcon and Pixium Vision. Dr. Singh declared interests with Gyroscope, 4DMT, Aviceda, Eyepoint, Alcon, Bausch and Lomb, Novartis, and Regeneron. Dr. Skondra declared interests with Biogen, Iveric Bio, Allergan, and Trinity Health Science. Dr. Talcott declared interests with Bausch and Lomb, Eyepoint, Regeneron, REGENXBIO, Zeiss, Apellis, Genentech, Alimera, Outlook, and Iveric Bio.
 

A version of this article first appeared on Medscape.com.


Too Much Coffee Linked to Accelerated Cognitive Decline

Article Type
Changed
Mon, 08/05/2024 - 09:24

PHILADELPHIA – Drinking more than three cups of coffee a day is linked to more rapid cognitive decline over time, results from a large study suggest.

Investigators examined the impact of different amounts of coffee and tea on fluid intelligence — a measure of cognitive functions including abstract reasoning, pattern recognition, and logical thinking.

“It’s the old adage that too much of anything isn’t good. It’s all about balance, so moderate coffee consumption is okay but too much is probably not recommended,” said study investigator Kelsey R. Sewell, PhD, Advent Health Research Institute, Orlando, Florida. 

The findings of the study were presented at the 2024 Alzheimer’s Association International Conference (AAIC).
 

One of the World’s Most Widely Consumed Beverages

Coffee is one of the most widely consumed beverages around the world. The beans contain a range of bioactive compounds, including caffeine, chlorogenic acid, and small amounts of vitamins and minerals.

Consistent evidence from observational and epidemiologic studies indicates that intake of both coffee and tea has beneficial effects on the risk for stroke, heart failure, cancers, diabetes, and Parkinson’s disease.

Several studies also suggest that coffee may reduce the risk for Alzheimer’s disease, said Dr. Sewell. However, there are limited longitudinal data on associations between coffee and tea intake and cognitive decline, particularly in distinct cognitive domains.

Dr. Sewell’s group previously published a study of cognitively unimpaired older adults that found greater coffee consumption was associated with slower cognitive decline and slower accumulation of brain beta-amyloid.

Their current study extends some of the prior findings and investigates the relationship between both coffee and tea intake and cognitive decline over time in a larger sample of older adults.

This new study included 8451 mostly female (60%) and White (97%) cognitively unimpaired adults older than 60 (mean age, 67.8 years) in the UK Biobank, a large-scale research resource containing in-depth, deidentified genetic and health information from half a million UK participants. Study subjects had a mean body mass index (BMI) of 26, and about 26% were apolipoprotein epsilon 4 (APOE e4) gene carriers.

Researchers divided coffee and tea consumption into three categories: high, moderate, and no consumption.

For daily coffee consumption, 18% reported drinking four or more cups (high consumption), 58% reported drinking one to three cups (moderate consumption), and 25% reported that they never drink coffee. For daily tea consumption, 47% reported drinking four or more cups (high consumption), 38% reported drinking one to three cups (moderate consumption), and 15% reported that they never drink tea.

The study assessed cognitive function at baseline and at least two additional patient visits. 

Researchers used linear mixed models to assess the relationships between coffee and tea intake and cognitive outcomes. The models adjusted for age, sex, Townsend deprivation index (reflecting socioeconomic status), ethnicity, APOE e4 status, and BMI.
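For readers curious about how such an analysis is typically set up, the sketch below specifies a linear mixed model with a random intercept per participant using the statsmodels library. The data file and column names (fluid_intelligence, coffee_group, visit_years, and so on) are entirely hypothetical; this is meant only to illustrate the general approach described above, not the investigators’ actual code.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format dataset: one row per participant per visit.
# Column names below are illustrative, not those of the UK Biobank study.
df = pd.read_csv("cognition_long.csv")

# Random intercept per participant; the coffee_group-by-time interaction
# tests whether the slope of cognitive change differs by consumption category.
model = smf.mixedlm(
    "fluid_intelligence ~ C(coffee_group) * visit_years"
    " + age + sex + townsend + ethnicity + apoe_e4 + bmi",
    data=df,
    groups=df["participant_id"],
)
result = model.fit()
print(result.summary())
```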
 

Steeper Decline 

Compared with high coffee consumption (four or more cups daily), people who never consumed coffee (beta, 0.06; standard error [SE], 0.02; P = .005) and those with moderate consumption (beta, 0.07; SE, 0.02; P < .001) had slower decline in fluid intelligence after an average of 8.83 years of follow-up.

“We can see that those with high coffee consumption showed the steepest decline in fluid intelligence across the follow up, compared to those with moderate coffee consumption and those never consuming coffee,” said Dr. Sewell, referring to illustrative graphs.

At the same time, “our data suggest that across this time period, moderate coffee consumption can serve as some kind of protective factor against cognitive decline,” she added.

For tea, there was a somewhat different pattern. People who never drank tea had a greater decline in fluid intelligence, compared with those who had moderate consumption (beta, 0.06; SE, 0.02; P = .009) or high consumption (beta, 0.06; SE, 0.02; P = .003).

Because this is an observational study, “we still need randomized controlled trials to better understand the neuroprotective mechanism of coffee and tea compounds,” said Dr. Sewell.

Responding later to a query from a meeting delegate about how moderate coffee drinking could be protective, Dr. Sewell said there are probably “different levels of mechanisms,” including at the molecular level (possibly involving amyloid toxicity) and the behavioral level (possibly involving sleep patterns).

Dr. Sewell said that she hopes this line of investigation will lead to new avenues of research in preventive strategies for Alzheimer’s disease. 

“We hope that coffee and tea intake could contribute to the development of a safe and inexpensive strategy for delaying the onset and reducing the incidence for Alzheimer’s disease.”

A limitation of the study is possible recall bias, because coffee and tea consumption were self-reported. However, this may not be much of an issue because coffee and tea consumption “is usually quite a habitual behavior,” said Dr. Sewell.

The study also had no data on midlife coffee or tea consumption and did not compare the effect of different preparation methods or types of coffee and tea — for example, green tea versus black tea. 

When asked if the study controlled for smoking, Dr. Sewell said it didn’t but added that it would be interesting to explore its impact on cognition.

Dr. Sewell reported no relevant conflicts of interest.

A version of this article first appeared on Medscape.com.


Almost 50% of Global Dementia Cases May Be Preventable

Article Type
Changed
Thu, 08/01/2024 - 12:11

Nearly half of dementia cases worldwide could theoretically be prevented or delayed by eliminating 14 modifiable risk factors during an individual’s lifetime, according to a report from the Lancet Commission on dementia prevention, intervention, and care.

The report adds two new modifiable risk factors for dementia — high cholesterol and vision loss — to the 12 risk factors identified in the 2020 Lancet Commission report, which were linked to about 40% of all dementia cases. 

The original Lancet Commission report, published in 2017, identified nine modifiable risk factors that were estimated to be responsible for one third of dementia cases. 

“Our new report reveals that there is much more that can and should be done to reduce the risk of dementia. It’s never too early or too late to act, with opportunities to make an impact at any stage of life,” lead author Gill Livingston, MD, from University College London in England, said in a statement. 

The 57-page report was published online in The Lancet Neurology to coincide with its presentation at the 2024 Alzheimer’s Association International Conference (AAIC).
 

‘Compelling’ New Evidence 

The 12 risk factors cited in the 2020 report are lower levels of education, hearing loss, hypertension, smoking, obesity, depression, physical inactivity, diabetes, excessive alcohol consumption, traumatic brain injury (TBI), air pollution, and social isolation. 

According to the authors of the current report, there is “new compelling evidence” that untreated vision loss and elevated low-density lipoprotein (LDL) cholesterol are also risk factors for dementia.

These two added risk factors are associated with 9% of all dementia cases — with an estimated 7% of cases caused by high LDL cholesterol from about age 40 years, and 2% of cases caused by untreated vision loss in later life, the authors said.

Out of all 14 risk factors, those tied to the greatest proportion of dementia in the global population are hearing impairment and high LDL cholesterol (7% each), along with less education in early life and social isolation in later life (5% each), the report estimates.
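These percentages are population attributable fractions: estimates of the share of cases that would not occur if a given risk factor were eliminated, based on its prevalence and relative risk. The commission’s published method additionally weights the fractions for overlap (communality) between risk factors, but the unadjusted Levin formula below, shown with made-up numbers, conveys the basic arithmetic.

```python
def population_attributable_fraction(prevalence, relative_risk):
    """Unadjusted Levin formula: PAF = p(RR - 1) / (1 + p(RR - 1))."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Made-up illustration: a risk factor present in 30% of the population that
# roughly doubles dementia risk would account for about 23% of cases.
paf = population_attributable_fraction(prevalence=0.30, relative_risk=2.0)
print(f"PAF ~ {paf:.0%}")
```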

The new report also outlines 13 recommendations aimed at individuals and governments to help guard against dementia. They include preventing and treating hearing loss, vision loss, and depression; being cognitively active throughout life; using head protection in contact sports; reducing vascular risk factors (high cholesterol, diabetes, obesity, hypertension); improving air quality; and providing supportive community environments to increase social contact. 

Tara Spires-Jones, PhD, president of the British Neuroscience Association, emphasized that, while this research doesn’t directly link specific factors to dementia, it supports evidence that a healthy lifestyle — encompassing education, social activities, exercise, cognitive engagement, and avoiding head injuries and harmful factors for heart and lung health — can enhance brain resilience and prevent dementia.

In an interview, Heather M. Snyder, PhD, senior vice president of medical and scientific relations, Alzheimer’s Association, said: “Our brains are complex and what happens throughout our lives may increase or decrease our risk for dementia as we age. Protecting brain health as we age requires a comprehensive approach that includes discussions on diet, exercise, heart health, hearing, and vision.”

Also weighing in on the new report, Shaheen Lakhan, MD, PhD, neurologist and researcher based in Miami, Florida, said the addition of high cholesterol is “particularly noteworthy as it reinforces the intricate connection between vascular health and brain health — a link we’ve long suspected but can now target more effectively.”

As for vision loss, “it’s not just a matter of seeing clearly; it’s a matter of thinking clearly. Untreated vision loss can lead to social isolation, reduced physical activity, and cognitive decline,” said Dr. Lakhan. 
 

Dementia Is Not Inevitable

In his view, “the potential to prevent or delay nearly half of dementia cases by addressing these risk factors is nothing short of revolutionary. It shifts our perspective from viewing dementia as an inevitable part of aging to seeing it as a condition we can actively work to prevent,” Dr. Lakhan added.

He said the report’s emphasis on health equity is also important. 

“Dementia risk factors disproportionately affect socioeconomically disadvantaged groups and low- and middle-income countries. Addressing these disparities isn’t just a matter of fairness in the fight against dementia, equality in prevention is as important as equality in treatment,” Dr. Lakhan commented.

While the report offers hope, it also presents a challenge, he said. 

Implementing the recommended preventive measures requires a “coordinated effort from individuals, healthcare systems, and policymakers. The potential benefits, both in terms of quality of life and economic savings, make this effort not just worthwhile but imperative. Preventing dementia is not just a medical imperative — it’s an economic and humanitarian one,” Dr. Lakhan said. 

Masud Husain, PhD, with the University of Oxford in England, agreed. 

The conclusions in this report are “very important for all of us, but particularly for health policy makers and government,” he said. 

“If we did simple things well such as screening for some of the factors identified in this report, with adequate resources to perform this, we have the potential to prevent dementia on a national scale. This would be far more cost effective than developing high-tech treatments, which so far have been disappointing in their impacts on people with established dementia,” Dr. Husain said. 

The Lancet Commission was funded by University College London, Alzheimer’s Society, Alzheimer’s Research UK, and the Economic and Social Research Council. A complete list of author disclosures is available with the original article. Dr. Snyder, Dr. Lakhan, Dr. Husain, and Dr. Spires-Jones have no relevant disclosures.

A version of this article appeared on Medscape.com.


Promising New Data Support GLP-1s for Dementia Prevention

Article Type
Changed
Wed, 07/31/2024 - 13:15

PHILADELPHIA – A new study supports the potential to repurpose glucagon-like peptide 1 (GLP-1) receptor agonists, used to treat type 2 diabetes and obesity, for dementia prevention.

In the phase 2b ELAD clinical trial, adults with early-stage Alzheimer’s disease taking the GLP-1 receptor agonist liraglutide exhibited slower decline in memory and thinking and experienced less brain atrophy over 12 months, compared with placebo.

“The slower loss of brain volume suggests liraglutide protects the brain, much like statins protect the heart,” study chief Paul Edison, MD, PhD, with Imperial College London, London, England, said in a statement.

“While further research is needed, liraglutide may work through various mechanisms, such as reducing inflammation in the brain, lowering insulin resistance and the toxic effects of Alzheimer’s biomarkers amyloid beta and tau, and improving how the brain’s nerve cells communicate,” Dr. Edison said.

He presented the study results at the 2024 Alzheimer’s Association International Conference (AAIC).

Brain Benefits

Liraglutide has previously demonstrated promising neuroprotective effects in animal models of Alzheimer’s disease and in epidemiologic studies. 

In ELAD, 204 patients with mild to moderate Alzheimer’s disease were randomly allocated (1:1) to a daily subcutaneous injection of up to 1.8 mg of liraglutide or placebo for 12 months; 80 patients in the liraglutide group and 89 in the placebo group completed the study. 

Brain MRI was performed at baseline and at 12 months, along with neuropsychometric evaluation and 18F-fludeoxyglucose PET. 

The study’s primary endpoint — change in the cerebral glucose metabolic rate in the cortical regions of the brain (hippocampus, medial temporal lobe, and posterior cingulate) — was not met. 

However, patients taking liraglutide experienced a significant slowing of cognitive decline compared with the placebo group (P = .01), a key secondary outcome calculated as a composite score of 18 different tests of memory, comprehension, language, and spatial orientation. 
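Composite cognitive endpoints of this kind are typically built by standardizing each test against a reference distribution and averaging the resulting z-scores, so that tests scored on very different scales contribute equally. The sketch below illustrates that general idea with hypothetical values; the ELAD trial’s exact scoring rules were not detailed in the presentation, so this should not be read as the investigators’ method.

```python
import numpy as np

def composite_z_score(scores, ref_mean, ref_sd, higher_is_better):
    """Average z-score across cognitive tests measured on different scales.

    scores, ref_mean, ref_sd, higher_is_better: one entry per test.
    Tests where a lower raw score means better performance are sign-flipped
    so that a higher composite value always indicates better cognition.
    """
    z = (np.asarray(scores, dtype=float) - np.asarray(ref_mean)) / np.asarray(ref_sd)
    signs = np.where(np.asarray(higher_is_better), 1.0, -1.0)
    return float(np.mean(z * signs))

# Hypothetical three-test example (a real composite would combine all 18 tests).
print(composite_z_score(scores=[24, 41, 130],
                        ref_mean=[26, 38, 120],
                        ref_sd=[4, 6, 25],
                        higher_is_better=[True, True, False]))
```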

Although the study was not powered to assess cognitive changes, adults taking liraglutide had an 18% slower decline in cognitive function over 12 months, compared with those on placebo, Dr. Edison reported. 

In addition, patients treated with liraglutide had nearly 50% less volume loss in several areas of the brain involved in memory, language, and decision-making, including frontal, temporal, parietal, and total gray matter, as measured by MRI. 

Liraglutide daily subcutaneous injections were safe and well tolerated in patients with Alzheimer’s disease, Dr. Edison reported. There were 25 serious side effects — 18 in the placebo group and 7 in the liraglutide group — and most were considered unlikely to be related to the study treatment. There were no deaths. 
 

Promising, Preliminary

This study shows a positive effect of liraglutide on the brain in terms of “slowing down of brain atrophy and slowing down the rate of cognitive decline,” said Howard Fillit, MD, founding executive director of the Alzheimer’s Drug Discovery Foundation, who wasn’t involved in the study.

Heather Snyder, PhD, senior vice president of medical and scientific relations at the Alzheimer’s Association, said it’s “interesting” to see slowing of brain volume loss and some cognitive benefit “especially as the study was not powered necessarily to see some of those changes. The fact that they did see these changes in this small study provides a window into what may happen, but we certainly need larger phase 3 studies.”

In a statement from the UK nonprofit Science Media Centre, Tara Spires-Jones, PhD, president of the British Neuroscience Association and group leader at the UK Dementia Research Institute, called the data “promising.”

“There are clear links from strong data in the field between vascular risk factors including diabetes and obesity being associated with increased risk of dementia. The GLP-1 drug should help reduce these risk factors as well as potentially directly protecting brain cells,” Dr. Spires-Jones said. 

However, she said “more research in bigger trials is needed to confirm whether this type of treatment will be effective in people with Alzheimer’s disease.”

Stephen Evans, MSc, emeritus professor, London School of Hygiene and Tropical Medicine, noted that the repurposing of drugs is “an important avenue of research but there is a lot of uncertainty here.”

He cautioned that the “50% brain volume change may not translate to important cognitive effects, and reporting only on those who completed the full 52 weeks of treatment could bring bias into the results. It sounds like it is worth pursuing a larger trial, but these results cannot demonstrate that liraglutide can protect against dementia.”

The ongoing phase 3 EVOKE trial is investigating the effects of the GLP-1 receptor agonist semaglutide in early Alzheimer’s disease.

Funding for the study was provided by Alzheimer’s Society UK, Alzheimer’s Drug Discovery Foundation, Novo Nordisk, John and Lucille Van Geest Foundation, and the National Institute for Health and Care Research Biomedical Research Centre. Dr. Edison, Dr. Fillit, Dr. Snyder, Mr. Evans, and Dr. Spires-Jones had no relevant conflicts of interest.

A version of this article first appeared on Medscape.com.

Red Meat Tied to Increased Dementia Risk

PHILADELPHIA – Higher intake of processed red meat, including bacon, hot dogs, and sausages, is associated with an elevated dementia risk, preliminary research shows.

Study participants who consumed 0.25 or more servings of processed meat per day, or roughly two servings per week, had a 15% higher risk for dementia, compared with those who consumed less than 0.10 serving per day, which is about three servings per month. 

“Our study found a higher intake of red meat — particularly processed red meat — was associated with a higher risk of developing dementia, as well as worse cognition,” said study author Yuhan Li, MHS, research assistant, Channing Division of Network Medicine, Brigham and Women’s Hospital, Boston, Massachusetts.

However, the study also showed that replacing processed red meat with nuts and legumes could potentially lower this increased risk.

The findings were presented at the 2024 Alzheimer’s Association International Conference (AAIC).

Inconsistent Research 

Previous studies have shown an inconsistent association between red meat intake and cognitive health.

To assess the relationship between diet and dementia, the researchers used data from the Nurses’ Health Study, which began recruiting female registered nurses aged 30-55 years in 1976, and the Health Professionals Follow-Up Study, which began recruiting male health professionals aged 40-75 in 1986.

They assessed processed red meat intake by validated semi-quantitative food frequency questionnaires administered every 2-4 years. Participants were asked how often they consumed a serving of processed red meat.

Investigators also assessed intake of unprocessed red meat, including beef, pork, or lamb as a main dish, in a sandwich or hamburger, or in a mixed dish. 

The investigators also looked at participants’ intake of nuts and legumes.

Dementia outcome was a composite endpoint of self-reported dementia and dementia-related death. “Specifically, participants reported a physician diagnosis of Alzheimer’s disease or other forms of dementia by questionnaire. Deaths were identified through state vital statistics records, the National Death Index, family reports, and the postal system,” said Ms. Li.
 

Three Cognitive Outcomes

Researchers examined three outcomes: dementia, subjective cognitive decline, and objective cognitive function. For dementia, they ascertained incident cases among 87,424 Nurses’ Health Study participants without Parkinson’s disease or baseline dementia, stroke, or cancer. 

They longitudinally collected information on subjective cognitive decline from 33,908 Nurses’ Health Study participants and 10,058 participants in the Health Professionals Follow-Up Study.

Cognitive function was assessed using the Telephone Interview for Cognitive Status (1995-2008) in a subset of 17,458 Nurses’ Health Study participants.

Over a follow-up of 38 years (1980-2018), there were 6856 dementia cases in the Nurses’ Health Study. Participants with processed red meat intake of 0.25 or more serving/day, compared with less than 0.10 serving/day, had a 15% higher risk for dementia (hazard ratio [HR], 1.15; 95% CI, 1.08-1.23; P < .001). 

In addition to an increased risk for dementia, intake of processed red meat was associated with accelerated cognitive aging in global cognition (1.61 years per 1–serving/day increment; 95% CI, 0.20-3.03) and verbal memory (1.69 years per 1–serving/day increment; 95% CI, 0.13-3.25; both P = .03).

Participants with processed red meat intake of 0.25 or more serving/day had a 14% higher likelihood of subjective cognitive decline, compared with those with intake less than 0.10 serving/day (odds ratio [OR], 1.14; 95% CI, 1.04-1.24; P = .004). 

For unprocessed red meat, consuming 1.00 or more serving/day versus less than 0.50 serving/day was associated with a 16% higher likelihood of subjective cognitive decline (OR, 1.16; 95% CI, 1.04-1.30; P = .02). 

Substitution Analysis

Researchers modeled the effects of replacing 1 serving/day of processed red meat with 1 serving/day of nuts and legumes on cognitive outcomes. They did this by treating food intakes as continuous variables and calculating the differences in coefficients of the two food items.
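As a rough sketch of that kind of coefficient-difference substitution analysis, the snippet below fits a Cox model on simulated data with the lifelines package; the variable names, the simulated intakes, and the single age covariate are illustrative assumptions, not the investigators' actual specification, which adjusted for many more factors.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 5000

# Simulated servings/day and follow-up data (illustrative only)
df = pd.DataFrame({
    "processed_meat": rng.gamma(2.0, 0.15, n),  # servings/day
    "nuts_legumes": rng.gamma(2.0, 0.30, n),    # servings/day
    "age": rng.normal(55, 7, n),                # placeholder covariate
})
risk = 0.15 * df["processed_meat"] - 0.25 * df["nuts_legumes"] + 0.05 * (df["age"] - 55)
df["years"] = rng.exponential(20 * np.exp(-risk))   # time to dementia or censoring
df["dementia"] = (df["years"] < 38).astype(int)     # event observed within follow-up
df["years"] = df["years"].clip(upper=38)

# Both foods enter as continuous terms; replacing 1 serving/day of processed
# red meat with nuts/legumes corresponds to the difference of their coefficients.
cph = CoxPHFitter().fit(df, duration_col="years", event_col="dementia")
beta = cph.params_
hr_substitution = np.exp(beta["nuts_legumes"] - beta["processed_meat"])
print(f"HR for substituting nuts/legumes for processed red meat: {hr_substitution:.2f}")
```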

They found that substituting legumes and nuts was associated with a 23% lower risk for dementia (HR, 0.77; 95% CI, 0.69-0.86), 1.37 fewer years of cognitive aging (95% CI, –2.49 to –0.25), and 20% lower odds of subjective cognitive decline (OR, 0.80; 95% CI, 0.69-0.92).

The research cannot determine whether it’s the processing method itself or the type of red meat that affects cognition, Ms. Li cautioned. 

“Our study is an epidemiologic study, not a biological mechanism study, but based on our findings, red meat may be related to worse cognition, and processed red meat may add additional risk,” she said. 

She also noted that because the study focused solely on red meat, it cannot determine the potential impact of other processed meats on cognition.

Although the study doesn’t address a possible mechanism linking processed red meat with cognition, Ms. Li said it’s possible such meats have high levels of relatively harmful substances, such as nitrites, N-nitroso compounds, and sodium, and that “these carry the additional risk to brain health.”

There are currently no specific guidelines regarding the “safe” amount of processed meat consumption specifically related to cognition, she said.

The study is important because of its large sample size, long follow-up period, and inclusion of repeated measurements of diet, the investigators noted. In addition, researchers assessed both processed and unprocessed red meat and evaluated multiple cognitive outcomes.

The investigators plan to assess the association between other modifiable factors and cognitive health.
 

Experts Weigh In 

In a comment, Claire Sexton, DPhil, senior director of scientific programs and outreach at the Alzheimer’s Association, agreed past studies on the topic have been “mixed,” with only some studies reporting links between cognition or dementia and processed red meat. 

Another unique aspect of the study, said Dr. Sexton, was the replacement analysis showing the brain benefits of eating nuts and legumes in place of processed red meat. “So, it’s not just suggesting to people what not to do, but also what they can be doing instead.”

That’s why this large study of more than 130,000 adults, which tracked individuals for close to 40 years in some cases, “is so valuable,” she added.

In a release from the Science Media Centre in the United Kingdom, several other experts commented on the study. Among them, Kevin McConway, PhD, emeritus professor of applied statistics at the Open University, Milton Keynes, England, said that “it’s pretty well impossible to get a clear message from the information that is available so far about this research. It is a conference paper, and all we have seen so far is a press release, a brief summary of the research, and a diagram. There isn’t a detailed, peer-reviewed research report, not yet anyway. Putting out limited information like this isn’t the right way to report science.”

Dr. McConway also noted that the observational study recorded participants’ diets and dementia diagnoses over several years without assigning specific diets. Those who ate more processed red meat had higher rates of dementia and cognitive decline. However, it’s unclear whether these differences are caused by red meat consumption or by other factors, such as other aspects of diet, age, ethnicity, or location.

Researchers typically adjust for these factors, but the available information doesn’t specify what adjustments were made or their impact, he noted, and without detailed data, it’s impossible to evaluate the study’s quality. Although eating more red processed meat might increase dementia risk, more research is needed to confirm this, Dr. McConway added. 

Also commenting, Sebastian Walsh, a National Institute for Health and Care Research doctoral fellow who researches population-level approaches to dementia risk reduction at University of Cambridge, Cambridge, England, said that without seeing the full paper, it’s difficult to know exactly what to make of the study’s findings. 

“On the surface, this is a large and long study. But it isn’t clear how the analysis was done — specifically what other factors were taken into account when looking at this apparent relationship between red meat and dementia.

“Despite a lot of research looking at specific foods and different diseases, the basic public health advice that eating a healthy, balanced diet is good for health is essentially unchanged. Most people know and accept this. What is most important is to find ways of supporting people, particularly those from poorer backgrounds, to follow this advice and address the obesity epidemic,” said Mr. Walsh. 

The study was funded by a National Institutes of Health research grant. Ms. Li reports no relevant conflicts of interest. Dr. Sexton, Dr. McConway, and Mr. Walsh report no relevant disclosures.

A version of this article first appeared on Medscape.com.

Tau Blood Test Flags Preclinical Alzheimer’s Disease

Plasma phosphorylated (p)-tau217 testing can help identify preclinical Alzheimer’s disease, which could aid clinical trial recruitment.

Recruiting preclinical Alzheimer’s disease participants for clinical research is challenging, owing to a lack of symptoms and the high cost and invasiveness of cerebrospinal fluid (CSF) tests and brain amyloid PET imaging.

Plasma p-tau217 has consistently shown high performance in detecting Alzheimer’s disease pathology in patients with mild cognitive impairment and dementia, but there has been concern that it may have lower accuracy in cognitively unimpaired adults, said lead investigator Gemma Salvadó, PhD, with the Clinical Memory Research Unit, Lund University, Lund, Sweden.

However, “our study shows that plasma p-tau217, alone or in combination with invasive tests, can be used accurately to assess amyloid positivity in cognitively unimpaired participants, to streamline the inclusion of these participants in preventive clinical trials,” she said. 

The findings were presented at the 2024 Alzheimer’s Association International Conference (AAIC).
 

Correlation to CSF, PET Amyloid Status

The investigators assessed the clinical accuracy of plasma p-tau217 as a prescreening method in 2917 cognitively unimpaired adults (mean age, 67 years; 57% women) across 12 independent cohorts who had available plasma p-tau217 and amyloid beta PET imaging or CSF samples. 

They found that plasma p-tau217 levels correlated with amyloid beta CSF status and PET load. 

As a standalone test, plasma p-tau217 identified amyloid beta PET–positive cognitively normal adults with a positive predictive value of 80% or greater. 

The positive predictive value increased to 95% or greater when amyloid beta CSF or PET was used to confirm a positive plasma p-tau217 result. 

As a first step, plasma p-tau217 could significantly reduce the number of invasive tests performed because only individuals with a positive p-tau217 test would go on to PET imaging or CSF sampling, Dr. Salvadó told conference attendees. This may reduce trial recruitment costs and get more patients enrolled. 
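The gain from adding a confirmatory test is essentially Bayes' rule: the positive predictive value of the first test becomes the prior for the second. A back-of-the-envelope sketch follows; the sensitivity, specificity, and prevalence figures are assumptions chosen for illustration, not values reported by Dr. Salvadó's group.

```python
def ppv(sensitivity: float, specificity: float, prevalence: float) -> float:
    """Positive predictive value from Bayes' rule."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Step 1: plasma p-tau217 as a prescreen (assumed performance and prevalence)
amyloid_prevalence = 0.30
ppv_blood = ppv(sensitivity=0.90, specificity=0.90, prevalence=amyloid_prevalence)

# Step 2: CSF or PET confirmation applied only to blood-test positives;
# the prior for this step is the PPV from step 1.
ppv_confirmed = ppv(sensitivity=0.95, specificity=0.95, prevalence=ppv_blood)

print(f"PPV after blood test alone:  {ppv_blood:.0%}")      # roughly 80% with these assumptions
print(f"PPV after confirmation step: {ppv_confirmed:.0%}")   # well above 95%
```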

Although the study had a large sample size, “these results should be replicated in independent studies, [in] more heterogeneous participants, and coming from the clinical setting instead of observational studies to avoid possible bias,” Dr. Salvadó added. 
 

A New Diagnostic Era 

Commenting on the research, Heather Snyder, PhD, vice president of medical and scientific relations at the Alzheimer’s Association, said what’s particularly interesting about this study is that the researchers examined multiple cohorts of cognitively unimpaired individuals and “consistently” found that plasma p-tau217 could identify individuals with amyloid-positive PET and CSF with high accuracy. 

“This may reduce the need for more expensive and more invasive scans or lumbar punctures to confirm if an individual has the biology,” Dr. Snyder said. 

“Blood tests are revolutionizing Alzheimer’s detection, diagnosis and ultimately treatment,” added Howard Fillit, MD, cofounder and chief science officer of the Alzheimer’s Drug Discovery Foundation. 

He predicted that blood tests will “soon replace more invasive and costly PET scans as the standard of care and serve as the first line of defense in diagnosing the disease.”

“After many years of research, the field is in a place where we have novel biomarkers and diagnostics to support a diagnosis,” much the way cholesterol is used to help detect heart disease, said Dr. Fillit. 

“The diagnostic framework for Alzheimer’s — an incredibly complex disease — is constantly evolving. As we usher in the new era of care, we are moving closer to the day when blood tests will be complemented by digital tools to provide precise and timely diagnoses and risk assessments backed by numerous data points, complementing existing cognitive tests,” he added. 

Funding for the study was provided by the Alzheimer’s Association, the European Union’s Horizon 2020 Research and Innovation Program, Alzheimerfonden, and Strategic Research Area MultiPark. Dr. Salvadó, Dr. Snyder, and Dr. Fillit have no relevant disclosures.

A version of this article appeared on Medscape.com.

Blood Biomarkers Are Highly Accurate in Diagnosing Alzheimer’s Disease

Amyloid beta (Abeta) and tau protein blood biomarkers are highly accurate in identifying Alzheimer’s disease in patients with cognitive symptoms attending primary and secondary care clinics, new research showed.

Accurate early diagnosis of Alzheimer’s disease is important because two monoclonal antibodies, donanemab (Kisunla) and lecanemab (Leqembi), are now approved by the Food and Drug Administration (FDA) for early-stage Alzheimer’s disease. However, the use of these agents requires amyloid confirmation.

A key finding of the study was that, after completing standard clinical evaluations and before seeing results of the blood test or other Alzheimer’s disease biomarkers, primary care physicians had a diagnostic accuracy of 61% and dementia specialists an accuracy of 73%. The blood test used in the study, by contrast, had an accuracy of 91% for correctly classifying clinical, biomarker-verified Alzheimer’s disease.

“This underscores the potential improvement in diagnostic accuracy, especially in primary care, when implementing such a blood test,” said study investigator Sebastian Palmqvist, MD, PhD, associate professor of neurology at Lund University, Lund, and a consultant at Skåne University Hospital, Malmö, Sweden. “It also highlights the challenges in accurately identifying Alzheimer’s disease based solely on clinical evaluation and cognitive testing, even for specialists.”

The findings were presented at the 2024 Alzheimer’s Association International Conference (AAIC) and simultaneously published online in JAMA.

The study included two cohorts from Swedish primary and secondary care clinics in which all plasma samples were analyzed together, at a single time point, in one batch.

It also included two cohorts from Swedish primary and secondary care clinics where the plasma samples were analyzed prospectively (biweekly) in batches throughout the enrollment period, which more closely resembles clinical practice.

Primary care physicians and dementia specialists documented whether they believed their patients had Alzheimer’s disease pathology, basing the diagnoses on the standard evaluation that includes clinical examination, cognitive testing, and a CT scan prior to seeing any Alzheimer’s disease biomarker results.

They reported their certainty of the presence of Alzheimer’s disease pathology on a scale from 0 (not at all certain) to 10 (completely certain).

Plasma analyses were performed by personnel blinded to all clinical or biomarker data. Mass spectrometry assays were used to analyze Abeta42, Abeta40, phosphorylated tau 217 (p-tau217), and non–p-tau217.

Biomarkers used in the study included the percentage of plasma p-tau217, which is the ratio of p-tau217 relative to non–p-tau217, and the Abeta42 to Abeta40 ratio (the amyloid probability score 2 [APS2]). Researchers determined p-tau217 alone and when combined with the APS2.
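
For readers who want to see how these ratio-based markers fit together, the short Python sketch below computes the p-tau217 percentage and the Abeta42 to Abeta40 ratio and combines them through a hypothetical logistic model. The APS2 weighting used by the actual assay is not described in the article, so the coefficients and plasma concentrations here are illustrative placeholders only.

```python
import math

def ptau217_percentage(p_tau217: float, non_p_tau217: float) -> float:
    """Percentage of p-tau217: the ratio of p-tau217 to non-p-tau217, expressed as a percent."""
    return 100.0 * p_tau217 / non_p_tau217

def abeta_ratio(abeta42: float, abeta40: float) -> float:
    """Plasma Abeta42 to Abeta40 ratio; lower values point toward amyloid pathology."""
    return abeta42 / abeta40

def amyloid_probability_score(ptau_pct: float, ab_ratio: float,
                              b0: float = -1.0, b1: float = 0.8, b2: float = -30.0) -> float:
    """HYPOTHETICAL logistic combination of the two ratios on a 0-1 scale.
    The real APS2 weighting is not given in the article; these coefficients are placeholders."""
    z = b0 + b1 * ptau_pct + b2 * ab_ratio
    return 1.0 / (1.0 + math.exp(-z))

# Made-up plasma concentrations (pg/mL), for illustration only
pct = ptau217_percentage(p_tau217=0.5, non_p_tau217=12.0)
ratio = abeta_ratio(abeta42=35.0, abeta40=420.0)
print(round(pct, 2), round(ratio, 3), round(amyloid_probability_score(pct, ratio), 2))
```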

The study included 1213 patients with cognitive symptoms — mean age 74.2 years and 48% women. Researchers applied biomarker cutoff values to the primary care cohort (n = 307) and the secondary care cohort (n = 300) and then evaluated the blood test prospectively in the primary care cohort (n = 208) and the secondary care cohort (n = 398).

The blood biomarker cutoff value was set at 90% specificity for Alzheimer’s disease pathology (the 1 cutoff-value approach). A 2 cutoff-value approach (using 1 upper and 1 lower cutoff value) was also used with values corresponding to 95% sensitivity and 95% specificity.
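
The difference between the one-cutoff and two-cutoff approaches can be sketched as follows: a single cutoff is chosen at 90% specificity, while the two-cutoff approach pairs a lower cutoff at 95% sensitivity with an upper cutoff at 95% specificity and treats scores between them as intermediate. The simulated score distributions below are illustrative and are not the study’s data.

```python
import numpy as np

def cutoff_at_specificity(neg_scores: np.ndarray, specificity: float) -> float:
    """Score above which only (1 - specificity) of biomarker-negative patients fall."""
    return float(np.quantile(neg_scores, specificity))

def cutoff_at_sensitivity(pos_scores: np.ndarray, sensitivity: float) -> float:
    """Score below which only (1 - sensitivity) of biomarker-positive patients fall."""
    return float(np.quantile(pos_scores, 1.0 - sensitivity))

def classify_two_cutoffs(score: float, lower: float, upper: float) -> str:
    """Two-cutoff approach: negative below the lower cutoff, positive above the upper,
    and intermediate (needing further work-up) in between."""
    if score < lower:
        return "negative"
    if score > upper:
        return "positive"
    return "intermediate"

# Simulated scores standing in for a blood-based marker; not the study's data
rng = np.random.default_rng(0)
neg = rng.normal(0.35, 0.15, 500)   # patients without AD pathology
pos = rng.normal(0.65, 0.15, 500)   # patients with AD pathology

single_cutoff = cutoff_at_specificity(neg, 0.90)   # one-cutoff approach (90% specificity)
lower = cutoff_at_sensitivity(pos, 0.95)           # two-cutoff approach: rule-out threshold
upper = cutoff_at_specificity(neg, 0.95)           # two-cutoff approach: rule-in threshold
print(round(single_cutoff, 2), round(lower, 2), round(upper, 2),
      classify_two_cutoffs(0.55, lower, upper))
```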

The primary outcome was presence of Alzheimer’s disease pathology. A positive finding of the Abeta biomarker was defined according to the FDA-approved cutoff value (≤ 0.072). A positive finding of the tau biomarker was defined as a p-tau217 level > 11.42 pg/mL in cerebrospinal fluid.

Researchers calculated the positive predictive value (PPV), negative predictive value (NPV), and diagnostic accuracy, as well as area under the curve (AUC) values.
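
These metrics follow directly from a 2 x 2 confusion matrix and from the ranking of test scores; the sketch below shows the standard calculations using made-up counts and scores rather than the study’s results.

```python
import numpy as np

def diagnostic_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """PPV, NPV, and overall diagnostic accuracy from a 2x2 confusion matrix."""
    return {
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

def auc_rank(scores_pos: np.ndarray, scores_neg: np.ndarray) -> float:
    """AUC via the Mann-Whitney statistic: the probability that a randomly chosen
    biomarker-positive patient scores higher than a biomarker-negative one."""
    diffs = scores_pos[:, None] - scores_neg[None, :]
    return float((np.sum(diffs > 0) + 0.5 * np.sum(diffs == 0)) / diffs.size)

# Illustrative counts and scores, not the study's results
print(diagnostic_metrics(tp=180, fp=6, tn=100, fn=14))
rng = np.random.default_rng(1)
print(round(auc_rank(rng.normal(0.8, 0.1, 300), rng.normal(0.3, 0.1, 300)), 3))
```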

Accuracy in Specialty Versus Primary Care

When the plasma samples were analyzed in a single batch in the primary care cohort, the AUC was 0.97 when the APS2 was used. In the secondary care cohort, the AUC was 0.96 when the APS2 was used.

When plasma samples were analyzed prospectively (biweekly) in the primary care cohort, the AUC was 0.96 when the APS2 was used. In the secondary care cohort, the AUC was 0.97 when the APS2 was used.

The 2 cutoff-value approach achieved PPVs of 97%-99% in patients with cognitive impairment, which is the target population of currently available antiamyloid treatments.

Although NPVs were slightly lower in these patients (87%-92% using the APS2), “we argue that a very high positive predictive value is probably more important in diagnosing patients as having Alzheimer’s disease, especially before initiating costly and burdensome antiamyloid treatment,” the investigators noted.

The PPVs were less than optimal for accurate identification of Alzheimer’s disease pathology in patients with subjective cognitive decline regardless of the cutoff-value approach used. The researchers pointed out that this could be a disadvantage for clinical trials that include patients with presymptomatic Alzheimer’s disease but not in clinical practice because there are no clinical criteria for diagnosing Alzheimer’s disease at the subjective cognitive decline stage.

The NPVs were higher in patients with subjective cognitive decline (91%-94% for the APS2 or percentage of p-tau217 alone). This indicates the blood test would be more useful for ruling out underlying Alzheimer’s disease when only subtle symptoms are present, the researchers noted.

As for doctors identifying clinical Alzheimer’s disease, primary care physicians had a diagnostic accuracy of 61% (95% CI, 53%-69%) versus 91% (95% CI, 86%-96%) using the APS2. Dementia specialists had a diagnostic accuracy of 73% (95% CI, 68%-79%) versus 91% (95% CI, 86%-95%) using the APS2.

In the overall population, the diagnostic accuracy using the APS2 (90%; 95% CI, 88%-92%) was not different from that using the percentage of p-tau217 alone (90%; 95% CI, 88%-91%).

Very little was known about how a blood test would perform in a primary care setting, said Dr. Palmqvist. “Seeing that the test was just as accurate in primary care (about 90%) as it was in secondary care is really encouraging, especially since primary care is the first, and often final, point of entry into the healthcare system for cognitive evaluations.”

He said he was surprised the biomarkers performed so well in prospective, biweekly analyses throughout the study. “Previous studies have only demonstrated their effectiveness when all collected samples are analyzed at a single time point, which does not reflect how a blood test is used in clinical practice.”

He added that he was surprised that the tests were just as accurate in primary care as in a memory clinic setting with referred patients. This was despite the older age and higher prevalence of comorbidities in primary care, such as chronic kidney disease (present in 26% of the primary care cohort), which can be a confounding factor that increases p-tau217 concentrations.

Next Steps

The diagnostic accuracy of the blood tests is on par with FDA-cleared cerebrospinal fluid biomarkers, noted the investigators, led by senior author Oskar Hansson, MD, PhD, Clinical Memory Research Unit, Department of Clinical Sciences Malmö, Faculty of Medicine, Lund University, Lund, Sweden.

As blood tests are “more time effective, cost effective, and convenient” for patients, “they could also potentially replace cerebrospinal fluid tests and PET,” they added.

Dr. Palmqvist emphasized that these tests should not be used as stand-alone diagnostic tools for Alzheimer’s disease but should complement the standard clinical evaluation that includes cognitive testing and a thorough interview with the patient and a spouse or relative.

“This is crucial because Alzheimer’s disease pathology can be asymptomatic for many years, and cognitive symptoms in some patients with Alzheimer’s disease pathology may primarily result from other conditions. Misinterpreting a positive Alzheimer’s disease blood test could lead to underdiagnosis of common non–Alzheimer’s disease conditions.”

With new antiamyloid treatments possibly slowing disease progression by 30%-40% when initiated early on, a blood test for Alzheimer’s disease could lead to more people receiving an accurate and earlier diagnosis, said Dr. Palmqvist. “This could potentially result in a better response to treatment. Results from drug trials clearly indicate that the earlier treatment begins, the more effectively it can slow disease progression.”

The test used in the study is already available in the United States, the investigators said, and a similar test will be accessible in Sweden within a few months. “However, the rollout will probably be gradual and will depend on how international and national guidelines recommend their use, so developing these guidelines will be a crucial next step for widespread implementation, particularly in primary care,” said Dr. Palmqvist.

He also underlined the importance of replicating the findings in more diverse populations. “This will help ensure the tests’ reliability and effectiveness across various demographic and clinical contexts.”

An important next research step is to examine how implementing a blood test for Alzheimer’s disease affects patient care. “This includes looking at changes in management, such as referrals, other examinations, and the initiation of appropriate treatments,” said Dr. Palmqvist.

Another study presented at the meeting showed that a highly accurate blood test could significantly reduce diagnostic wait times.

Convincing Research

In an accompanying editorial, Stephen Salloway, MD, Departments of Psychiatry and Neurology, Warren Alpert Medical School, Brown University, Providence, Rhode Island, and colleagues said the study “makes the case convincingly that highly sensitive blood measures of Alzheimer’s disease can be integrated into the clinical decision-making process, including in the primary care setting.”

These tests, they wrote, “can be used to enhance the ability of clinicians to accurately identify individuals with cognitive impairment and dementia due to Alzheimer’s disease.

“Current practice should focus on using these blood biomarkers in individuals with cognitive impairment rather than in those with normal cognition or subjective cognitive decline until further research demonstrates effective interventions for individuals considered cognitively normal with elevated levels of amyloid.”

A key limitation of the study was the lack of diversity in the study sample. This makes it difficult to generalize the results across other ethnic and racial groups, the editorialists noted. Plasma assays for Alzheimer’s disease in the United States will require approval from the FDA and coverage by the Centers for Medicare & Medicaid Services to be widely adopted.

The editorialists also pointed out that advances in the diagnosis and treatment of Alzheimer’s disease will require important changes to healthcare models, including providing additional resources and staffing.

The study was supported by the Alzheimer’s Association, National Institute on Aging, European Research Council, Swedish Research Council, the GHR Foundation, and other groups. The study was conducted as an academic collaboration between Lund University and C2N Diagnostics in the United States. Lund University or its affiliated researchers received no funding or compensation from C2N Diagnostics. C2N Diagnostics performed the plasma analyses blinded to any biomarker or clinical data and had no role in the statistical analysis or results. Dr. Palmqvist reported receiving institutional research support from ki:elements, Alzheimer’s Drug Discovery Foundation, and Avid Radiopharmaceuticals and consultancy or speaker fees from BioArctic, Biogen, Eisai, Eli Lilly, and Roche. Dr. Hansson reported receiving personal fees from AC Immune, ALZpath, BioArctic, Biogen, Cerveau, Eisai, Eli Lilly, Fujirebio, Bristol-Myers Squibb, Merck, Novartis, Novo Nordisk, Roche, Sanofi, and Siemens and institutional research support from ADX, AVID Radiopharmaceuticals, Biogen, Eli Lilly, Eisai, Fujirebio, GE Healthcare, Pfizer, and Roche. Dr. Salloway reported receiving grants from Biogen, Roche, Lilly, Genentech, Eisai, and Novartis; personal fees from Biogen, Roche, Lilly, Genentech, Eisai, Novo Nordisk, Prothena, AbbVie, Acumen, and Kisbee; and nonfinancial support (travel expenses for conference attendance) from Biogen, Roche, Lilly, and Acumen.

A version of this article appeared on Medscape.com.

Alzheimer’s Blood Test in Primary Care Could Slash Diagnostic, Treatment Wait Times

Article Type
Changed
Tue, 07/30/2024 - 11:56

As disease-modifying treatments for Alzheimer’s disease (AD) become available, equipping primary care physicians with a highly accurate blood test could significantly reduce diagnostic wait times. Currently, the patient diagnostic journey is often prolonged owing to the limited number of AD specialists, causing concern among healthcare providers and patients alike. Now, a new study suggests that use of high-performing blood tests in primary care could identify potential patients with AD much earlier, possibly reducing wait times for specialist care and receipt of treatment.

“We need to triage in primary care and send preferentially the ones that actually could be eligible for treatment, and not those who are just worried because their grandmother reported that she has Alzheimer’s,” lead researcher Soeren Mattke, MD, DSc, told this news organization.

“By combining a brief cognitive test with an accurate blood test of Alzheimer’s pathology in primary care, we can reduce unnecessary referrals, and shorten appointment wait times,” said Dr. Mattke, director of the Brain Health Observatory at the University of Southern California in Los Angeles.

The findings were presented at the Alzheimer’s Association International Conference (AAIC) 2024.

Projected Wait Times of 100 Months by 2033

The investigators used a Markov model to estimate wait times for patients eligible for AD treatment, taking into account constrained capacity for specialist visits.

The model included the projected US population of people aged 55 years or older from 2023 to 2032. It assumed that individuals would undergo a brief cognitive assessment in primary care and, if results suggested early-stage cognitive impairment, be referred to an AD specialist under three scenarios: no blood test, a blood test to rule out AD pathology, and a blood test to confirm AD pathology.

According to the model, without an accurate blood test for AD pathology, projected wait times to see a specialist are about 12 months in 2024 and will increase to more than 100 months in 2033, largely owing to a lack of specialist appointments.

In contrast, with the availability of an accurate blood test to rule out AD, average wait times would be just 3 months in 2024 and increase to only about 13 months in 2033, because far fewer patients would need to see a specialist.

Availability of a blood test to rule in AD pathology in primary care would have a limited effect on wait times because, based on expert assumptions, 50% of patients would still undergo confirmatory testing, the model suggests.
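
The report does not give the model’s inputs, but the capacity-constrained logic behind the projections can be sketched as a simple deterministic queue (a deliberate simplification of the study’s Markov model). The demand figures, specialist capacity, and the share filtered out by a rule-out blood test below are placeholders, not the investigators’ parameters.

```python
def project_wait_times(annual_referrals: list[float], monthly_capacity: float) -> list[float]:
    """Toy capacity-constrained queue: the backlog left at the end of each year, divided by
    monthly specialist capacity, approximates the wait (in months) for a new referral."""
    backlog, waits = 0.0, []
    for demand in annual_referrals:
        backlog += demand
        backlog -= min(backlog, 12 * monthly_capacity)  # visits specialists can absorb that year
        waits.append(backlog / monthly_capacity)
    return waits

# Placeholder figures (millions of screen-positive patients per year; millions of visits per month)
demand_no_test = [1.2 + 0.1 * i for i in range(10)]   # 2024-2033, all screen-positives referred
demand_rule_out = [d * 0.4 for d in demand_no_test]   # a rule-out blood test filters out ~60%
capacity = 0.05

for label, demand in (("no blood test", demand_no_test), ("rule-out test", demand_rule_out)):
    print(label, [round(w, 1) for w in project_wait_times(demand, capacity)])
```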

Prioritizing Resources 

“Millions of people have mild memory complaints, and if they all start coming to neurologists, it could completely flood the system and create long wait times for everybody,” Dr. Mattke told this news organization.

The problem, he said, is that brief cognitive tests performed in primary care are not particularly specific for mild cognitive impairment.

“They work pretty well for manifest advanced dementia but for mild cognitive impairment, which is a very subtle, symptomatic disease, they are only about 75% accurate. One quarter are false-positives. That’s a lot of people,” Dr. Mattke said.

He also noted that although earlier blood tests were about 75% accurate, they are now about 90% accurate, “so we are getting to a level where we can pretty much say with confidence that this is likely Alzheimer’s.”
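
As a back-of-the-envelope illustration of this triage argument, with numbers chosen only for the example, a confirmatory blood test applied to screen-positive patients would pass most true cases and weed out most of the cognitive screen’s false positives before referral.

```python
def serial_triage(n_screen_positive: int, screen_false_positive_share: float,
                  blood_sensitivity: float, blood_specificity: float) -> int:
    """Patients still referred to a specialist after a confirmatory blood test:
    true cases pass with probability = sensitivity; false positives pass with (1 - specificity)."""
    true_cases = n_screen_positive * (1 - screen_false_positive_share)
    false_cases = n_screen_positive * screen_false_positive_share
    return round(true_cases * blood_sensitivity + false_cases * (1 - blood_specificity))

# Illustrative only: 1000 screen-positive patients, a quarter of them false positives,
# and a blood test with roughly 90% sensitivity and specificity
print(serial_triage(1000, 0.25, 0.90, 0.90))   # about 700 referrals instead of 1000
```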

Commenting on this research for this news organization, Heather Snyder, PhD, vice president of medical and scientific relations at the Alzheimer’s Association, said it is clear that blood tests, “once confirmed, could have a significant impact on the wait times” for dementia assessment. 

“After an initial blood test, we might be able to rule out or rule in individuals who should go to a specialist for further follow-up and testing. This allows us to really ensure that we’re prioritizing resources accordingly,” said Dr. Snyder, who was not involved in the study. 

This project was supported by a research contract from C2N Diagnostics LLC to USC. Dr. Mattke serves on the board of directors of Senscio Systems Inc. and the scientific advisory board of ALZPath and Boston Millennia Partners and has received consulting fees from Biogen, C2N, Eisai, Eli Lilly, Novartis, and Roche/Genentech. Dr. Snyder has no relevant disclosures.

A version of this article first appeared on Medscape.com.
